Compare commits


No commits in common. "master" and "2021.12.25" have entirely different histories.

1426 changed files with 64976 additions and 122285 deletions

View File

@@ -1,8 +0,0 @@
root = true
[**.py]
charset = utf-8
indent_size = 4
indent_style = space
trim_trailing_whitespace = true
insert_final_newline = true

.gitattributes vendored (2 changes)
View File

@@ -2,5 +2,3 @@
Makefile* text whitespace=-tab-in-indent
*.sh text eol=lf
*.md diff=markdown
*.py diff=python

View File

@@ -1,14 +1,7 @@
name: Broken site support
description: Report issue with yt-dlp on a supported site
description: Report broken or misfunctioning site
labels: [triage, site-bug]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,15 +9,15 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that yt-dlp is broken on a **supported** site
- label: I'm reporting a broken site
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **2021.12.25**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -33,46 +26,37 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.25 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
yt-dlp is up to date (2021.12.25)
<more lines>
render: shell
validations:
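The checkbox block above tells API users to set `'verbose': True` in the `YoutubeDL` params instead of passing `-vU` on the command line. A minimal sketch of what that looks like when yt-dlp is embedded in a script, assuming default options otherwise (the URL is just the one from the placeholder log):

```python
# Minimal sketch: enable verbose output when using yt-dlp via its Python API,
# as suggested by the "If using API" checkbox above.
from yt_dlp import YoutubeDL

params = {
    'verbose': True,  # API counterpart of the -v command-line flag
}

with YoutubeDL(params) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])
```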

View File

@@ -2,13 +2,6 @@ name: Site support request
description: Request support for a new site
labels: [triage, site-request]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -18,13 +11,13 @@ body:
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **2021.12.25**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -33,8 +26,8 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -50,41 +43,31 @@
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide any additional information
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output **using one of the example URLs provided above**.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.25 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
yt-dlp is up to date (2021.12.25)
<more lines>
render: shell
validations:

View File

@@ -2,13 +2,6 @@ name: Site feature request
description: Request a new functionality for a supported site
labels: [triage, site-enhancement]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,13 +9,13 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a site-specific feature
- label: I'm reporting a site feature request
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **2021.12.25**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -31,56 +24,48 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
label: Example URLs
description: |
Example URLs that can be used to demonstrate the requested feature
placeholder: |
value: |
https://www.youtube.com/watch?v=BaW_jenozKc
validations:
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp that demonstrates the need for the enhancement.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.25 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
yt-dlp is up to date (2021.12.25)
<more lines>
render: shell
validations:

View File

@@ -1,14 +1,7 @@
name: Core bug report
name: Bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -18,54 +11,46 @@
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **2021.12.25**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-Uv` flag to **your** command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.25 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
yt-dlp is up to date (2021.12.25)
<more lines>
render: shell
validations:

View File

@@ -2,13 +2,6 @@ name: Feature request
description: Request a new functionality unrelated to any particular site or extractor
labels: [triage, enhancement]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: checkboxes
id: checklist
attributes:
@@ -16,51 +9,22 @@
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a feature unrelated to a specific site
- label: I'm reporting a feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
- label: I've verified that I'm running yt-dlp version **2021.12.25**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
- type: textarea
id: log
attributes:
label: Complete Verbose Output
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell

View File

@@ -2,19 +2,6 @@ name: Ask question
description: Ask yt-dlp related question
labels: [question]
body:
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
required: true
- type: markdown
attributes:
value: |
### Make sure you are **only** asking a question and not reporting a bug or requesting a feature.
If your question contains "isn't working" or "can you add", this is most likely the wrong template.
If you are in doubt whether this is the right template, **USE ANOTHER TEMPLATE**!
- type: checkboxes
id: checklist
attributes:
@@ -22,51 +9,44 @@
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm asking a question and **not** reporting a bug or requesting a feature
- label: I'm asking a question and **not** reporting a bug/feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions including closed ones
required: true
- type: textarea
id: question
attributes:
label: Please make sure the question is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information and as much context and examples as possible
label: Question
description: |
Ask your question in an arbitrary form.
Please make sure it's worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information and as much context and examples as possible.
If your question contains "isn't working" or "can you add", this is most likely the wrong template
placeholder: WRITE QUESTION HERE
validations:
required: true
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
- type: textarea
id: log
attributes:
label: Complete Verbose Output
label: Verbose log
description: |
It should start like this:
If your question involves a yt-dlp command, provide the complete verbose output of that command.
Add the `-Uv` flag to **your** command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.01 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
yt-dlp is up to date (2021.12.01)
<more lines>
render: shell

View File

@@ -3,6 +3,3 @@ contact_links:
- name: Get help from the community on Discord
url: https://discord.gg/H5MNcFW63r
about: Join the yt-dlp Discord for community-powered support!
- name: Matrix Bridge to the Discord server
url: https://matrix.to/#/#yt-dlp:matrix.org
about: For those who do not want to use Discord

View File

@@ -1,8 +1,7 @@
name: Broken site support
description: Report issue with yt-dlp on a supported site
description: Report broken or misfunctioning site
labels: [triage, site-bug]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,15 +9,15 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that yt-dlp is broken on a **supported** site
- label: I'm reporting a broken site
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -27,14 +26,38 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s
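The `%(version)s`, `%(no_skip)s`, `%(verbose)s` and `%(verbose_optional)s` tokens in this second group of files are printf-style placeholders: the shipped issue templates are evidently generated from these sources by substituting real values. A minimal sketch of such a substitution step, assuming Python %-formatting with hypothetical file names and field values (the project's actual devscript may differ):

```python
# Hypothetical sketch of expanding the %(...)s placeholders in a template file.
# File names and field values are illustrative, not the project's real ones.
fields = {
    'version': '2021.12.25',
    'no_skip': '',           # optional extra checkbox block, empty in this sketch
    'verbose': '',           # optional verbose-log section, empty in this sketch
    'verbose_optional': '',
}

with open('1_broken_site.yml.tmpl', encoding='utf-8') as src:  # hypothetical name
    template = src.read()

with open('1_broken_site.yml', 'w', encoding='utf-8') as dst:
    dst.write(template % fields)
```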

View File

@@ -2,7 +2,6 @@ name: Site support request
description: Request support for a new site
labels: [triage, site-request]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -12,13 +11,13 @@
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- label: I've checked that none of provided URLs [violate any copyrights](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -27,8 +26,8 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
@@ -44,9 +43,32 @@
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide any additional information
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output **using one of the example URLs provided above**.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s

View File

@@ -2,7 +2,6 @@ name: Site feature request
description: Request a new functionality for a supported site
labels: [triage, site-enhancement]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,13 +9,13 @@
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a site-specific feature
- label: I'm reporting a site feature request
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
@@ -25,24 +24,49 @@
id: region
attributes:
label: Region
description: Enter the country/region that the site is accessible from
placeholder: India
description: "Enter the region the site is accessible from"
placeholder: "India"
- type: textarea
id: example-urls
attributes:
label: Example URLs
description: |
Example URLs that can be used to demonstrate the requested feature
placeholder: |
value: |
https://www.youtube.com/watch?v=BaW_jenozKc
validations:
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp that demonstrates the need for the enhancement.
Add the `-Uv` flag to your command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s

View File

@@ -1,8 +1,7 @@
name: Core bug report
name: Bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -12,22 +11,47 @@
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
- label: I've checked that all provided URLs are alive and playable in a browser
required: true
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your issue in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
- type: textarea
id: log
attributes:
label: Verbose log
description: |
Provide the complete verbose output of yt-dlp **that clearly demonstrates the problem**.
Add the `-Uv` flag to **your** command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version %(version)s (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (%(version)s)
<more lines>
render: shell
validations:
required: true
%(verbose)s

View File

@@ -2,7 +2,6 @@ name: Feature request
description: Request a new functionality unrelated to any particular site or extractor
labels: [triage, enhancement]
body:
%(no_skip)s
- type: checkboxes
id: checklist
attributes:
@@ -10,22 +9,22 @@
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm requesting a feature unrelated to a specific site
- label: I'm reporting a feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
- label: I've verified that I'm running yt-dlp version **%(version)s**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- type: textarea
id: description
attributes:
label: Provide a description that is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information, any suggested solutions, and as much context and examples as possible
label: Description
description: |
Provide an explanation of your site feature request in an arbitrary form.
Please make sure the description is worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information, any suggested solutions, and as much context and examples as possible
placeholder: WRITE DESCRIPTION HERE
validations:
required: true
%(verbose_optional)s

View File

@@ -2,13 +2,6 @@ name: Ask question
description: Ask yt-dlp related question
labels: [question]
body:
%(no_skip)s
- type: markdown
attributes:
value: |
### Make sure you are **only** asking a question and not reporting a bug or requesting a feature.
If your question contains "isn't working" or "can you add", this is most likely the wrong template.
If you are in doubt whether this is the right template, **USE ANOTHER TEMPLATE**!
- type: checkboxes
id: checklist
attributes:
@@ -16,22 +9,44 @@
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm asking a question and **not** reporting a bug or requesting a feature
- label: I'm asking a question and **not** reporting a bug/feature request
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true
- label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
required: true
- label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions including closed ones
required: true
- type: textarea
id: question
attributes:
label: Please make sure the question is worded well enough to be understood
description: See [is-the-description-of-the-issue-itself-sufficient](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-description-of-the-issue-itself-sufficient)
placeholder: Provide any additional information and as much context and examples as possible
label: Question
description: |
Ask your question in an arbitrary form.
Please make sure it's worded well enough to be understood, see [is-the-description-of-the-issue-itself-sufficient](https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient).
Provide any additional information and as much context and examples as possible.
If your question contains "isn't working" or "can you add", this is most likely the wrong template
placeholder: WRITE QUESTION HERE
validations:
required: true
%(verbose_optional)s
- type: textarea
id: log
attributes:
label: Verbose log
description: |
If your question involves a yt-dlp command, provide the complete verbose output of that command.
Add the `-Uv` flag to **your** command line you run yt-dlp with (`yt-dlp -Uv <your command line>`), copy the WHOLE output and insert it below.
It should look similar to this:
placeholder: |
[debug] Command-line config: ['-Uv', 'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Portable config file: yt-dlp.conf
[debug] Portable config: ['-i']
[debug] Encodings: locale cp1252, fs utf-8, stdout utf-8, stderr utf-8, pref cp1252
[debug] yt-dlp version 2021.12.01 (exe)
[debug] Python version 3.8.8 (CPython 64bit) - Windows-10-10.0.19041-SP0
[debug] exe versions: ffmpeg 3.0.1, ffprobe 3.0.1
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite, websockets
[debug] Proxy map: {}
yt-dlp is up to date (2021.12.01)
<more lines>
render: shell

View File

@@ -1,42 +1,28 @@
**IMPORTANT**: PRs without the template will be CLOSED
### Description of your *pull request* and other information
<!--
Explanation of your *pull request* in arbitrary form goes here. Please **make sure the description explains the purpose and effect** of your *pull request* and is worded well enough to be understood. Provide as much **context and examples** as possible
-->
ADD DESCRIPTION HERE
Fixes #
<details open><summary>Template</summary> <!-- OPEN is intentional -->
<!--
# PLEASE FOLLOW THE GUIDE BELOW
## Please follow the guide below
- You will be asked some questions, please read them **carefully** and answer honestly
- Put an `x` into all the boxes `[ ]` relevant to your *pull request* (like [x])
- Put an `x` into all the boxes [ ] relevant to your *pull request* (like that [x])
- Use the *Preview* tab to see how your *pull request* will actually look
-->
---
### Before submitting a *pull request* make sure you have:
- [ ] At least skimmed through [contributing guidelines](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions) including [yt-dlp coding conventions](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#yt-dlp-coding-conventions)
- [ ] [Searched](https://github.com/yt-dlp/yt-dlp/search?q=is%3Apr&type=Issues) the bugtracker for similar pull requests
- [ ] Checked the code with [flake8](https://pypi.python.org/pypi/flake8)
### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check all of the following options that apply:
### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check one of the following options:
- [ ] I am the original author of this code and I am willing to release it under [Unlicense](http://unlicense.org/)
- [ ] I am not the original author of this code but it is in public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)
### What is the purpose of your *pull request*?
- [ ] Fix or improvement to an extractor (Make sure to add/update tests)
- [ ] New extractor ([Piracy websites will not be accepted](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy))
- [ ] Core bug fix/improvement
- [ ] New feature (It is strongly [recommended to open an issue first](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#adding-new-feature-or-making-overarching-changes))
- [ ] Bug fix
- [ ] Improvement
- [ ] New extractor
- [ ] New feature
</details>
---
### Description of your *pull request* and other information
Explanation of your *pull request* in arbitrary form goes here. Please make sure the description explains the purpose and effect of your *pull request* and is worded well enough to be understood. Provide as much context and examples as possible.

.github/banner.svg vendored (10 changes)

File diff suppressed because one or more lines are too long

Image size before: 15 KiB, after: 24 KiB

View File

@@ -1,563 +1,418 @@
name: Build Artifacts
on:
workflow_call:
inputs:
version:
required: true
type: string
channel:
required: false
default: stable
type: string
unix:
default: true
type: boolean
linux_static:
default: true
type: boolean
linux_arm:
default: true
type: boolean
macos:
default: true
type: boolean
macos_legacy:
default: true
type: boolean
windows:
default: true
type: boolean
windows32:
default: true
type: boolean
origin:
required: false
default: ''
type: string
secrets:
GPG_SIGNING_KEY:
required: false
workflow_dispatch:
inputs:
version:
description: |
VERSION: yyyy.mm.dd[.rev] or rev
required: true
type: string
channel:
description: |
SOURCE of this build's updates: stable/nightly/master/<repo>
required: true
default: stable
type: string
unix:
description: yt-dlp, yt-dlp.tar.gz
default: true
type: boolean
linux_static:
description: yt-dlp_linux
default: true
type: boolean
linux_arm:
description: yt-dlp_linux_aarch64, yt-dlp_linux_armv7l
default: true
type: boolean
macos:
description: yt-dlp_macos, yt-dlp_macos.zip
default: true
type: boolean
macos_legacy:
description: yt-dlp_macos_legacy
default: true
type: boolean
windows:
description: yt-dlp.exe, yt-dlp_min.exe, yt-dlp_win.zip
default: true
type: boolean
windows32:
description: yt-dlp_x86.exe
default: true
type: boolean
origin:
description: Origin
required: false
default: 'current repo'
type: choice
options:
- 'current repo'
permissions:
contents: read
name: Build
on: workflow_dispatch
jobs:
process:
build_unix:
runs-on: ubuntu-latest
outputs:
origin: ${{ steps.process_origin.outputs.origin }}
steps:
- name: Process origin
id: process_origin
run: |
echo "origin=${{ inputs.origin == 'current repo' && github.repository || inputs.origin }}" | tee "$GITHUB_OUTPUT"
unix:
needs: process
if: inputs.unix
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0 # Needed for changelog
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Requirements
run: |
sudo apt -y install zip pandoc man sed
- name: Prepare
run: |
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/update_changelog.py -vv
python devscripts/make_lazy_extractors.py
- name: Build Unix platform-independent binary
run: |
make all tar
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./yt-dlp
cp ./yt-dlp ./yt-dlp_downgraded
version="$(./yt-dlp --version)"
./yt-dlp_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./yt-dlp_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
yt-dlp
yt-dlp.tar.gz
compression-level: 0
linux_static:
needs: process
if: inputs.linux_static
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Build static executable
env:
channel: ${{ inputs.channel }}
origin: ${{ needs.process.outputs.origin }}
version: ${{ inputs.version }}
run: |
mkdir ~/build
cd bundle/docker
docker compose up --build static
sudo chown "${USER}:docker" ~/build/yt-dlp_linux
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ~/build/yt-dlp_linux
cp ~/build/yt-dlp_linux ~/build/yt-dlp_linux_downgraded
version="$(~/build/yt-dlp_linux --version)"
~/build/yt-dlp_linux_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(~/build/yt-dlp_linux_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
~/build/yt-dlp_linux
compression-level: 0
linux_arm:
needs: process
if: inputs.linux_arm
permissions:
contents: read
packages: write # for creating cache
runs-on: ubuntu-latest
strategy:
matrix:
architecture:
- armv7
- aarch64
version_suffix: ${{ steps.version_suffix.outputs.version_suffix }}
ytdlp_version: ${{ steps.bump_version.outputs.ytdlp_version }}
upload_url: ${{ steps.create_release.outputs.upload_url }}
sha256_bin: ${{ steps.sha256_bin.outputs.sha256_bin }}
sha512_bin: ${{ steps.sha512_bin.outputs.sha512_bin }}
sha256_tar: ${{ steps.sha256_tar.outputs.sha256_tar }}
sha512_tar: ${{ steps.sha512_tar.outputs.sha512_tar }}
steps:
- uses: actions/checkout@v4
with:
path: ./repo
- name: Virtualized Install, Prepare & Build
uses: yt-dlp/run-on-arch-action@v2
with:
# Ref: https://github.com/uraimo/run-on-arch-action/issues/55
env: |
GITHUB_WORKFLOW: build
githubToken: ${{ github.token }} # To cache image
arch: ${{ matrix.architecture }}
distro: ubuntu18.04 # Standalone executable should be built on minimum supported OS
dockerRunArgs: --volume "${PWD}/repo:/repo"
install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
apt update
apt -y install zlib1g-dev libffi-dev python3.8 python3.8-dev python3.8-distutils python3-pip
python3.8 -m pip install -U pip setuptools wheel
# Cannot access any files from the repo directory at this stage
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi secretstorage cffi
- uses: actions/checkout@v2
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.8'
- name: Install packages
run: sudo apt-get -y install zip pandoc man
- name: Set version suffix
id: version_suffix
env:
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
if: "env.PUSH_VERSION_COMMIT == ''"
run: echo ::set-output name=version_suffix::$(date -u +"%H%M%S")
- name: Bump version
id: bump_version
run: |
python devscripts/update-version.py ${{ steps.version_suffix.outputs.version_suffix }}
make issuetemplates
- name: Push to release
id: push_release
run: |
git config --global user.name github-actions
git config --global user.email github-actions@example.com
git add -u
git commit -m "[version] update" -m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all"
git push origin --force ${{ github.event.ref }}:release
echo ::set-output name=head_sha::$(git rev-parse HEAD)
- name: Update master
id: push_master
env:
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
if: "env.PUSH_VERSION_COMMIT != ''"
run: git push origin ${{ github.event.ref }}
- name: Get Changelog
id: get_changelog
run: |
changelog=$(cat Changelog.md | grep -oPz '(?s)(?<=### ${{ steps.bump_version.outputs.ytdlp_version }}\n{2}).+?(?=\n{2,3}###)') || true
echo "changelog<<EOF" >> $GITHUB_ENV
echo "$changelog" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV
run: |
cd repo
python3.8 devscripts/install_deps.py -o --include build
python3.8 devscripts/install_deps.py --include pyinstaller --include secretstorage # Cached version may be out of date
python3.8 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3.8 devscripts/make_lazy_extractors.py
python3.8 -m bundle.pyinstaller
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run Make
run: make all tar
- name: Get SHA2-256SUMS for yt-dlp
id: sha256_bin
run: echo "::set-output name=sha256_bin::$(sha256sum yt-dlp | awk '{print $1}')"
- name: Get SHA2-256SUMS for yt-dlp.tar.gz
id: sha256_tar
run: echo "::set-output name=sha256_tar::$(sha256sum yt-dlp.tar.gz | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp
id: sha512_bin
run: echo "::set-output name=sha512_bin::$(sha512sum yt-dlp | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp.tar.gz
id: sha512_tar
run: echo "::set-output name=sha512_tar::$(sha512sum yt-dlp.tar.gz | awk '{print $1}')"
if ${{ vars.UPDATE_TO_VERIFICATION && 'true' || 'false' }}; then
arch="${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}"
chmod +x ./dist/yt-dlp_linux_${arch}
cp ./dist/yt-dlp_linux_${arch} ./dist/yt-dlp_linux_${arch}_downgraded
version="$(./dist/yt-dlp_linux_${arch} --version)"
./dist/yt-dlp_linux_${arch}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_linux_${arch}_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
fi
- name: Install dependencies for pypi
env:
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: "env.PYPI_TOKEN != ''"
run: |
python -m pip install --upgrade pip
pip install setuptools wheel twine
- name: Build and publish on pypi
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
if: "env.TWINE_PASSWORD != ''"
run: |
rm -rf dist/*
python setup.py sdist bdist_wheel
twine upload dist/*
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-linux_${{ matrix.architecture }}
path: | # run-on-arch-action designates armv7l as armv7
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
compression-level: 0
- name: Install SSH private key
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
if: "env.BREW_TOKEN != ''"
uses: webfactory/ssh-agent@v0.5.3
with:
ssh-private-key: ${{ env.BREW_TOKEN }}
- name: Update Homebrew Formulae
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
if: "env.BREW_TOKEN != ''"
run: |
git clone git@github.com:yt-dlp/homebrew-taps taps/
python3 devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ steps.bump_version.outputs.ytdlp_version }}"
git -C taps/ config user.name github-actions
git -C taps/ config user.email github-actions@example.com
git -C taps/ commit -am 'yt-dlp: ${{ steps.bump_version.outputs.ytdlp_version }}'
git -C taps/ push
macos:
needs: process
if: inputs.macos
permissions:
contents: read
actions: write # For cleaning up cache
runs-on: macos-12
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ steps.bump_version.outputs.ytdlp_version }}
release_name: yt-dlp ${{ steps.bump_version.outputs.ytdlp_version }}
commitish: ${{ steps.push_release.outputs.head_sha }}
body: |
#### [A description of the various files]((https://github.com/yt-dlp/yt-dlp#release-files)) are in the README
---
### Changelog:
${{ env.changelog }}
draft: false
prerelease: false
- name: Upload yt-dlp Unix binary
id: upload-release-asset
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yt-dlp
asset_name: yt-dlp
asset_content_type: application/octet-stream
- name: Upload Source tar
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: ./yt-dlp.tar.gz
asset_name: yt-dlp.tar.gz
asset_content_type: application/gzip
build_macos:
runs-on: macos-11
needs: build_unix
outputs:
sha256_macos: ${{ steps.sha256_macos.outputs.sha256_macos }}
sha512_macos: ${{ steps.sha512_macos.outputs.sha512_macos }}
sha256_macos_zip: ${{ steps.sha256_macos_zip.outputs.sha256_macos_zip }}
sha512_macos_zip: ${{ steps.sha512_macos_zip.outputs.sha512_macos_zip }}
steps:
- uses: actions/checkout@v4
# NB: Building universal2 does not work with python from actions/setup-python
- name: Restore cached requirements
id: restore-cache
uses: actions/cache/restore@v4
env:
SEGMENT_DOWNLOAD_TIMEOUT_MINS: 1
with:
path: |
~/yt-dlp-build-venv
key: cache-reqs-${{ github.job }}
- name: Install Requirements
run: |
- uses: actions/checkout@v2
# In order to create a universal2 application, the version of python3 in /usr/bin has to be used
# Pyinstaller is pinned to 4.5.1 because the builds are failing in 4.6, 4.7
- name: Install Requirements
run: |
brew install coreutils
python3 -m venv ~/yt-dlp-build-venv
source ~/yt-dlp-build-venv/bin/activate
python3 devscripts/install_deps.py -o --include build
python3 devscripts/install_deps.py --print --include pyinstaller > requirements.txt
# We need to ignore wheels otherwise we break universal2 builds
python3 -m pip install -U --no-binary :all: -r requirements.txt
# We need to fuse our own universal2 wheels for curl_cffi
python3 -m pip install -U delocate
mkdir curl_cffi_whls curl_cffi_universal2
python3 devscripts/install_deps.py --print -o --include curl-cffi > requirements.txt
for platform in "macosx_11_0_arm64" "macosx_11_0_x86_64"; do
python3 -m pip download \
--only-binary=:all: \
--platform "${platform}" \
-d curl_cffi_whls \
-r requirements.txt
done
( # Overwrite x86_64-only libs with fat/universal2 libs or else Pyinstaller will do the opposite
# See https://github.com/yt-dlp/yt-dlp/pull/10069
cd curl_cffi_whls
mkdir -p curl_cffi/.dylibs
python_libdir=$(python3 -c 'import sys; from pathlib import Path; print(Path(sys.path[1]).parent)')
for dylib in lib{ssl,crypto}.3.dylib; do
cp "${python_libdir}/${dylib}" "curl_cffi/.dylibs/${dylib}"
for wheel in curl_cffi*macos*x86_64.whl; do
zip "${wheel}" "curl_cffi/.dylibs/${dylib}"
done
done
)
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/curl_cffi*.whl -w curl_cffi_universal2
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/cffi*.whl -w curl_cffi_universal2
for wheel in curl_cffi_universal2/*cffi*.whl; do
mv -n -- "${wheel}" "${wheel/x86_64/universal2}"
done
python3 -m pip install --force-reinstall -U curl_cffi_universal2/*cffi*.whl
/usr/bin/python3 -m pip install -U --user pip Pyinstaller==4.5.1 mutagen pycryptodomex websockets
- name: Bump version
id: bump_version
run: /usr/bin/python3 devscripts/update-version.py
- name: Build lazy extractors
id: lazy_extractors
run: /usr/bin/python3 devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script
run: /usr/bin/python3 pyinst.py --target-architecture universal2 --onefile
- name: Upload yt-dlp MacOS binary
id: upload-release-macos
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_macos
asset_name: yt-dlp_macos
asset_content_type: application/octet-stream
- name: Get SHA2-256SUMS for yt-dlp_macos
id: sha256_macos
run: echo "::set-output name=sha256_macos::$(sha256sum dist/yt-dlp_macos | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp_macos
id: sha512_macos
run: echo "::set-output name=sha512_macos::$(sha512sum dist/yt-dlp_macos | awk '{print $1}')"
- name: Prepare
run: |
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
source ~/yt-dlp-build-venv/bin/activate
python3 -m bundle.pyinstaller --target-architecture universal2 --onedir
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
python3 -m bundle.pyinstaller --target-architecture universal2
- name: Run PyInstaller Script with --onedir
run: /usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
- uses: papeloto/action-zip@v1
with:
files: ./dist/yt-dlp_macos
dest: ./dist/yt-dlp_macos.zip
- name: Upload yt-dlp MacOS onedir
id: upload-release-macos-zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_macos.zip
asset_name: yt-dlp_macos.zip
asset_content_type: application/zip
- name: Get SHA2-256SUMS for yt-dlp_macos.zip
id: sha256_macos_zip
run: echo "::set-output name=sha256_macos_zip::$(sha256sum dist/yt-dlp_macos.zip | awk '{print $1}')"
- name: Get SHA2-512SUMS for yt-dlp_macos
id: sha512_macos_zip
run: echo "::set-output name=sha512_macos_zip::$(sha512sum dist/yt-dlp_macos.zip | awk '{print $1}')"
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./dist/yt-dlp_macos
cp ./dist/yt-dlp_macos ./dist/yt-dlp_macos_downgraded
version="$(./dist/yt-dlp_macos --version)"
./dist/yt-dlp_macos_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_macos_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_macos
dist/yt-dlp_macos.zip
compression-level: 0
- name: Cleanup cache
if: steps.restore-cache.outputs.cache-hit == 'true'
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
cache_key: cache-reqs-${{ github.job }}
repository: ${{ github.repository }}
branch: ${{ github.ref }}
run: |
gh extension install actions/gh-actions-cache
gh actions-cache delete "${cache_key}" -R "${repository}" -B "${branch}" --confirm
- name: Cache requirements
uses: actions/cache/save@v4
with:
path: |
~/yt-dlp-build-venv
key: cache-reqs-${{ github.job }}
macos_legacy:
needs: process
if: inputs.macos_legacy
runs-on: macos-12
steps:
- uses: actions/checkout@v4
- name: Install Python
# We need the official Python, because the GA ones only support newer macOS versions
env:
PYTHON_VERSION: 3.10.5
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
run: |
# Hack to get the latest patch version. Uncomment if needed
#brew install python@3.10
#export PYTHON_VERSION=$( $(brew --prefix)/opt/python@3.10/bin/python3 --version | cut -d ' ' -f 2 )
curl "https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg" -o "python.pkg"
sudo installer -pkg python.pkg -target /
python3 --version
- name: Install Requirements
run: |
brew install coreutils
python3 devscripts/install_deps.py --user -o --include build
python3 devscripts/install_deps.py --user --include pyinstaller
- name: Prepare
run: |
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
python3 -m bundle.pyinstaller
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./dist/yt-dlp_macos_legacy
cp ./dist/yt-dlp_macos_legacy ./dist/yt-dlp_macos_legacy_downgraded
version="$(./dist/yt-dlp_macos_legacy --version)"
./dist/yt-dlp_macos_legacy_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_macos_legacy_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_macos_legacy
compression-level: 0
windows:
needs: process
if: inputs.windows
build_windows:
runs-on: windows-latest
needs: build_unix
outputs:
sha256_win: ${{ steps.sha256_win.outputs.sha256_win }}
sha512_win: ${{ steps.sha512_win.outputs.sha512_win }}
sha256_py2exe: ${{ steps.sha256_py2exe.outputs.sha256_py2exe }}
sha512_py2exe: ${{ steps.sha512_py2exe.outputs.sha512_py2exe }}
sha256_win_zip: ${{ steps.sha256_win_zip.outputs.sha256_win_zip }}
sha512_win_zip: ${{ steps.sha512_win_zip.outputs.sha512_win_zip }}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with: # 3.8 is used for Win7 support
python-version: "3.8"
- name: Install Requirements
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py --include curl-cffi
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-6.7.0-py3-none-any.whl"
- uses: actions/checkout@v2
# 3.8 is used for Win7 support
- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: '3.8'
- name: Install Requirements
# Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
run: |
python -m pip install --upgrade pip setuptools wheel py2exe
pip install "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-4.5.1-py3-none-any.whl" mutagen pycryptodomex websockets
- name: Bump version
id: bump_version
env:
version_suffix: ${{ needs.build_unix.outputs.version_suffix }}
run: python devscripts/update-version.py ${{ env.version_suffix }}
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script
run: python pyinst.py
- name: Upload yt-dlp.exe Windows binary
id: upload-release-windows
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp.exe
asset_name: yt-dlp.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp.exe
id: sha256_win
run: echo "::set-output name=sha256_win::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp.exe
id: sha512_win
run: echo "::set-output name=sha512_win::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA512).Hash.ToLower())"
- name: Prepare
run: |
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python -m bundle.pyinstaller
python -m bundle.pyinstaller --onedir
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_real.exe
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
- name: Run PyInstaller Script with --onedir
run: python pyinst.py --onedir
- uses: papeloto/action-zip@v1
with:
files: ./dist/yt-dlp
dest: ./dist/yt-dlp_win.zip
- name: Upload yt-dlp Windows onedir
id: upload-release-windows-zip
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_win.zip
asset_name: yt-dlp_win.zip
asset_content_type: application/zip
- name: Get SHA2-256SUMS for yt-dlp_win.zip
id: sha256_win_zip
run: echo "::set-output name=sha256_win_zip::$((Get-FileHash dist\yt-dlp_win.zip -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_win.zip
id: sha512_win_zip
run: echo "::set-output name=sha512_win_zip::$((Get-FileHash dist\yt-dlp_win.zip -Algorithm SHA512).Hash.ToLower())"
- name: Install Requirements (py2exe)
run: |
python devscripts/install_deps.py --include py2exe
- name: Build (py2exe)
run: |
python -m bundle.py2exe
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
Move-Item ./dist/yt-dlp_real.exe ./dist/yt-dlp.exe
- name: Run py2exe Script
run: python setup.py py2exe
- name: Upload yt-dlp_min.exe Windows binary
id: upload-release-windows-py2exe
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp.exe
asset_name: yt-dlp_min.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp_min.exe
id: sha256_py2exe
run: echo "::set-output name=sha256_py2exe::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_min.exe
id: sha512_py2exe
run: echo "::set-output name=sha512_py2exe::$((Get-FileHash dist\yt-dlp.exe -Algorithm SHA512).Hash.ToLower())"
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp","yt-dlp_min")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp.exe
dist/yt-dlp_min.exe
dist/yt-dlp_win.zip
compression-level: 0
windows32:
needs: process
if: inputs.windows32
build_windows32:
runs-on: windows-latest
needs: build_unix
outputs:
sha256_win32: ${{ steps.sha256_win32.outputs.sha256_win32 }}
sha512_win32: ${{ steps.sha512_win32.outputs.sha512_win32 }}
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.8"
architecture: "x86"
- name: Install Requirements
run: |
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-6.7.0-py3-none-any.whl"
- uses: actions/checkout@v2
# 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
- name: Set up Python 3.7 32-Bit
uses: actions/setup-python@v2
with:
python-version: '3.7'
architecture: 'x86'
- name: Install Requirements
run: |
python -m pip install --upgrade pip setuptools wheel
pip install "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-4.5.1-py3-none-any.whl" mutagen pycryptodomex websockets
- name: Bump version
id: bump_version
env:
version_suffix: ${{ needs.build_unix.outputs.version_suffix }}
run: python devscripts/update-version.py ${{ env.version_suffix }}
- name: Build lazy extractors
id: lazy_extractors
run: python devscripts/make_lazy_extractors.py
- name: Run PyInstaller Script for 32 Bit
run: python pyinst.py
- name: Upload Executable yt-dlp_x86.exe
id: upload-release-windows32
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./dist/yt-dlp_x86.exe
asset_name: yt-dlp_x86.exe
asset_content_type: application/vnd.microsoft.portable-executable
- name: Get SHA2-256SUMS for yt-dlp_x86.exe
id: sha256_win32
run: echo "::set-output name=sha256_win32::$((Get-FileHash dist\yt-dlp_x86.exe -Algorithm SHA256).Hash.ToLower())"
- name: Get SHA2-512SUMS for yt-dlp_x86.exe
id: sha512_win32
run: echo "::set-output name=sha512_win32::$((Get-FileHash dist\yt-dlp_x86.exe -Algorithm SHA512).Hash.ToLower())"
- name: Prepare
run: |
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python -m bundle.pyinstaller
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp_x86")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_x86.exe
compression-level: 0
meta_files:
if: always() && !cancelled()
needs:
- process
- unix
- linux_static
- linux_arm
- macos
- macos_legacy
- windows
- windows32
finish:
runs-on: ubuntu-latest
needs: [build_unix, build_windows, build_windows32, build_macos]
steps:
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-bin-*
merge-multiple: true
- name: Make SHA2-SUMS files
run: |
cd ./artifact/
# make sure SHA sums are also printed to stdout
sha256sum -- * | tee ../SHA2-256SUMS
sha512sum -- * | tee ../SHA2-512SUMS
- name: Make Update spec
run: |
cat >> _update_spec << EOF
# This file is used for regulating self-update
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
EOF
- name: Sign checksum files
env:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
if: env.GPG_SIGNING_KEY != ''
run: |
gpg --batch --import <<< "${{ secrets.GPG_SIGNING_KEY }}"
for signfile in ./SHA*SUMS; do
gpg --batch --detach-sign "$signfile"
done
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-${{ github.job }}
path: |
_update_spec
SHA*SUMS*
compression-level: 0
overwrite: true
- name: Make SHA2-256SUMS file
env:
SHA256_BIN: ${{ needs.build_unix.outputs.sha256_bin }}
SHA256_TAR: ${{ needs.build_unix.outputs.sha256_tar }}
SHA256_WIN: ${{ needs.build_windows.outputs.sha256_win }}
SHA256_PY2EXE: ${{ needs.build_windows.outputs.sha256_py2exe }}
SHA256_WIN_ZIP: ${{ needs.build_windows.outputs.sha256_win_zip }}
SHA256_WIN32: ${{ needs.build_windows32.outputs.sha256_win32 }}
SHA256_MACOS: ${{ needs.build_macos.outputs.sha256_macos }}
SHA256_MACOS_ZIP: ${{ needs.build_macos.outputs.sha256_macos_zip }}
run: |
echo "${{ env.SHA256_BIN }} yt-dlp" >> SHA2-256SUMS
echo "${{ env.SHA256_TAR }} yt-dlp.tar.gz" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN }} yt-dlp.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_PY2EXE }} yt-dlp_min.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN32 }} yt-dlp_x86.exe" >> SHA2-256SUMS
echo "${{ env.SHA256_WIN_ZIP }} yt-dlp_win.zip" >> SHA2-256SUMS
echo "${{ env.SHA256_MACOS }} yt-dlp_macos" >> SHA2-256SUMS
echo "${{ env.SHA256_MACOS_ZIP }} yt-dlp_macos.zip" >> SHA2-256SUMS
- name: Upload 256SUMS file
id: upload-sums
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./SHA2-256SUMS
asset_name: SHA2-256SUMS
asset_content_type: text/plain
- name: Make SHA2-512SUMS file
env:
SHA512_BIN: ${{ needs.build_unix.outputs.sha512_bin }}
SHA512_TAR: ${{ needs.build_unix.outputs.sha512_tar }}
SHA512_WIN: ${{ needs.build_windows.outputs.sha512_win }}
SHA512_PY2EXE: ${{ needs.build_windows.outputs.sha512_py2exe }}
SHA512_WIN_ZIP: ${{ needs.build_windows.outputs.sha512_win_zip }}
SHA512_WIN32: ${{ needs.build_windows32.outputs.sha512_win32 }}
SHA512_MACOS: ${{ needs.build_macos.outputs.sha512_macos }}
SHA512_MACOS_ZIP: ${{ needs.build_macos.outputs.sha512_macos_zip }}
run: |
echo "${{ env.SHA512_BIN }} yt-dlp" >> SHA2-512SUMS
echo "${{ env.SHA512_TAR }} yt-dlp.tar.gz" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN }} yt-dlp.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN_ZIP }} yt-dlp_win.zip" >> SHA2-512SUMS
echo "${{ env.SHA512_PY2EXE }} yt-dlp_min.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_WIN32 }} yt-dlp_x86.exe" >> SHA2-512SUMS
echo "${{ env.SHA512_MACOS }} yt-dlp_macos" >> SHA2-512SUMS
echo "${{ env.SHA512_MACOS_ZIP }} yt-dlp_macos.zip" >> SHA2-512SUMS
- name: Upload 512SUMS file
id: upload-512sums
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.build_unix.outputs.upload_url }}
asset_path: ./SHA2-512SUMS
asset_name: SHA2-512SUMS
asset_content_type: text/plain


@ -1,65 +0,0 @@
name: "CodeQL"
on:
push:
branches: [ 'master', 'gh-pages', 'release' ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ 'master' ]
schedule:
- cron: '59 11 * * 5'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Use only 'java' to analyze code written in Java, Kotlin or both
# Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"


@ -1,32 +1,5 @@
name: Core Tests
on:
push:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
pull_request:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
permissions:
contents: read
concurrency:
group: core-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
on: [push, pull_request]
jobs:
tests:
name: Core Tests
@ -35,27 +8,24 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest]
# CPython 3.8 is in quick-test
python-version: ['3.9', '3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
os: [ubuntu-18.04]
# py3.9 is in quick-test
python-version: [3.7, 3.8, 3.10-dev, pypy-3.6, pypy-3.7]
run-tests-ext: [sh]
include:
# atleast one of each CPython/PyPy tests must be in windows
# atleast one of the tests must be in windows
- os: windows-latest
python-version: '3.8'
- os: windows-latest
python-version: '3.12'
- os: windows-latest
python-version: pypy-3.9
python-version: 3.6
run-tests-ext: bat
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include test --include curl-cffi
- name: Install pytest
run: pip install pytest
- name: Run tests
continue-on-error: False
run: |
python3 -m yt_dlp -v || true # Print debug head
python3 ./devscripts/run_tests.py core
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} core
# Linter is in quick-test


@ -1,48 +1,28 @@
name: Download Tests
on: [push, pull_request]
permissions:
contents: read
jobs:
quick:
name: Quick Download Tests
tests:
name: Download Tests
if: "contains(github.event.head_commit.message, 'ci run dl')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.9
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
continue-on-error: true
run: python3 ./devscripts/run_tests.py download
full:
name: Full Download Tests
if: "contains(github.event.head_commit.message, 'ci run dl all')"
runs-on: ${{ matrix.os }}
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ['3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
os: [ubuntu-18.04]
python-version: [3.7, 3.8, 3.9, 3.10-dev, pypy-3.6, pypy-3.7]
run-tests-ext: [sh]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.8'
- os: windows-latest
python-version: pypy-3.9
python-version: 3.6
run-tests-ext: bat
steps:
- uses: actions/checkout@v4
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include dev
- name: Install pytest
run: pip install pytest
- name: Run tests
continue-on-error: true
run: python3 ./devscripts/run_tests.py download
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} download


@ -1,39 +1,33 @@
name: Quick Test
on: [push, pull_request]
permissions:
contents: read
jobs:
tests:
name: Core Test
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python 3.8
uses: actions/setup-python@v5
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.8'
python-version: 3.9
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include test
run: pip install pytest pycryptodomex
- name: Run tests
run: |
python3 -m yt_dlp -v || true
python3 ./devscripts/run_tests.py core
check:
name: Code check
run: ./devscripts/run_tests.sh core
flake8:
name: Linter
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
- uses: actions/checkout@v2
- name: Set up Python
uses: actions/setup-python@v2
with:
python-version: '3.8'
- name: Install dev dependencies
run: python3 ./devscripts/install_deps.py -o --include static-analysis
python-version: 3.9
- name: Install flake8
run: pip install flake8
- name: Make lazy extractors
run: python3 ./devscripts/make_lazy_extractors.py
- name: Run ruff
run: ruff check --output-format github .
- name: Run autopep8
run: autopep8 --diff .
run: python devscripts/make_lazy_extractors.py
- name: Run flake8
run: flake8 .


@ -1,30 +0,0 @@
name: Release (master)
on:
push:
branches:
- master
paths:
- "yt_dlp/**.py"
- "!yt_dlp/version.py"
- "bundle/*.py"
- "pyproject.toml"
- "Makefile"
- ".github/workflows/build.yml"
concurrency:
group: release-master
permissions:
contents: read
jobs:
release:
if: vars.BUILD_MASTER != ''
uses: ./.github/workflows/release.yml
with:
prerelease: true
source: master
permissions:
contents: write
packages: write # For package cache
actions: write # For cleaning up cache
id-token: write # mandatory for trusted publishing
secrets: inherit


@ -1,43 +0,0 @@
name: Release (nightly)
on:
schedule:
- cron: '23 23 * * *'
permissions:
contents: read
jobs:
check_nightly:
if: vars.BUILD_NIGHTLY != ''
runs-on: ubuntu-latest
outputs:
commit: ${{ steps.check_for_new_commits.outputs.commit }}
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for new commits
id: check_for_new_commits
run: |
relevant_files=(
"yt_dlp/*.py"
':!yt_dlp/version.py'
"bundle/*.py"
"pyproject.toml"
"Makefile"
".github/workflows/build.yml"
)
echo "commit=$(git log --format=%H -1 --since="24 hours ago" -- "${relevant_files[@]}")" | tee "$GITHUB_OUTPUT"
release:
needs: [check_nightly]
if: ${{ needs.check_nightly.outputs.commit }}
uses: ./.github/workflows/release.yml
with:
prerelease: true
source: nightly
permissions:
contents: write
packages: write # For package cache
actions: write # For cleaning up cache
id-token: write # mandatory for trusted publishing
secrets: inherit


@ -1,384 +0,0 @@
name: Release
on:
workflow_call:
inputs:
prerelease:
required: false
default: true
type: boolean
source:
required: false
default: ''
type: string
target:
required: false
default: ''
type: string
version:
required: false
default: ''
type: string
workflow_dispatch:
inputs:
source:
description: |
SOURCE of this release's updates:
channel, repo, tag, or channel/repo@tag
(default: <current_repo>)
required: false
default: ''
type: string
target:
description: |
TARGET to publish this release to:
channel, tag, or channel@tag
(default: <source> if writable else <current_repo>[@source_tag])
required: false
default: ''
type: string
version:
description: |
VERSION: yyyy.mm.dd[.rev] or rev
(default: auto-generated)
required: false
default: ''
type: string
prerelease:
description: Pre-release
default: false
type: boolean
permissions:
contents: read
jobs:
prepare:
permissions:
contents: write
runs-on: ubuntu-latest
outputs:
channel: ${{ steps.setup_variables.outputs.channel }}
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
target_repo_token: ${{ steps.setup_variables.outputs.target_repo_token }}
target_tag: ${{ steps.setup_variables.outputs.target_tag }}
pypi_project: ${{ steps.setup_variables.outputs.pypi_project }}
pypi_suffix: ${{ steps.setup_variables.outputs.pypi_suffix }}
head_sha: ${{ steps.get_target.outputs.head_sha }}
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Process inputs
id: process_inputs
run: |
cat << EOF
::group::Inputs
prerelease=${{ inputs.prerelease }}
source=${{ inputs.source }}
target=${{ inputs.target }}
version=${{ inputs.version }}
::endgroup::
EOF
IFS='@' read -r source_repo source_tag <<<"${{ inputs.source }}"
IFS='@' read -r target_repo target_tag <<<"${{ inputs.target }}"
cat << EOF >> "$GITHUB_OUTPUT"
source_repo=${source_repo}
source_tag=${source_tag}
target_repo=${target_repo}
target_tag=${target_tag}
EOF
- name: Setup variables
id: setup_variables
env:
source_repo: ${{ steps.process_inputs.outputs.source_repo }}
source_tag: ${{ steps.process_inputs.outputs.source_tag }}
target_repo: ${{ steps.process_inputs.outputs.target_repo }}
target_tag: ${{ steps.process_inputs.outputs.target_tag }}
run: |
# unholy bash monstrosity (sincere apologies)
fallback_token () {
if ${{ !secrets.ARCHIVE_REPO_TOKEN }}; then
echo "::error::Repository access secret ${target_repo_token^^} not found"
exit 1
fi
target_repo_token=ARCHIVE_REPO_TOKEN
return 0
}
source_is_channel=0
[[ "${source_repo}" == 'stable' ]] && source_repo='yt-dlp/yt-dlp'
if [[ -z "${source_repo}" ]]; then
source_repo='${{ github.repository }}'
elif [[ '${{ vars[format('{0}_archive_repo', env.source_repo)] }}' ]]; then
source_is_channel=1
source_channel='${{ vars[format('{0}_archive_repo', env.source_repo)] }}'
elif [[ -z "${source_tag}" && "${source_repo}" != */* ]]; then
source_tag="${source_repo}"
source_repo='${{ github.repository }}'
fi
resolved_source="${source_repo}"
if [[ "${source_tag}" ]]; then
resolved_source="${resolved_source}@${source_tag}"
elif [[ "${source_repo}" == 'yt-dlp/yt-dlp' ]]; then
resolved_source='stable'
fi
revision="${{ (inputs.prerelease || !vars.PUSH_VERSION_COMMIT) && '$(date -u +"%H%M%S")' || '' }}"
version="$(
python devscripts/update-version.py \
-c "${resolved_source}" -r "${{ github.repository }}" ${{ inputs.version || '$revision' }} | \
grep -Po "version=\K\d+\.\d+\.\d+(\.\d+)?")"
if [[ "${target_repo}" ]]; then
if [[ -z "${target_tag}" ]]; then
if [[ '${{ vars[format('{0}_archive_repo', env.target_repo)] }}' ]]; then
target_tag="${source_tag:-${version}}"
else
target_tag="${target_repo}"
target_repo='${{ github.repository }}'
fi
fi
if [[ "${target_repo}" != '${{ github.repository}}' ]]; then
target_repo='${{ vars[format('{0}_archive_repo', env.target_repo)] }}'
target_repo_token='${{ env.target_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.target_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.target_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.target_repo)] }}'
fi
else
target_tag="${source_tag:-${version}}"
if ((source_is_channel)); then
target_repo="${source_channel}"
target_repo_token='${{ env.source_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.source_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.source_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.source_repo)] }}'
else
target_repo='${{ github.repository }}'
fi
fi
if [[ "${target_repo}" == '${{ github.repository }}' ]] && ${{ !inputs.prerelease }}; then
pypi_project='${{ vars.PYPI_PROJECT }}'
fi
echo "::group::Output variables"
cat << EOF | tee -a "$GITHUB_OUTPUT"
channel=${resolved_source}
version=${version}
target_repo=${target_repo}
target_repo_token=${target_repo_token}
target_tag=${target_tag}
pypi_project=${pypi_project}
pypi_suffix=${pypi_suffix}
EOF
echo "::endgroup::"
- name: Update documentation
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
python devscripts/update_changelog.py -vv
make doc
- name: Push to release
id: push_release
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
git config --global user.name "github-actions[bot]"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git add -u
git commit -m "Release ${{ env.version }}" \
-m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
git push origin --force ${{ github.event.ref }}:release
- name: Get target commitish
id: get_target
run: |
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
- name: Update master
env:
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
vars.PUSH_VERSION_COMMIT != '' && !inputs.prerelease && env.target_repo == github.repository
run: git push origin ${{ github.event.ref }}
build:
needs: prepare
uses: ./.github/workflows/build.yml
with:
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
origin: ${{ needs.prepare.outputs.target_repo }}
permissions:
contents: read
packages: write # For package cache
actions: write # For cleaning up cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
publish_pypi:
needs: [prepare, build]
if: ${{ needs.prepare.outputs.pypi_project }}
runs-on: ubuntu-latest
permissions:
id-token: write # mandatory for trusted publishing
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Requirements
run: |
sudo apt -y install pandoc man
python devscripts/install_deps.py -o --include build
- name: Prepare
env:
version: ${{ needs.prepare.outputs.version }}
suffix: ${{ needs.prepare.outputs.pypi_suffix }}
channel: ${{ needs.prepare.outputs.channel }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
pypi_project: ${{ needs.prepare.outputs.pypi_project }}
run: |
python devscripts/update-version.py -c "${{ env.channel }}" -r "${{ env.target_repo }}" -s "${{ env.suffix }}" "${{ env.version }}"
python devscripts/update_changelog.py -vv
python devscripts/make_lazy_extractors.py
sed -i -E '0,/(name = ")[^"]+(")/s//\1${{ env.pypi_project }}\2/' pyproject.toml
- name: Build
run: |
rm -rf dist/*
make pypi-files
printf '%s\n\n' \
'Official repository: <https://github.com/yt-dlp/yt-dlp>' \
'**PS**: Some links in this document will not work since this is a copy of the README.md from Github' > ./README.md.new
cat ./README.md >> ./README.md.new && mv -f ./README.md.new ./README.md
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
make clean-cache
python -m build --no-isolation .
- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
verbose: true
publish:
needs: [prepare, build]
permissions:
contents: write
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-*
merge-multiple: true
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Generate release notes
env:
head_sha: ${{ needs.prepare.outputs.head_sha }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
run: |
printf '%s' \
'[![Installation](https://img.shields.io/badge/-Which%20file%20to%20download%3F-white.svg?style=for-the-badge)]' \
'(https://github.com/${{ github.repository }}#installation "Installation instructions") ' \
'[![Discord](https://img.shields.io/discord/807245652072857610?color=blue&labelColor=555555&label=&logo=discord&style=for-the-badge)]' \
'(https://discord.gg/H5MNcFW63r "Discord") ' \
'[![Donate](https://img.shields.io/badge/_-Donate-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)]' \
'(https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators "Donate") ' \
'[![Documentation](https://img.shields.io/badge/-Docs-brightgreen.svg?style=for-the-badge&logo=GitBook&labelColor=555555)]' \
'(https://github.com/${{ github.repository }}' \
'${{ env.target_repo == github.repository && format('/tree/{0}', env.target_tag) || '' }}#readme "Documentation") ' \
${{ env.target_repo == 'yt-dlp/yt-dlp' && '\
"[![Nightly](https://img.shields.io/badge/Nightly%20builds-purple.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-nightly-builds/releases/latest \"Nightly builds\") " \
"[![Master](https://img.shields.io/badge/Master%20builds-lightblue.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-master-builds/releases/latest \"Master builds\")"' || '' }} > ./RELEASE_NOTES
printf '\n\n' >> ./RELEASE_NOTES
cat >> ./RELEASE_NOTES << EOF
#### A description of the various files are in the [README](https://github.com/${{ github.repository }}#release-files)
---
$(python ./devscripts/make_changelog.py -vv --collapsible)
EOF
printf '%s\n\n' '**This is a pre-release build**' >> ./PRERELEASE_NOTES
cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
printf '%s\n\n' 'Generated from: https://github.com/${{ github.repository }}/commit/${{ env.head_sha }}' >> ./ARCHIVE_NOTES
cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES
- name: Publish to archive repo
env:
GH_TOKEN: ${{ secrets[needs.prepare.outputs.target_repo_token] }}
GH_REPO: ${{ needs.prepare.outputs.target_repo }}
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
if: |
inputs.prerelease && env.GH_TOKEN != '' && env.GH_REPO != '' && env.GH_REPO != github.repository
run: |
title="${{ startswith(env.GH_REPO, 'yt-dlp/') && 'yt-dlp ' || '' }}${{ env.channel }}"
gh release create \
--notes-file ARCHIVE_NOTES \
--title "${title} ${{ env.version }}" \
${{ env.version }} \
artifact/*
- name: Prune old release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
if: |
env.target_repo == github.repository && env.target_tag != env.version
run: |
gh release delete --yes --cleanup-tag "${{ env.target_tag }}" || true
git tag --delete "${{ env.target_tag }}" || true
sleep 5 # Enough time to cover deletion race condition
- name: Publish release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
head_sha: ${{ needs.prepare.outputs.head_sha }}
if: |
env.target_repo == github.repository
run: |
title="${{ github.repository == 'yt-dlp/yt-dlp' && 'yt-dlp ' || '' }}"
title+="${{ env.target_tag != env.version && format('{0} ', env.target_tag) || '' }}"
gh release create \
--notes-file ${{ inputs.prerelease && 'PRERELEASE_NOTES' || 'RELEASE_NOTES' }} \
--target ${{ env.head_sha }} \
--title "${title}${{ env.version }}" \
${{ inputs.prerelease && '--prerelease' || '' }} \
${{ env.target_tag }} \
artifact/*

.gitignore vendored

@ -14,26 +14,19 @@ cookies
*.frag.urls
*.info.json
*.live_chat.json
*.meta
*.part*
*.tmp
*.temp
*.unknown_video
*.ytdl
.cache/
*.3gp
*.ape
*.ass
*.avi
*.desktop
*.f4v
*.flac
*.flv
*.gif
*.jpeg
*.jpg
*.lrc
*.m4a
*.m4v
*.mhtml
@ -41,18 +34,13 @@ cookies
*.mov
*.mp3
*.mp4
*.mpg
*.mpga
*.oga
*.ogg
*.opus
*.png
*.sbv
*.srt
*.ssa
*.swf
*.swp
*.tt
*.ttml
*.url
*.vtt
@ -67,7 +55,7 @@ cookies
# Python
*.pyc
*.pyo
.*_cache
.pytest_cache
wine-py2exe/
py2exe.log
build/
@ -75,7 +63,6 @@ dist/
zip/
tmp/
venv/
.venv/
completions/
# Misc
@ -91,8 +78,6 @@ updates_key.pem
*.egg-info
.tox
*.class
*.isorted
*.stackdump
# Generated
AUTHORS
@ -104,7 +89,7 @@ README.txt
*.tar.gz
*.zsh
*.spec
test/testdata/sigs/player-*.js
test/testdata/player-*.js
# Binary
/youtube-dl
@ -118,11 +103,11 @@ yt-dlp.zip
*.iml
.vscode
*.sublime-*
*.code-workspace
# Lazy extractors
*/extractor/lazy_extractors.py
# Plugins
ytdlp_plugins/
yt-dlp-plugins
ytdlp_plugins/extractor/*
!ytdlp_plugins/extractor/__init__.py
!ytdlp_plugins/extractor/sample.py


@ -1,14 +0,0 @@
repos:
- repo: local
hooks:
- id: linter
name: Apply linter fixes
entry: ruff check --fix .
language: system
types: [python]
require_serial: true
- id: format
name: Apply formatting fixes
entry: autopep8 --in-place .
language: system
types: [python]


@ -1,9 +0,0 @@
repos:
- repo: local
hooks:
- id: fix
name: Apply code fixes
entry: hatch fmt
language: system
types: [python]
require_serial: true

.readthedocs.yml Normal file

@ -0,0 +1,22 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Build documentation in the docs/ directory with Sphinx
sphinx:
configuration: docs/conf.py
# Optionally build your docs in additional formats such as PDF
formats:
- epub
- pdf
- htmlzip
# Optionally set the version of Python and requirements required to build your docs
python:
version: 3
install:
- requirements: docs/requirements.txt


@ -11,7 +11,6 @@ # CONTRIBUTING TO YT-DLP
- [Is anyone going to need the feature?](#is-anyone-going-to-need-the-feature)
- [Is your question about yt-dlp?](#is-your-question-about-yt-dlp)
- [Are you willing to share account details if needed?](#are-you-willing-to-share-account-details-if-needed)
- [Is the website primarily used for piracy](#is-the-website-primarily-used-for-piracy)
- [DEVELOPER INSTRUCTIONS](#developer-instructions)
- [Adding new feature or making overarching changes](#adding-new-feature-or-making-overarching-changes)
- [Adding support for a new site](#adding-support-for-a-new-site)
@ -20,12 +19,10 @@ # CONTRIBUTING TO YT-DLP
- [Provide fallbacks](#provide-fallbacks)
- [Regular expressions](#regular-expressions)
- [Long lines policy](#long-lines-policy)
- [Quotes](#quotes)
- [Inline values](#inline-values)
- [Collapse fallbacks](#collapse-fallbacks)
- [Trailing parentheses](#trailing-parentheses)
- [Use convenience conversion and parsing functions](#use-convenience-conversion-and-parsing-functions)
- [My pull request is labeled pending-fixes](#my-pull-request-is-labeled-pending-fixes)
- [EMBEDDING YT-DLP](README.md#embedding-yt-dlp)
@ -34,9 +31,9 @@ # OPENING AN ISSUE
Bugs and suggestions should be reported at: [yt-dlp/yt-dlp/issues](https://github.com/yt-dlp/yt-dlp/issues). Unless you were prompted to or there is another pertinent reason (e.g. GitHub fails to accept the bug report), please do not send bug reports via personal email. For discussions, join us in our [discord server](https://discord.gg/H5MNcFW63r).
**Please include the full output of yt-dlp when run with `-vU`**, i.e. **add** `-vU` flag to **your command line**, copy the **whole** output and post it in the issue body wrapped in \`\`\` for better formatting. It should look similar to this:
**Please include the full output of yt-dlp when run with `-Uv`**, i.e. **add** `-Uv` flag to **your command line**, copy the **whole** output and post it in the issue body wrapped in \`\`\` for better formatting. It should look similar to this:
```
$ yt-dlp -vU <your command line>
$ yt-dlp -Uv <your command line>
[debug] Command-line config: ['-v', 'demo.com']
[debug] Encodings: locale UTF-8, fs utf-8, out utf-8, pref UTF-8
[debug] yt-dlp version 2021.09.25 (zip)
@ -67,7 +64,7 @@ ### Is the description of the issue itself sufficient?
If your report is shorter than two lines, it is almost certainly missing some of these, which makes it hard for us to respond to it. We're often too polite to close the issue outright, but the missing info makes misinterpretation likely. We often get frustrated by these issues, since the only possible way for us to move forward on them is to ask for clarification over and over.
For bug reports, this means that your report should contain the **complete** output of yt-dlp when called with the `-vU` flag. The error message you get for (most) bugs even says so, but you would not believe how many of our bug reports do not contain this information.
For bug reports, this means that your report should contain the **complete** output of yt-dlp when called with the `-Uv` flag. The error message you get for (most) bugs even says so, but you would not believe how many of our bug reports do not contain this information.
If the error is `ERROR: Unable to extract ...` and you cannot reproduce it from multiple countries, add `--write-pages` and upload the `.dump` files you get [somewhere](https://gist.github.com).
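For illustration, a minimal way to capture such a report might look like the following (the URL and log filename are placeholders, not part of the guide):
```
# Capture the complete verbose log; yt-dlp prints its debug output to stderr
yt-dlp -vU "https://example.com/some/video" 2>&1 | tee yt-dlp-verbose.log

# For "Unable to extract ..." errors, also write the fetched pages to .dump files
yt-dlp -vU --write-pages "https://example.com/some/video"
```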
@ -79,7 +76,7 @@ ### Are you using the latest version?
### Is the issue already documented?
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, subscribe to it to be notified when there is any progress. Unless you have something useful to add to the conversation, please refrain from commenting.
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, feel free to write something along the lines of "This affects me as well, with version 2021.01.01. Here is some more information on the issue: ...". While some issues may be old, a new post into them often spurs rapid activity.
Additionally, it is also helpful to see if the issue has already been documented in the [youtube-dl issue tracker](https://github.com/ytdl-org/youtube-dl/issues). If similar issues have already been reported in youtube-dl (but not in our issue tracker), links to them can be included in your issue report here.
@ -115,7 +112,7 @@ ### Is your question about yt-dlp?
### Are you willing to share account details if needed?
The maintainers and potential contributors of the project often do not have an account for the website you are asking support for. So any developer interested in solving your issue may ask you for account details. It is your personal discretion whether you are willing to share the account in order for the developer to try and solve your issue. However, if you are unwilling or unable to provide details, they obviously cannot work on the issue and it cannot be solved unless some developer who both has an account and is willing/able to contribute decides to solve it.
The maintainers and potential contributors of the project often do not have an account for the website you are asking support for. So any developer interested in solving your issue may ask you for account details. It is your personal discretion whether you are willing to share the account in order for the developer to try and solve your issue. However, if you are unwilling or unable to provide details, they obviously cannot work on the issue and it cannot be solved unless some developer who both has an account and is willing/able to contribute decides to solve it.
By sharing an account with anyone, you agree to bear all risks associated with it. The maintainers and yt-dlp can't be held responsible for any misuse of the credentials.
@ -125,95 +122,56 @@ ### Are you willing to share account details if needed?
- Change the password before sharing the account to something random (use [this](https://passwordsgenerator.net/) if you don't have a random password generator).
- Change the password after receiving the account back.
### Is the website primarily used for piracy?
We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) not to support services that are primarily used for infringing copyright. Additionally, it has been decided not to support porn sites that specialize in fakes. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).
# DEVELOPER INSTRUCTIONS
Most users do not need to build yt-dlp and can [download the builds](https://github.com/yt-dlp/yt-dlp/releases), get them via [the other installation methods](README.md#installation) or directly run it using `python -m yt_dlp`.
Most users do not need to build yt-dlp and can [download the builds](https://github.com/yt-dlp/yt-dlp/releases) or get them via [the other installation methods](README.md#installation).
`yt-dlp` uses [`hatch`](<https://hatch.pypa.io>) as a project management tool.
You can easily install it using [`pipx`](<https://pipx.pypa.io>) via `pipx install hatch`, or else via `pip` or your package manager of choice. Make sure you are using at least version `1.10.0`, otherwise some functionality might not work as expected.
To run yt-dlp as a developer, you don't need to build anything either. Simply execute
If you plan on contributing to `yt-dlp`, best practice is to start by running the following command:
python -m yt_dlp
```shell
$ hatch run setup
```
To run the tests, simply invoke your favorite test runner, or execute a test file directly; any of the following work:
The above command will install a `pre-commit` hook so that required checks/fixes (linting, formatting) will run automatically before each commit. If any code needs to be linted or formatted, then the commit will be blocked and the necessary changes will be made; you should review all edits and re-commit the fixed version.
After this you can use `hatch shell` to enable a virtual environment that has `yt-dlp` and its development dependencies installed.
In addition, the following script commands can be used to run simple tasks such as linting or testing (without having to run `hatch shell` first):
* `hatch fmt`: Automatically fix linter violations and apply required code formatting changes
* See `hatch fmt --help` for more info
* `hatch test`: Run extractor or core tests
* See `hatch test --help` for more info
python -m unittest discover
python test/test_download.py
nosetests
pytest
See item 6 of [new extractor tutorial](#adding-support-for-a-new-site) for how to run extractor specific test cases.
While it is strongly recommended to use `hatch` for yt-dlp development, if you are unable to do so, alternatively you can manually create a virtual environment and use the following commands:
```shell
# To only install development dependencies:
$ python -m devscripts.install_deps --include dev
# Or, for an editable install plus dev dependencies:
$ python -m pip install -e ".[default,dev]"
# To setup the pre-commit hook:
$ pre-commit install
# To be used in place of `hatch test`:
$ python -m devscripts.run_tests
# To be used in place of `hatch fmt`:
$ ruff check --fix .
$ autopep8 --in-place .
# To only check code instead of applying fixes:
$ ruff check .
$ autopep8 --diff .
```
If you want to create a build of yt-dlp yourself, you can follow the instructions [here](README.md#compile).
## Adding new feature or making overarching changes
Before you start writing code for implementing a new feature, open an issue explaining your feature request and at least one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
Before you start writing code for implementing a new feature, open an issue explaining your feature request and at least one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
The same applies to changes to the documentation, code style, or overarching changes to the architecture.
## Adding support for a new site
If you want to add support for a new site, first of all **make sure** this site is **not dedicated to [copyright infringement](#is-the-website-primarily-used-for-piracy)**. yt-dlp does **not support** such sites thus pull requests adding support for them **will be rejected**.
If you want to add support for a new site, first of all **make sure** this site is **not dedicated to [copyright infringement](https://www.github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free)**. yt-dlp does **not support** such sites thus pull requests adding support for them **will be rejected**.
After you have ensured this site is distributing its content legally, you can follow this quick list (assuming your service is called `yourextractor`):
1. [Fork this repository](https://github.com/yt-dlp/yt-dlp/fork)
1. Check out the source code with:
```shell
$ git clone git@github.com:YOUR_GITHUB_USERNAME/yt-dlp.git
```
git clone git@github.com:YOUR_GITHUB_USERNAME/yt-dlp.git
1. Start a new git branch with
```shell
$ cd yt-dlp
$ git checkout -b yourextractor
```
cd yt-dlp
git checkout -b yourextractor
1. Start with this simple template and save it to `yt_dlp/extractor/yourextractor.py`:
```python
# coding: utf-8
from .common import InfoExtractor
@ -223,21 +181,15 @@ ## Adding support for a new site
'url': 'https://yourextractor.com/watch/42',
'md5': 'TODO: md5 sum of the first 10241 bytes of the video file (use --test)',
'info_dict': {
# For videos, only the 'id' and 'ext' fields are required to RUN the test:
'id': '42',
'ext': 'mp4',
# Then if the test run fails, it will output the missing/incorrect fields.
# Properties can be added as:
# * A value, e.g.
# 'title': 'Video title goes here',
# * MD5 checksum; start the string with 'md5:', e.g.
# 'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
# * A regular expression; start the string with 're:', e.g.
# 'thumbnail': r're:^https?://.*\.jpg$',
# * A count of elements in a list; start the string with 'count:', e.g.
# 'tags': 'count:10',
# * Any Python type, e.g.
# 'view_count': int,
'title': 'Video title goes here',
'thumbnail': r're:^https?://.*\.jpg$',
# TODO more properties, either as:
# * A value
# * MD5 checksum; start the string with md5:
# * A regular expression; start the string with re:
# * Any Python type (for example int or float)
}
}]
@ -256,33 +208,27 @@ ## Adding support for a new site
# TODO more properties (see yt_dlp/extractor/common.py)
}
```
1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`. Also note that when adding a parenthesized import group, the last import in the group must have a trailing comma in order for this formatting to be respected by our code formatter.
1. Run `hatch test YourExtractor`. This *may fail* at first, but you can continually re-run it until you're done. Upon failure, it will output the missing fields and/or correct values which you can copy. If you decide to add more than one test, the tests will then be named `YourExtractor`, `YourExtractor_1`, `YourExtractor_2`, etc. Note that tests with an `only_matching` key in the test's dict are not included in the count. You can also run all the tests in one go with `YourExtractor_all`
1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L119-L440). Add tests and code for as many as you want.
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions), passes [ruff](https://docs.astral.sh/ruff/tutorial/#getting-started) code checks and is properly formatted:
1. Add an import in [`yt_dlp/extractor/extractors.py`](yt_dlp/extractor/extractors.py).
1. Run `python test/test_download.py TestDownload.test_YourExtractor`. This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L91-L426). Add tests and code for as many as you want.
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions) and check the code with [flake8](https://flake8.pycqa.org/en/latest/index.html#quickstart):
```shell
$ hatch fmt --check
```
$ flake8 yt_dlp/extractor/yourextractor.py
You can use `hatch fmt` to automatically fix problems. Rules that the linter/formatter enforces should not be disabled with `# noqa` unless a maintainer requests it. The only exception allowed is for old/printf-style string formatting in GraphQL query templates (use `# noqa: UP031`).
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.8 and above. Backward compatibility is not required for even older versions of Python.
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.6 and above. Backward compatibility is not required for even older versions of Python.
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
```shell
$ git add yt_dlp/extractor/_extractors.py
$ git add yt_dlp/extractor/yourextractor.py
$ git commit -m '[yourextractor] Add extractor'
$ git push origin yourextractor
```
$ git add yt_dlp/extractor/extractors.py
$ git add yt_dlp/extractor/yourextractor.py
$ git commit -m '[yourextractor] Add extractor'
$ git push origin yourextractor
1. Finally, [create a pull request](https://help.github.com/articles/creating-a-pull-request). We'll then review and merge it.
In any case, thank you very much for your contributions!
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true` or your `username`&`password` or `cookiefile`/`cookiesfrombrowser` in it:
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true` or your username and password in it:
```json
{
"username": "your user name",
@ -294,22 +240,18 @@ ## yt-dlp coding conventions
This section introduces guidelines for writing idiomatic, robust and future-proof extractor code.
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the extractor will remain broken.
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the extractor will remain broken.
### Mandatory and optional metafields
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L91-L426) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
- `id` (media identifier)
- `title` (media title)
- `url` (media download URL) or `formats`
The aforementioned metafields are the critical data without which the extraction does not make any sense and if any of them fail to be extracted then the extractor is considered completely broken. While all extractors must return a `title`, they must also allow its extraction to be non-fatal.
For pornographic sites, appropriate `age_limit` must also be returned.
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
The aforementioned metafields are the critical data without which the extraction does not make any sense and if any of them fail to be extracted then the extractor is considered completely broken. While, in fact, only `id` is technically mandatory, yt-dlp also treats `title` as mandatory due to compatibility reasons. The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error`, e.g. when the video is a live stream that has not started yet.
[Any field](yt_dlp/extractor/common.py#L219-L426) apart from the aforementioned ones is considered **optional**. That means that extraction should be **tolerant** to situations when sources for these fields can potentially be unavailable (even if they are always available at the moment) and **future-proof** in order not to break the extraction of general purpose mandatory fields.
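For illustration, a minimal (hypothetical) `_real_extract` returning only the mandatory fields might look like the sketch below; note how `title` is extracted non-fatally:

```python
def _real_extract(self, url):
    video_id = self._match_id(url)
    webpage = self._download_webpage(url, video_id)

    return {
        'id': video_id,                                        # mandatory
        'title': self._og_search_title(webpage, fatal=False),  # mandatory, but extraction must not be fatal
        'url': self._og_search_video_url(webpage),             # or provide a 'formats' list instead
    }
```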
@ -348,10 +290,14 @@ #### Example
The latter will break extraction process with `KeyError` if `summary` disappears from `meta` at some later time but with the former approach extraction will just go ahead with `description` set to `None` which is perfectly fine (remember `None` is equivalent to the absence of data).
If the data is nested, do not use `.get` chains, but instead make use of `traverse_obj`.
If the data is nested, do not use `.get` chains, but instead make use of the utility functions `try_get` or `traverse_obj`
Considering the above `meta` again, assume you want to extract `["user"]["name"]` and put it in the resulting info dict as `uploader`
```python
uploader = try_get(meta, lambda x: x['user']['name']) # correct
```
or
```python
uploader = traverse_obj(meta, ('user', 'name')) # correct
```
@ -365,10 +311,6 @@ #### Example
```python
uploader = meta.get('user', {}).get('name') # incorrect
```
or
```python
uploader = try_get(meta, lambda x: x['user']['name']) # old utility
```
Similarly, you should pass `fatal=False` when extracting optional data from a webpage with `_search_regex`, `_html_search_regex` or similar methods, for instance:
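(the regex and field name below are purely illustrative, not taken from any particular extractor)

```python
description = self._search_regex(
    r'<meta[^>]+name="description"[^>]+content="([^"]+)"',
    webpage, 'description', fatal=False)  # a failed match returns None instead of raising
```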
@ -394,42 +336,26 @@ #### Example
Another thing to remember is not to try to iterate over `None`
Say you extracted a list of thumbnails into `thumbnail_data` and want to iterate over them
Say you extracted a list of thumbnails into `thumbnail_data` using `try_get` and now want to iterate over them
```python
thumbnail_data = data.get('thumbnails') or []
thumbnail_data = try_get(...)
thumbnails = [{
'url': item['url'],
'height': item.get('h'),
} for item in thumbnail_data if item.get('url')] # correct
'url': item['url']
} for item in thumbnail_data or []] # correct
```
and not like:
```python
thumbnail_data = data.get('thumbnails')
thumbnail_data = try_get(...)
thumbnails = [{
'url': item['url'],
'height': item.get('h'),
'url': item['url']
} for item in thumbnail_data] # incorrect
```
In this case, `thumbnail_data` will be `None` if the field was not found and this will cause the loop `for item in thumbnail_data` to raise a fatal error. Using `or []` avoids this error and results in setting an empty list in `thumbnails` instead.
In the latter case, `thumbnail_data` will be `None` if the field was not found and this will cause the loop `for item in thumbnail_data` to raise a fatal error. Using `for item in thumbnail_data or []` avoids this error and results in setting an empty list in `thumbnails` instead.
Alternatively, this can be further simplified by using `traverse_obj`
```python
thumbnails = [{
'url': item['url'],
'height': item.get('h'),
} for item in traverse_obj(data, ('thumbnails', lambda _, v: v['url']))]
```
or, even better,
```python
thumbnails = traverse_obj(data, ('thumbnails', ..., {'url': 'url', 'height': 'h'}))
```
### Provide fallbacks
@ -438,21 +364,21 @@ ### Provide fallbacks
#### Example
Say `meta` from the previous example has a `title` and you are about to extract it like:
Say `meta` from the previous example has a `title` and you are about to extract it. Since `title` is a mandatory meta field you should end up with something like:
```python
title = meta.get('title')
title = meta['title']
```
If `title` disappears from `meta` in future due to some changes on the hoster's side the title extraction would fail.
If `title` disappears from `meta` in future due to some changes on the hoster's side the extraction would fail since `title` is mandatory. That's expected.
Assume that you have another source you can extract `title` from, for example the `og:title` HTML meta of a `webpage`. In this case you can provide a fallback like:
Assume that you have another source you can extract `title` from, for example the `og:title` HTML meta of a `webpage`. In this case you can provide a fallback scenario:
```python
title = meta.get('title') or self._og_search_title(webpage)
```
This code will try to extract from `meta` first and if it fails it will try extracting `og:title` from a `webpage`, making the extractor more robust.
This code will try to extract from `meta` first and if it fails it will try extracting `og:title` from a `webpage`.
### Regular expressions
@ -495,7 +421,7 @@ ##### Example
r'<span[^>]+class="title"[^>]*>([^<]+)', webpage, 'title')
```
which tolerates potential changes in the `style` attribute's value. Or even better:
Or even better:
```python
title = self._search_regex( # correct
@ -503,7 +429,7 @@ ##### Example
webpage, 'title', group='title')
```
which also handles single quotes in addition to double quotes.
Note how you tolerate potential changes in the `style` attribute's value or a switch from double quotes to single quotes for the `class` attribute:
The code definitely should not look like:
@ -521,54 +447,15 @@ ##### Example
webpage, 'title', group='title')
```
Here the presence or absence of other attributes including `style` is irrelevant for the data we need, and so the regex must not depend on it
#### Keep the regular expressions as simple as possible, but no simpler
Since many extractors deal with unstructured data provided by websites, we will often need to use very complex regular expressions. You should try to use the *simplest* regex that can accomplish what you want. In other words, each part of the regex must have a reason for existing. If you can take out a symbol and the functionality does not change, the symbol should not be there.
##### Example
Correct:
```python
_VALID_URL = r'https?://(?:www\.)?website\.com/(?:[^/]+/){3,4}(?P<display_id>[^/]+)_(?P<id>\d+)'
```
Incorrect:
```python
_VALID_URL = r'https?:\/\/(?:www\.)?website\.com\/[^\/]+/[^\/]+/[^\/]+(?:\/[^\/]+)?\/(?P<display_id>[^\/]+)_(?P<id>\d+)'
```
#### Do not misuse `.` and use the correct quantifiers (`+*?`)
Avoid creating regexes that over-match because of wrong use of quantifiers. Also try to avoid non-greedy matching (`?`) where possible since it could easily result in [catastrophic backtracking](https://www.regular-expressions.info/catastrophic.html)
Correct:
```python
title = self._search_regex(r'<span\b[^>]+class="title"[^>]*>([^<]+)', webpage, 'title')
```
Incorrect:
```python
title = self._search_regex(r'<span\b.*class="title".*>(.+?)<', webpage, 'title')
```
Here the presence or absence of other attributes including `style` is irrelevant for the data we need, and so the regex must not depend on it
### Long lines policy
There is a soft limit to keep lines of code under 100 characters long. This means it should be respected if possible and if it does not make readability and code maintenance worse. Sometimes, it may be reasonable to go up to 120 characters and sometimes even 80 can be unreadable. Keep in mind that this is not a hard limit and is just one of many tools to make the code more readable.
There is a soft limit to keep lines of code under 100 characters long. This means it should be respected if possible and if it does not make readability and code maintenance worse. Sometimes, it may be reasonable to go up to 120 characters and sometimes even 80 can be unreadable. Keep in mind that this is not a hard limit and is just one of many tools to make the code more readable
For example, you should **never** split long string literals like URLs or some other often copied entities over multiple lines to fit this limit:
Conversely, don't unnecessarily split small lines further. As a rule of thumb, if removing the line split keeps the code under 80 characters, it should be a single line.
##### Examples
Correct:
```python
@ -582,73 +469,24 @@ ##### Examples
'PLMYEtVRpaqY00V9W81Cwmzp6N6vZqfUKD4'
```
Correct:
```python
uploader = traverse_obj(info, ('uploader', 'name'), ('author', 'fullname'))
```
Incorrect:
```python
uploader = traverse_obj(
info,
('uploader', 'name'),
('author', 'fullname'))
```
Correct:
```python
formats = self._extract_m3u8_formats(
m3u8_url, video_id, 'mp4', 'm3u8_native', m3u8_id='hls',
note='Downloading HD m3u8 information', errnote='Unable to download HD m3u8 information')
```
Incorrect:
```python
formats = self._extract_m3u8_formats(m3u8_url,
video_id,
'mp4',
'm3u8_native',
m3u8_id='hls',
note='Downloading HD m3u8 information',
errnote='Unable to download HD m3u8 information')
```
### Quotes
Always use single quotes for strings (even if the string has `'`) and double quotes for docstrings. Use `'''` only for multi-line strings. An exception can be made if a string has multiple single quotes in it and escaping makes it *significantly* harder to read. For f-strings, you can use double quotes on the inside. But avoid f-strings that have too many quotes inside.
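A small, purely illustrative snippet of these conventions:

```python
def status_line(fmt):
    """Build a short status message."""         # double quotes for docstrings
    note = 'Downloading media'                   # single quotes for ordinary strings
    quirk = "it's the user's choice, isn't it"   # exception: several single quotes inside
    return f'{note}: "{fmt}" ({quirk})'          # double quotes are fine inside an f-string
```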
### Inline values
Extracting variables is acceptable for reducing code duplication and improving readability of complex expressions. However, you should avoid extracting variables used only once and moving them to opposite parts of the extractor file, which makes reading the linear flow difficult.
#### Examples
#### Example
Correct:
```python
return {
'title': self._html_search_regex(r'<h1>([^<]+)</h1>', webpage, 'title'),
# ...some lines of code...
}
title = self._html_search_regex(r'<title>([^<]+)</title>', webpage, 'title')
```
Incorrect:
```python
TITLE_RE = r'<h1>([^<]+)</h1>'
TITLE_RE = r'<title>([^<]+)</title>'
# ...some lines of code...
title = self._html_search_regex(TITLE_RE, webpage, 'title')
# ...some lines of code...
return {
'title': title,
# ...some lines of code...
}
```
@ -680,96 +518,56 @@ #### Example
### Trailing parentheses
Always move trailing parentheses used for grouping/functions after the last argument. On the other hand, a multi-line literal list/tuple/dict/set should be closed in a new line. Generators and list/dict comprehensions may use either style.
Always move trailing parentheses after the last argument.
#### Examples
Note that this *does not* apply to braces `}` or square brackets `]`, both of which should be closed in a new line.
#### Example
Correct:
```python
url = traverse_obj(info, (
'context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'), list)
```
Correct:
```python
url = traverse_obj(
info,
('context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'),
lambda x: x['ResultSet']['Result'][0]['VideoUrlSet']['VideoUrl'],
list)
```
Incorrect:
```python
url = traverse_obj(
info,
('context', 'dispatcher', 'stores', 'VideoTitlePageStore', 'data', 'video', 0, 'VideoUrlSet', 'VideoUrl'),
list
lambda x: x['ResultSet']['Result'][0]['VideoUrlSet']['VideoUrl'],
list,
)
```
Correct:
```python
f = {
'url': url,
'format_id': format_id,
}
```
Incorrect:
```python
f = {'url': url,
'format_id': format_id}
```
Correct:
```python
formats = [process_formats(f) for f in format_data
if f.get('type') in ('hls', 'dash', 'direct') and f.get('downloadable')]
```
Correct:
```python
formats = [
process_formats(f) for f in format_data
if f.get('type') in ('hls', 'dash', 'direct') and f.get('downloadable')
]
```
### Use convenience conversion and parsing functions
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils/`](yt_dlp/utils/): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils.py`](yt_dlp/utils.py): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
Use `url_or_none` for safe URL processing.
Use `traverse_obj` and `try_call` (which supersede `dict_get` and `try_get`) for safe metadata extraction from parsed JSON.
Use `try_get`, `dict_get` and `traverse_obj` for safe metadata extraction from parsed JSON.
Use `unified_strdate` for uniform `upload_date` or any `YYYYMMDD` meta field extraction, `unified_timestamp` for uniform `timestamp` extraction, `parse_filesize` for `filesize` extraction, `parse_count` for count meta fields extraction, `parse_resolution` for `resolution` extraction, `parse_duration` for `duration` extraction, `parse_age_limit` for `age_limit` extraction.
Explore [`yt_dlp/utils/`](yt_dlp/utils/) for more useful convenience functions.
Explore [`yt_dlp/utils.py`](yt_dlp/utils.py) for more useful convenience functions.
#### Examples
#### More examples
##### Safely extract optional description from parsed JSON
```python
description = traverse_obj(response, ('result', 'video', 'summary'), expected_type=str)
thumbnails = traverse_obj(response, ('result', 'thumbnails', ..., 'url'), expected_type=url_or_none)
```
##### Safely extract more optional metadata
```python
video = traverse_obj(response, ('result', 'video', 0), default={}, expected_type=dict)
description = video.get('summary')
duration = float_or_none(video.get('durationMs'), scale=1000)
view_count = int_or_none(video.get('views'))
```
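##### Safely parse dates, counts and durations (illustrative)
A further sketch, reusing `video` from the previous example, that applies the parsing helpers mentioned above; the field names (`publishedAt`, `viewCountText`, etc.) are hypothetical and all helpers come from `yt_dlp/utils/`:
```python
upload_date = unified_strdate(video.get('publishedAt'))      # e.g. '2021-12-25T12:00:00Z' -> '20211225'
timestamp = unified_timestamp(video.get('publishedAt'))
view_count = parse_count(video.get('viewCountText'))         # e.g. '1.2M' -> 1200000
duration = parse_duration(video.get('durationText'))         # e.g. '1:02:03' -> 3723.0
age_limit = parse_age_limit(video.get('rating'))             # e.g. '18+' -> 18
uploader = try_call(lambda: video['owner']['name'].strip())  # None instead of KeyError/TypeError
```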
# My pull request is labeled pending-fixes
The `pending-fixes` label is added when there are changes requested to a PR. When the necessary changes are made, the label should be removed. However, despite our best efforts, it may sometimes happen that the maintainer did not see the changes or forgot to remove the label. If your PR is still marked as `pending-fixes` a few days after all requested changes have been made, feel free to ping the maintainer who labeled your issue and ask them to re-review and remove the label.
# EMBEDDING YT-DLP


@ -2,8 +2,6 @@ pukkandan (owner)
shirt-dev (collaborator)
coletdjnz/colethedj (collaborator)
Ashish0804 (collaborator)
bashonly (collaborator)
Grub4K (collaborator)
h-h-h-h
pauldubois98
nixxo
@ -21,6 +19,7 @@ samiksome
alxnull
FelixFrog
Zocker1999NET
nao20010128nao
kurumigi
bbepis
animelover1984/horahoradev
@ -147,7 +146,7 @@ chio0hai
cntrl-s
Deer-Spangle
DEvmIb
Grabien/MaximVol
Grabien
j54vc1bk
mpeter50
mrpapersonic
@ -161,7 +160,7 @@ PilzAdam
zmousm
iw0nderhow
unit193
TwoThousandHedgehogs/KathrynElrod
TwoThousandHedgehogs
Jertzukka
cypheron
Hyeeji
@ -178,456 +177,3 @@ Sematre
jaller94
r5d
julien-hadleyjack
git-anony-mouse
mdawar
trassshhub
foghawk
k3ns1n
teridon
mozlima
timendum
ischmidt20
CreaValix
sian1468
arkamar
hyano
KiberInfinity
tejing1
Bricio
lazypete365
Aniruddh-J
blackgear
CplPwnies
cyberfox1691
FestplattenSchnitzel
hatienl0i261299
iphoting
jakeogh
lukasfink1
lyz-code
marieell
mdpauley
Mipsters
mxmehl
ofkz
P-reducible
pycabbage
regarten
Ronnnny
schn0sch
s0u1h
MrRawes
cffswb
danielyli
1-Byte
mehq
dzek69
aaearon
panatexxa
kmark
un-def
goggle
Soebb
Fam0r
bohwaz
dodrian
vvto33
ca-za
connercsbn
diegorodriguezv
ekangmonyet
elyse0
evansp
GiedriusS
HE7086
JordanWeatherby
m4tu4g
MarwenDallel
nevack
putnam
rand-net
vertan
Wikidepia
Yipten
moench-tegeder
christoph-heinrich
HobbyistDev
LunarFang416
sbor23
aurelg
adamanldo
gamer191
vkorablin
Burve
mnn
ZhymabekRoman
mozbugbox
aejdl
ping
sqrtNOT
bubbleguuum
darkxex
miseran
StefanLobbenmeier
crazymoose77756
nomevi
Brett824
pingiun
dosy4ev
EhtishamSabir
Ferdi265
FirefoxMetzger
ftk
lamby
llamasblade
lockmatrix
misaelaguayo
odo2063
pritam20ps05
scy
sheerluck
AxiosDeminence
DjesonPV
eren-kemer
freezboltz
Galiley
haobinliang
Mehavoid
winterbird-code
yashkc2025
aldoridhoni
jacobtruman
masta79
palewire
cgrigis
DavidH-2022
dfaker
jackyyf
ohaiibuzzle
SamantazFox
shreyasminocha
tejasa97
xenov
satan1st
0xGodspeed
5736d79
587021c
basrieter
Bobscorn
CNugteren
columndeeply
DoubleCouponDay
Fabi019
GautamMKGarg
itachi-19
jeroenj
josanabr
LiviaMedeiros
nikita-moor
snapdgn
SuperSonicHub1
tannertechnology
Timendum
tobi1805
TokyoBlackHole
ajayyy
Alienmaster
bsun0000
changren-wcr
ClosedPort22
CrankDatSouljaBoy
cruel-efficiency
endotronic
Generator
gibson042
How-Bout-No
invertico
jahway603
jwoglom
lksj
megapro17
mlampe
MrOctopus
nosoop
puc9
sashashura
schnusch
SG5
the-marenga
tkgmomosheep
vitkhab
glensc
synthpop123
tntmod54321
milkknife
Bnyro
CapacitorSet
stelcodes
skbeh
muddi900
digitall
chengzhicn
mexus
JChris246
redraskal
Spicadox
barsnick
docbender
KurtBestor
Chrissi2812
FrederikNS
gschizas
JC-Chung
mzhou
OndrejBakan
ab4cbef
aionescu
amra
ByteDream
carusocr
chexxor
felixonmars
FrankZ85
FriedrichRehren
gregsadetsky
LeoniePhiline
LowSuggestion912
Matumo
OIRNOIR
OMEGARAZER
oxamun
pmitchell86
qbnu
qulaz
rebane2001
road-master
rohieb
sdht0
seproDev
Hill-98
LXYan2333
mushbite
venkata-krishnas
7vlad7
alexklapheke
arobase-che
bepvte
bergoid
blmarket
brandon-dacrib
c-basalt
CoryTibbettsDev
Cyberes
D0LLYNH0
danog
DataGhost
falbrechtskirchinger
foreignBlade
garret1317
hasezoey
hoaluvn
ItzMaxTV
ivanskodje
jo-nike
kangalio
linsui
makew0rld
menschel
mikf
mrscrapy
NDagestad
Neurognostic
NextFire
nick-cd
permunkle
pzhlkj6612
ringus1
rjy
Schmoaaaaah
sjthespian
theperfectpunk
toomyzoom
truedread
TxI5
unbeatable-101
vampirefrog
vidiot720
viktor-enzell
zhgwn
barthelmannk
berkanteber
OverlordQ
rexlambert22
Ti4eeT4e
AmanSal1
bbilly1
meliber
nnoboa
rdamas
RfadnjdExt
urectanc
nao20010128nao/Lesmiscore
04-pasha-04
aaruni96
aky-01
AmirAflak
ApoorvShah111
at-wat
davinkevin
demon071
denhotte
FinnRG
fireattack
Frankgoji
GD-Slime
hatsomatt
ifan-t
kshitiz305
kylegustavo
mabdelfattah
nathantouze
niemands
Rajeshwaran2001
RedDeffender
Rohxn16
sb0stn
SevenLives
simon300000
snixon
soundchaser128
szabyg
trainman261
trislee
wader
Yalab7
zhallgato
zhong-yiyu
Zprokkel
AS6939
drzraf
handlerug
jiru
madewokherd
xofe
awalgarg
midnightveil
naginatana
Riteo
1100101
aniolpages
bartbroere
CrendKing
Esokrates
HitomaruKonpaku
LoserFox
peci1
saintliao
shubhexists
SirElderling
almx
elivinsky
starius
TravisDupes
amir16yp
Fymyte
Ganesh910
hashFactory
kclauhk
Kyraminol
lstrojny
middlingphys
NickCis
nicodato
prettykool
S-Aarab
sonmezberkay
TSRBerry
114514ns
agibson-fl
alard
alien-developers
antonkesy
ArnauvGilotra
Arthurszzz
Bibhav48
Bl4Cc4t
boredzo
Caesim404
chkuendig
chtk
Danish-H
dasidiot
diman8
divStar
DmitryScaletta
feederbox826
gmes78
gonzalezjo
hui1601
infanf
jazz1611
jingtra
jkmartindale
johnvictorfs
llistochek
marcdumais
martinxyz
michal-repo
mrmedieval
nbr23
Nicals
Noor-5
NurTasin
pompos02
Pranaxcau
pwaldhauer
RaduManole
RalphORama
rrgomes
ruiminggu
rvsit
sefidel
shmohawk
Snack-X
src-tinkerer
stilor
syntaxsurge
t-nil
ufukk
vista-narvas
x11x
xpadev-net
Xpl0itU
YoshichikaAAA
zhijinwuu
alb
hruzgar
kasper93
leoheitmannruiz
luiso1979
nipotan
Offert4324
sta1us
Tomoka1
trwstin
alexhuot1
clienthax
DaPotato69
emqi
hugohaa
imanoreotwe
JakeFinley96
lostfictions
minamotorin
ocococococ
Podiumnoche
RasmusAntons
roeniss
shoxie007
Szpachlarz
The-MAGI
TuxCoder
voidful
vtexier
WyohKnott
trueauracoral

File diff suppressed because it is too large.

@ -8,7 +8,6 @@ # Collaborators
## [pukkandan](https://github.com/pukkandan)
[![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/pukkandan)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/pukkandan)
* Owner of the fork
@ -26,38 +25,16 @@ ## [shirt](https://github.com/shirt-dev)
## [coletdjnz](https://github.com/coletdjnz)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
[![gh-sponsor](https://img.shields.io/badge/_-Sponsor-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
* Improved plugin architecture
* Rewrote the networking infrastructure, implemented support for `requests`
* YouTube improvements including: age-gate bypass, private playlists, multiple-clients (to avoid throttling) and a lot of under-the-hood improvements
* Added support for new websites YoutubeWebArchive, MainStreaming, PRX, nzherald, Mediaklikk, StarTV etc
* Improved/fixed support for Patreon, panopto, gfycat, itv, pbs, SouthParkDE etc
* Added support for downloading YoutubeWebArchive videos
## [Ashish0804](https://github.com/Ashish0804) <sub><sup>[Inactive]</sup></sub>
## [Ashish0804](https://github.com/Ashish0804)
[![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/ashish0804)
* Added support for new websites BiliIntl, DiscoveryPlusIndia, OlympicsReplay, PlanetMarathi, ShemarooMe, Utreon, Zee5 etc
* Added playlist/series downloads for Hotstar, ParamountPlus, Rumble, SonyLIV, Trovo, TubiTv, Voot etc
* Improved/fixed support for HiDive, HotStar, Hungama, LBRY, LinkedInLearning, Mxplayer, SonyLiv, TV2, Vimeo, VLive etc
## [bashonly](https://github.com/bashonly)
* `--update-to`, self-updater rewrite, automated/nightly/master releases
* `--cookies-from-browser` support for Firefox containers, external downloader cookie handling overhaul
* Added support for new websites like Dacast, Kick, NBCStations, Triller, VideoKen, Weverse, WrestleUniverse etc
* Improved/fixed support for Anvato, Brightcove, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
## [Grub4K](https://github.com/Grub4K)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/Grub4K) [![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/Grub4K)
* `--update-to`, self-updater rewrite, automated/nightly/master releases
* Reworked internals like `traverse_obj`, various core refactors and bugs fixes
* Implemented proper progress reporting for parallel downloads
* Improved/fixed/added Bundestag, crunchyroll, pr0gramm, Twitter, WrestleUniverse etc
* Added support for new websites Zee5, MXPlayer, DiscoveryPlusIndia, ShemarooMe, Utreon etc
* Added playlist/series downloads for TubiTv, SonyLIV, Voot, HotStar etc

MANIFEST.in (new file, 9 lines)

@ -0,0 +1,9 @@
include AUTHORS
include Changelog.md
include LICENSE
include README.md
include completions/*/*
include supportedsites.md
include yt-dlp.1
recursive-include devscripts *
recursive-include test *

Makefile (128 changed lines)

@ -1,34 +1,27 @@
all: lazy-extractors yt-dlp doc pypi-files
clean: clean-test clean-dist
clean-all: clean clean-cache
clean: clean-test clean-dist clean-cache
completions: completion-bash completion-fish completion-zsh
doc: README.md CONTRIBUTING.md CONTRIBUTORS issuetemplates supportedsites
doc: README.md CONTRIBUTING.md issuetemplates supportedsites
ot: offlinetest
tar: yt-dlp.tar.gz
# Keep this list in sync with pyproject.toml includes/artifacts
# Keep this list in sync with MANIFEST.in
# intended use: when building a source distribution,
# make pypi-files && python3 -m build -sn .
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
completions yt-dlp.1 pyproject.toml setup.cfg devscripts/* test/*
# make pypi-files && python setup.py sdist
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites completions yt-dlp.1 devscripts/* test/*
.PHONY: all clean clean-all clean-test clean-dist clean-cache \
completions completion-bash completion-fish completion-zsh \
doc issuetemplates supportedsites ot offlinetest codetest test \
tar pypi-files lazy-extractors install uninstall
.PHONY: all clean install test tar pypi-files completions ot offlinetest codetest supportedsites
clean-test:
rm -rf test/testdata/sigs/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
*.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.meta *.part* *.tmp *.temp *.unknown_video *.ytdl \
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.lrc *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 *.mp4 \
*.mpg *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.ssa *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
rm -rf test/testdata/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
*.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.part* *.unknown_video *.ytdl \
*.3gp *.ape *.avi *.desktop *.flac *.flv *.jpeg *.jpg *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 \
*.mp4 *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
clean-dist:
rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS .mailmap
clean-cache:
find . \( \
-type d -name ".*_cache" -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \
\) -prune -exec rm -rf {} \;
find . -name "*.pyc" -o -name "*.class" -delete
completion-bash: completions/bash/yt-dlp
completion-fish: completions/fish/yt-dlp.fish
@ -36,75 +29,54 @@ completion-zsh: completions/zsh/_yt-dlp
lazy-extractors: yt_dlp/extractor/lazy_extractors.py
PREFIX ?= /usr/local
DESTDIR ?= .
BINDIR ?= $(PREFIX)/bin
MANDIR ?= $(PREFIX)/man
SHAREDIR ?= $(PREFIX)/share
PYTHON ?= /usr/bin/env python3
GNUTAR ?= tar
# set markdown input format to "markdown-smart" for pandoc version 2+ and to "markdown" for pandoc prior to version 2
PANDOC_VERSION_CMD = pandoc -v 2>/dev/null | head -n1 | cut -d' ' -f2 | head -c1
PANDOC_VERSION != $(PANDOC_VERSION_CMD)
PANDOC_VERSION ?= $(shell $(PANDOC_VERSION_CMD))
MARKDOWN_CMD = if [ "$(PANDOC_VERSION)" = "1" -o "$(PANDOC_VERSION)" = "0" ]; then echo markdown; else echo markdown-smart; fi
MARKDOWN != $(MARKDOWN_CMD)
MARKDOWN ?= $(shell $(MARKDOWN_CMD))
# set SYSCONFDIR to /etc if PREFIX=/usr or PREFIX=/usr/local
SYSCONFDIR = $(shell if [ $(PREFIX) = /usr -o $(PREFIX) = /usr/local ]; then echo /etc; else echo $(PREFIX)/etc; fi)
# set markdown input format to "markdown-smart" for pandoc version 2 and to "markdown" for pandoc prior to version 2
MARKDOWN = $(shell if [ `pandoc -v | head -n1 | cut -d" " -f2 | head -c1` = "2" ]; then echo markdown-smart; else echo markdown; fi)
install: lazy-extractors yt-dlp yt-dlp.1 completions
mkdir -p $(DESTDIR)$(BINDIR)
install -m755 yt-dlp $(DESTDIR)$(BINDIR)/yt-dlp
mkdir -p $(DESTDIR)$(MANDIR)/man1
install -m644 yt-dlp.1 $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
mkdir -p $(DESTDIR)$(SHAREDIR)/bash-completion/completions
install -m644 completions/bash/yt-dlp $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
mkdir -p $(DESTDIR)$(SHAREDIR)/zsh/site-functions
install -m644 completions/zsh/_yt-dlp $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
mkdir -p $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d
install -m644 completions/fish/yt-dlp.fish $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
uninstall:
rm -f $(DESTDIR)$(BINDIR)/yt-dlp
rm -f $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
rm -f $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
rm -f $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
rm -f $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
install -Dm755 yt-dlp $(DESTDIR)$(BINDIR)/yt-dlp
install -Dm644 yt-dlp.1 $(DESTDIR)$(MANDIR)/man1/yt-dlp.1
install -Dm644 completions/bash/yt-dlp $(DESTDIR)$(SHAREDIR)/bash-completion/completions/yt-dlp
install -Dm644 completions/zsh/_yt-dlp $(DESTDIR)$(SHAREDIR)/zsh/site-functions/_yt-dlp
install -Dm644 completions/fish/yt-dlp.fish $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
codetest:
ruff check .
autopep8 --diff .
flake8 .
test:
$(PYTHON) -m pytest -Werror
$(PYTHON) -m pytest
$(MAKE) codetest
offlinetest: codetest
$(PYTHON) -m pytest -Werror -m "not download"
$(PYTHON) -m pytest -k "not download"
CODE_FOLDERS_CMD = find yt_dlp -type f -name '__init__.py' | sed 's,/__init__.py,,' | grep -v '/__' | sort
CODE_FOLDERS != $(CODE_FOLDERS_CMD)
CODE_FOLDERS ?= $(shell $(CODE_FOLDERS_CMD))
CODE_FILES_CMD = for f in $(CODE_FOLDERS) ; do echo "$$f" | sed 's,$$,/*.py,' ; done
CODE_FILES != $(CODE_FILES_CMD)
CODE_FILES ?= $(shell $(CODE_FILES_CMD))
yt-dlp: $(CODE_FILES)
yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
mkdir -p zip
for d in $(CODE_FOLDERS) ; do \
for d in yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor ; do \
mkdir -p zip/$$d ;\
cp -pPR $$d/*.py zip/$$d/ ;\
done
(cd zip && touch -t 200001010101 $(CODE_FILES))
touch -t 200001010101 zip/yt_dlp/*.py zip/yt_dlp/*/*.py
mv zip/yt_dlp/__main__.py zip/
(cd zip && zip -q ../yt-dlp $(CODE_FILES) __main__.py)
cd zip ; zip -q ../yt-dlp yt_dlp/*.py yt_dlp/*/*.py __main__.py
rm -rf zip
echo '#!$(PYTHON)' > yt-dlp
cat yt-dlp.zip >> yt-dlp
rm yt-dlp.zip
chmod a+x yt-dlp
README.md: $(CODE_FILES) devscripts/make_readme.py
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --ignore-config --help | $(PYTHON) devscripts/make_readme.py
README.md: yt_dlp/*.py yt_dlp/*/*.py
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --help | $(PYTHON) devscripts/make_readme.py
CONTRIBUTING.md: README.md devscripts/make_contributing.py
CONTRIBUTING.md: README.md
$(PYTHON) devscripts/make_contributing.py README.md CONTRIBUTING.md
issuetemplates: devscripts/make_issue_template.py .github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml .github/ISSUE_TEMPLATE_tmpl/2_site_support_request.yml .github/ISSUE_TEMPLATE_tmpl/3_site_feature_request.yml .github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml .github/ISSUE_TEMPLATE_tmpl/5_feature_request.yml yt_dlp/version.py
@ -121,53 +93,45 @@ supportedsites:
README.txt: README.md
pandoc -f $(MARKDOWN) -t plain README.md -o README.txt
yt-dlp.1: README.md devscripts/prepare_manpage.py
yt-dlp.1: README.md
$(PYTHON) devscripts/prepare_manpage.py yt-dlp.1.temp.md
pandoc -s -f $(MARKDOWN) -t man yt-dlp.1.temp.md -o yt-dlp.1
rm -f yt-dlp.1.temp.md
completions/bash/yt-dlp: $(CODE_FILES) devscripts/bash-completion.in
completions/bash/yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/bash-completion.in
mkdir -p completions/bash
$(PYTHON) devscripts/bash-completion.py
completions/zsh/_yt-dlp: $(CODE_FILES) devscripts/zsh-completion.in
completions/zsh/_yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/zsh-completion.in
mkdir -p completions/zsh
$(PYTHON) devscripts/zsh-completion.py
completions/fish/yt-dlp.fish: $(CODE_FILES) devscripts/fish-completion.in
completions/fish/yt-dlp.fish: yt_dlp/*.py yt_dlp/*/*.py devscripts/fish-completion.in
mkdir -p completions/fish
$(PYTHON) devscripts/fish-completion.py
_EXTRACTOR_FILES_CMD = find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py'
_EXTRACTOR_FILES != $(_EXTRACTOR_FILES_CMD)
_EXTRACTOR_FILES ?= $(shell $(_EXTRACTOR_FILES_CMD))
_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -iname '*.py' -and -not -iname 'lazy_extractors.py')
yt_dlp/extractor/lazy_extractors.py: devscripts/make_lazy_extractors.py devscripts/lazy_load_template.py $(_EXTRACTOR_FILES)
$(PYTHON) devscripts/make_lazy_extractors.py $@
yt-dlp.tar.gz: all
@$(GNUTAR) -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
@tar -czf $(DESTDIR)/yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
--exclude '*.DS_Store' \
--exclude '*.kate-swp' \
--exclude '*.pyc' \
--exclude '*.pyo' \
--exclude '*~' \
--exclude '__pycache__' \
--exclude '.*_cache' \
--exclude '.git' \
-- \
README.md supportedsites.md Changelog.md LICENSE \
CONTRIBUTING.md Collaborators.md CONTRIBUTORS AUTHORS \
Makefile yt-dlp.1 README.txt completions .gitignore \
setup.cfg yt-dlp yt_dlp pyproject.toml devscripts test
Makefile MANIFEST.in yt-dlp.1 README.txt completions \
setup.py setup.cfg yt-dlp yt_dlp requirements.txt \
devscripts test tox.ini pytest.ini
AUTHORS: Changelog.md
@if [ -d '.git' ] && command -v git > /dev/null ; then \
echo 'Generating $@ from git commit history' ; \
git shortlog -s -n HEAD | cut -f2 | sort > $@ ; \
fi
AUTHORS: .mailmap
git shortlog -s -n | cut -f2 | sort > AUTHORS
CONTRIBUTORS: Changelog.md
@if [ -d '.git' ] && command -v git > /dev/null ; then \
echo 'Updating $@ from git commit history' ; \
$(PYTHON) devscripts/make_changelog.py -v -c > /dev/null ; \
fi
.mailmap:
git shortlog -s -e -n | awk '!(out[$$NF]++) { $$1="";sub(/^[ \t]+/,""); print}' > .mailmap

README.md (2601 changed lines): file diff suppressed because it is too large.

@ -1,10 +0,0 @@
services:
static:
build: static
environment:
channel: ${channel}
origin: ${origin}
version: ${version}
volumes:
- ~/build:/build
- ../..:/yt-dlp


@ -1,21 +0,0 @@
FROM alpine:3.19 as base
RUN apk --update add --no-cache \
build-base \
python3 \
pipx \
;
RUN pipx install pyinstaller
# Requires above step to prepare the shared venv
RUN ~/.local/share/pipx/shared/bin/python -m pip install -U wheel
RUN apk --update add --no-cache \
scons \
patchelf \
binutils \
;
RUN pipx install staticx
WORKDIR /yt-dlp
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT /entrypoint.sh


@ -1,13 +0,0 @@
#!/bin/ash
set -e
source ~/.local/share/pipx/venvs/pyinstaller/bin/activate
python -m devscripts.install_deps --include secretstorage
python -m devscripts.make_lazy_extractors
python devscripts/update-version.py -c "${channel}" -r "${origin}" "${version}"
python -m bundle.pyinstaller
deactivate
source ~/.local/share/pipx/venvs/staticx/bin/activate
staticx /yt-dlp/dist/yt-dlp_linux /build/yt-dlp_linux
deactivate


@ -1,59 +0,0 @@
#!/usr/bin/env python3
# Allow execution from anywhere
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import warnings
from py2exe import freeze
from devscripts.utils import read_version
VERSION = read_version()
def main():
warnings.warn(
'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
'It is recommended to run "pyinst.py" to build using pyinstaller instead')
freeze(
console=[{
'script': './yt_dlp/__main__.py',
'dest_base': 'yt-dlp',
'icon_resources': [(1, 'devscripts/logo.ico')],
}],
version_info={
'version': VERSION,
'description': 'A feature-rich command-line audio/video downloader',
'comments': 'Official repository: <https://github.com/yt-dlp/yt-dlp>',
'product_name': 'yt-dlp',
'product_version': VERSION,
},
options={
'bundle_files': 0,
'compressed': 1,
'optimize': 2,
'dist_dir': './dist',
'excludes': [
# py2exe cannot import Crypto
'Crypto',
'Cryptodome',
# requests >=2.32.0 breaks py2exe builds due to certifi dependency
'requests',
'urllib3',
],
'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
# Modules that are only imported dynamically must be added here
'includes': ['yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated',
'yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated'],
},
zipfile=None,
)
if __name__ == '__main__':
main()

Binary file not shown.

Binary file not shown.


@ -1,16 +1,15 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
# Allow direct execution
import os
from os.path import dirname as dirn
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
import yt_dlp
BASH_COMPLETION_FILE = 'completions/bash/yt-dlp'
BASH_COMPLETION_TEMPLATE = 'devscripts/bash-completion.in'
BASH_COMPLETION_FILE = "completions/bash/yt-dlp"
BASH_COMPLETION_TEMPLATE = "devscripts/bash-completion.in"
def build_completion(opt_parser):
@ -21,11 +20,11 @@ def build_completion(opt_parser):
opts_flag.append(option.get_opt_string())
with open(BASH_COMPLETION_TEMPLATE) as f:
template = f.read()
with open(BASH_COMPLETION_FILE, 'w') as f:
with open(BASH_COMPLETION_FILE, "w") as f:
# just using the special char
filled_template = template.replace('{{flags}}', ' '.join(opts_flag))
filled_template = template.replace("{{flags}}", " ".join(opts_flag))
f.write(filled_template)
parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
parser = yt_dlp.parseOpts()[0]
build_completion(parser)

devscripts/buildserver.py (new file, 435 lines)

@ -0,0 +1,435 @@
# UNUSED
#!/usr/bin/python3
import argparse
import ctypes
import functools
import shutil
import subprocess
import sys
import tempfile
import threading
import traceback
import os.path
sys.path.insert(0, os.path.dirname(os.path.dirname((os.path.abspath(__file__)))))
from yt_dlp.compat import (
compat_input,
compat_http_server,
compat_str,
compat_urlparse,
)
# These are not used outside of buildserver.py thus not in compat.py
try:
import winreg as compat_winreg
except ImportError: # Python 2
import _winreg as compat_winreg
try:
import socketserver as compat_socketserver
except ImportError: # Python 2
import SocketServer as compat_socketserver
class BuildHTTPServer(compat_socketserver.ThreadingMixIn, compat_http_server.HTTPServer):
allow_reuse_address = True
advapi32 = ctypes.windll.advapi32
SC_MANAGER_ALL_ACCESS = 0xf003f
SC_MANAGER_CREATE_SERVICE = 0x02
SERVICE_WIN32_OWN_PROCESS = 0x10
SERVICE_AUTO_START = 0x2
SERVICE_ERROR_NORMAL = 0x1
DELETE = 0x00010000
SERVICE_STATUS_START_PENDING = 0x00000002
SERVICE_STATUS_RUNNING = 0x00000004
SERVICE_ACCEPT_STOP = 0x1
SVCNAME = 'youtubedl_builder'
LPTSTR = ctypes.c_wchar_p
START_CALLBACK = ctypes.WINFUNCTYPE(None, ctypes.c_int, ctypes.POINTER(LPTSTR))
class SERVICE_TABLE_ENTRY(ctypes.Structure):
_fields_ = [
('lpServiceName', LPTSTR),
('lpServiceProc', START_CALLBACK)
]
HandlerEx = ctypes.WINFUNCTYPE(
ctypes.c_int, # return
ctypes.c_int, # dwControl
ctypes.c_int, # dwEventType
ctypes.c_void_p, # lpEventData,
ctypes.c_void_p, # lpContext,
)
def _ctypes_array(c_type, py_array):
ar = (c_type * len(py_array))()
ar[:] = py_array
return ar
def win_OpenSCManager():
res = advapi32.OpenSCManagerW(None, None, SC_MANAGER_ALL_ACCESS)
if not res:
raise Exception('Opening service manager failed - '
'are you running this as administrator?')
return res
def win_install_service(service_name, cmdline):
manager = win_OpenSCManager()
try:
h = advapi32.CreateServiceW(
manager, service_name, None,
SC_MANAGER_CREATE_SERVICE, SERVICE_WIN32_OWN_PROCESS,
SERVICE_AUTO_START, SERVICE_ERROR_NORMAL,
cmdline, None, None, None, None, None)
if not h:
raise OSError('Service creation failed: %s' % ctypes.FormatError())
advapi32.CloseServiceHandle(h)
finally:
advapi32.CloseServiceHandle(manager)
def win_uninstall_service(service_name):
manager = win_OpenSCManager()
try:
h = advapi32.OpenServiceW(manager, service_name, DELETE)
if not h:
raise OSError('Could not find service %s: %s' % (
service_name, ctypes.FormatError()))
try:
if not advapi32.DeleteService(h):
raise OSError('Deletion failed: %s' % ctypes.FormatError())
finally:
advapi32.CloseServiceHandle(h)
finally:
advapi32.CloseServiceHandle(manager)
def win_service_report_event(service_name, msg, is_error=True):
with open('C:/sshkeys/log', 'a', encoding='utf-8') as f:
f.write(msg + '\n')
event_log = advapi32.RegisterEventSourceW(None, service_name)
if not event_log:
raise OSError('Could not report event: %s' % ctypes.FormatError())
try:
type_id = 0x0001 if is_error else 0x0004
event_id = 0xc0000000 if is_error else 0x40000000
lines = _ctypes_array(LPTSTR, [msg])
if not advapi32.ReportEventW(
event_log, type_id, 0, event_id, None, len(lines), 0,
lines, None):
raise OSError('Event reporting failed: %s' % ctypes.FormatError())
finally:
advapi32.DeregisterEventSource(event_log)
def win_service_handler(stop_event, *args):
try:
raise ValueError('Handler called with args ' + repr(args))
# TODO: handle the control request (e.g. SERVICE_CONTROL_STOP) and set stop_event
except Exception as e:
tb = traceback.format_exc()
msg = str(e) + '\n' + tb
win_service_report_event(service_name, msg, is_error=True)
raise
def win_service_set_status(handle, status_code):
svcStatus = SERVICE_STATUS()
svcStatus.dwServiceType = SERVICE_WIN32_OWN_PROCESS
svcStatus.dwCurrentState = status_code
svcStatus.dwControlsAccepted = SERVICE_ACCEPT_STOP
svcStatus.dwServiceSpecificExitCode = 0
if not advapi32.SetServiceStatus(handle, ctypes.byref(svcStatus)):
raise OSError('SetServiceStatus failed: %r' % ctypes.FormatError())
def win_service_main(service_name, real_main, argc, argv_raw):
try:
# args = [argv_raw[i].value for i in range(argc)]
stop_event = threading.Event()
handler = HandlerEx(functools.partial(win_service_handler, stop_event))
h = advapi32.RegisterServiceCtrlHandlerExW(service_name, handler, None)
if not h:
raise OSError('Handler registration failed: %s' %
ctypes.FormatError())
# TODO: call real_main() and wait on stop_event before reporting SERVICE_STOPPED
except Exception as e:
tb = traceback.format_exc()
msg = str(e) + '\n' + tb
win_service_report_event(service_name, msg, is_error=True)
raise
def win_service_start(service_name, real_main):
try:
cb = START_CALLBACK(
functools.partial(win_service_main, service_name, real_main))
dispatch_table = _ctypes_array(SERVICE_TABLE_ENTRY, [
SERVICE_TABLE_ENTRY(
service_name,
cb
),
SERVICE_TABLE_ENTRY(None, ctypes.cast(None, START_CALLBACK))
])
if not advapi32.StartServiceCtrlDispatcherW(dispatch_table):
raise OSError('ctypes start failed: %s' % ctypes.FormatError())
except Exception as e:
tb = traceback.format_exc()
msg = str(e) + '\n' + tb
win_service_report_event(service_name, msg, is_error=True)
raise
def main(args=None):
parser = argparse.ArgumentParser()
parser.add_argument('-i', '--install',
action='store_const', dest='action', const='install',
help='Launch at Windows startup')
parser.add_argument('-u', '--uninstall',
action='store_const', dest='action', const='uninstall',
help='Remove Windows service')
parser.add_argument('-s', '--service',
action='store_const', dest='action', const='service',
help='Run as a Windows service')
parser.add_argument('-b', '--bind', metavar='<host:port>',
action='store', default='0.0.0.0:8142',
help='Bind to host:port (default %default)')
options = parser.parse_args(args=args)
if options.action == 'install':
fn = os.path.abspath(__file__).replace('v:', '\\\\vboxsrv\\vbox')
cmdline = '%s %s -s -b %s' % (sys.executable, fn, options.bind)
win_install_service(SVCNAME, cmdline)
return
if options.action == 'uninstall':
win_uninstall_service(SVCNAME)
return
if options.action == 'service':
win_service_start(SVCNAME, main)
return
host, port_str = options.bind.split(':')
port = int(port_str)
print('Listening on %s:%d' % (host, port))
srv = BuildHTTPServer((host, port), BuildHTTPRequestHandler)
thr = threading.Thread(target=srv.serve_forever)
thr.start()
compat_input('Press ENTER to shut down')
srv.shutdown()
thr.join()
def rmtree(path):
for name in os.listdir(path):
fname = os.path.join(path, name)
if os.path.isdir(fname):
rmtree(fname)
else:
os.chmod(fname, 0o666)
os.remove(fname)
os.rmdir(path)
class BuildError(Exception):
def __init__(self, output, code=500):
self.output = output
self.code = code
def __str__(self):
return self.output
class HTTPError(BuildError):
pass
class PythonBuilder(object):
def __init__(self, **kwargs):
python_version = kwargs.pop('python', '3.4')
python_path = None
for node in ('Wow6432Node\\', ''):
try:
key = compat_winreg.OpenKey(
compat_winreg.HKEY_LOCAL_MACHINE,
r'SOFTWARE\%sPython\PythonCore\%s\InstallPath' % (node, python_version))
try:
python_path, _ = compat_winreg.QueryValueEx(key, '')
finally:
compat_winreg.CloseKey(key)
break
except Exception:
pass
if not python_path:
raise BuildError('No such Python version: %s' % python_version)
self.pythonPath = python_path
super(PythonBuilder, self).__init__(**kwargs)
class GITInfoBuilder(object):
def __init__(self, **kwargs):
try:
self.user, self.repoName = kwargs['path'][:2]
self.rev = kwargs.pop('rev')
except ValueError:
raise BuildError('Invalid path')
except KeyError as e:
raise BuildError('Missing mandatory parameter "%s"' % e.args[0])
path = os.path.join(os.environ['APPDATA'], 'Build archive', self.repoName, self.user)
if not os.path.exists(path):
os.makedirs(path)
self.basePath = tempfile.mkdtemp(dir=path)
self.buildPath = os.path.join(self.basePath, 'build')
super(GITInfoBuilder, self).__init__(**kwargs)
class GITBuilder(GITInfoBuilder):
def build(self):
try:
subprocess.check_output(['git', 'clone', 'git://github.com/%s/%s.git' % (self.user, self.repoName), self.buildPath])
subprocess.check_output(['git', 'checkout', self.rev], cwd=self.buildPath)
except subprocess.CalledProcessError as e:
raise BuildError(e.output)
super(GITBuilder, self).build()
class YoutubeDLBuilder(object):
authorizedUsers = ['fraca7', 'phihag', 'rg3', 'FiloSottile', 'ytdl-org']
def __init__(self, **kwargs):
if self.repoName != 'yt-dlp':
raise BuildError('Invalid repository "%s"' % self.repoName)
if self.user not in self.authorizedUsers:
raise HTTPError('Unauthorized user "%s"' % self.user, 401)
super(YoutubeDLBuilder, self).__init__(**kwargs)
def build(self):
try:
proc = subprocess.Popen([os.path.join(self.pythonPath, 'python.exe'), 'setup.py', 'py2exe'], stdin=subprocess.PIPE, cwd=self.buildPath)
proc.wait()
#subprocess.check_output([os.path.join(self.pythonPath, 'python.exe'), 'setup.py', 'py2exe'],
# cwd=self.buildPath)
except subprocess.CalledProcessError as e:
raise BuildError(e.output)
super(YoutubeDLBuilder, self).build()
class DownloadBuilder(object):
def __init__(self, **kwargs):
self.handler = kwargs.pop('handler')
self.srcPath = os.path.join(self.buildPath, *tuple(kwargs['path'][2:]))
self.srcPath = os.path.abspath(os.path.normpath(self.srcPath))
if not self.srcPath.startswith(self.buildPath):
raise HTTPError(self.srcPath, 401)
super(DownloadBuilder, self).__init__(**kwargs)
def build(self):
if not os.path.exists(self.srcPath):
raise HTTPError('No such file', 404)
if os.path.isdir(self.srcPath):
raise HTTPError('Is a directory: %s' % self.srcPath, 401)
self.handler.send_response(200)
self.handler.send_header('Content-Type', 'application/octet-stream')
self.handler.send_header('Content-Disposition', 'attachment; filename=%s' % os.path.split(self.srcPath)[-1])
self.handler.send_header('Content-Length', str(os.stat(self.srcPath).st_size))
self.handler.end_headers()
with open(self.srcPath, 'rb') as src:
shutil.copyfileobj(src, self.handler.wfile)
super(DownloadBuilder, self).build()
class CleanupTempDir(object):
def build(self):
try:
rmtree(self.basePath)
except Exception as e:
print('WARNING deleting "%s": %s' % (self.basePath, e))
super(CleanupTempDir, self).build()
class Null(object):
def __init__(self, **kwargs):
pass
def start(self):
pass
def close(self):
pass
def build(self):
pass
class Builder(PythonBuilder, GITBuilder, YoutubeDLBuilder, DownloadBuilder, CleanupTempDir, Null):
pass
class BuildHTTPRequestHandler(compat_http_server.BaseHTTPRequestHandler):
actionDict = {'build': Builder, 'download': Builder} # They're the same, no more caching.
def do_GET(self):
path = compat_urlparse.urlparse(self.path)
paramDict = dict([(key, value[0]) for key, value in compat_urlparse.parse_qs(path.query).items()])
action, _, path = path.path.strip('/').partition('/')
if path:
path = path.split('/')
if action in self.actionDict:
try:
builder = self.actionDict[action](path=path, handler=self, **paramDict)
builder.start()
try:
builder.build()
finally:
builder.close()
except BuildError as e:
self.send_response(e.code)
msg = compat_str(e).encode('UTF-8')
self.send_header('Content-Type', 'text/plain; charset=UTF-8')
self.send_header('Content-Length', len(msg))
self.end_headers()
self.wfile.write(msg)
else:
self.send_response(500, 'Unknown build method "%s"' % action)
else:
self.send_response(500, 'Malformed URL')
if __name__ == '__main__':
main()
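
For reference, the do_GET handler above maps URLs of the form /<action>/<user>/<repo>/<source path...> onto the builder chain and streams the requested artifact back; query parameters become builder kwargs (GITInfoBuilder pops 'rev', PythonBuilder pops 'python'). A minimal client sketch; the host, revision and file name below are placeholders, not values from the diff:

#!/usr/bin/env python3
# Minimal client sketch for the build server above; host/rev/filename are placeholders.
import shutil
import urllib.request

BASE = 'http://localhost:8142'  # matches the script's default --bind of 0.0.0.0:8142
# Path layout mirrors BuildHTTPRequestHandler.do_GET: /<action>/<user>/<repo>/<source path...>
url = BASE + '/download/ytdl-org/yt-dlp/youtube-dl.exe?rev=master&python=3.4'

with urllib.request.urlopen(url) as resp, open('youtube-dl.exe', 'wb') as out:
    shutil.copyfileobj(resp, out)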

View File

@ -1,173 +0,0 @@
[
{
"action": "add",
"when": "29cb20bd563c02671b31dd840139e93dd37150a1",
"short": "[priority] **A new release type has been added!**\n * [`nightly`](https://github.com/yt-dlp/yt-dlp/releases/tag/nightly) builds will be made after each push, containing the latest fixes (but also possibly bugs).\n * When using `--update`/`-U`, a release binary will only update to its current channel (either `stable` or `nightly`).\n * The `--update-to` option has been added allowing the user more control over program upgrades (or downgrades).\n * `--update-to` can change the release channel (`stable`, `nightly`) and also upgrade or downgrade to specific tags.\n * **Usage**: `--update-to CHANNEL`, `--update-to TAG`, `--update-to CHANNEL@TAG`"
},
{
"action": "add",
"when": "5038f6d713303e0967d002216e7a88652401c22a",
"short": "[priority] **YouTube throttling fixes!**"
},
{
"action": "remove",
"when": "2e023649ea4e11151545a34dc1360c114981a236"
},
{
"action": "add",
"when": "01aba2519a0884ef17d5f85608dbd2a455577147",
"short": "[priority] YouTube: Improved throttling and signature fixes"
},
{
"action": "change",
"when": "c86e433c35fe5da6cb29f3539eef97497f84ed38",
"short": "[extractor/niconico:series] Fix extraction (#6898)",
"authors": ["sqrtNOT"]
},
{
"action": "change",
"when": "69a40e4a7f6caa5662527ebd2f3c4e8aa02857a2",
"short": "[extractor/youtube:music_search_url] Extract title (#7102)",
"authors": ["kangalio"]
},
{
"action": "change",
"when": "8417f26b8a819cd7ffcd4e000ca3e45033e670fb",
"short": "Add option `--color` (#6904)",
"authors": ["Grub4K"]
},
{
"action": "change",
"when": "b4e0d75848e9447cee2cd3646ce54d4744a7ff56",
"short": "Improve `--download-sections`\n - Support negative time-ranges\n - Add `*from-url` to obey time-ranges in URL",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "1e75d97db21152acc764b30a688e516f04b8a142",
"short": "[extractor/youtube] Add `ios` to default clients used\n - IOS is affected neither by 403 nor by nsig so helps mitigate them preemptively\n - IOS also has higher bit-rate 'premium' formats though they are not labeled as such",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "f2ff0f6f1914b82d4a51681a72cc0828115dcb4a",
"short": "[extractor/motherless] Add gallery support, fix groups (#7211)",
"authors": ["rexlambert22", "Ti4eeT4e"]
},
{
"action": "change",
"when": "a4486bfc1dc7057efca9dd3fe70d7fa25c56f700",
"short": "[misc] Revert \"Add automatic duplicate issue detection\"",
"authors": ["pukkandan"]
},
{
"action": "add",
"when": "1ceb657bdd254ad961489e5060f2ccc7d556b729",
"short": "[priority] Security: [[CVE-2023-35934](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-35934)] Fix [Cookie leak](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-v8mc-9377-rwjj)\n - `--add-header Cookie:` is deprecated and auto-scoped to input URL domains\n - Cookies are scoped when passed to external downloaders\n - Add `cookies` field to info.json and deprecate `http_headers.Cookie`"
},
{
"action": "change",
"when": "b03fa7834579a01cc5fba48c0e73488a16683d48",
"short": "[ie/twitter] Revert 92315c03774cfabb3a921884326beb4b981f786b",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "fcd6a76adc49d5cd8783985c7ce35384b72e545f",
"short": "[test] Add tests for socks proxies (#7908)",
"authors": ["coletdjnz"]
},
{
"action": "change",
"when": "4bf912282a34b58b6b35d8f7e6be535770c89c76",
"short": "[rh:urllib] Remove dot segments during URL normalization (#7662)",
"authors": ["coletdjnz"]
},
{
"action": "change",
"when": "59e92b1f1833440bb2190f847eb735cf0f90bc85",
"short": "[rh:urllib] Simplify gzip decoding (#7611)",
"authors": ["Grub4K"]
},
{
"action": "add",
"when": "c1d71d0d9f41db5e4306c86af232f5f6220a130b",
"short": "[priority] **The minimum *recommended* Python version has been raised to 3.8**\nSince Python 3.7 has reached end-of-life, support for it will be dropped soon. [Read more](https://github.com/yt-dlp/yt-dlp/issues/7803)"
},
{
"action": "add",
"when": "61bdf15fc7400601c3da1aa7a43917310a5bf391",
"short": "[priority] Security: [[CVE-2023-40581](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40581)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-42h4-v29r-42qg)\n - The shell escape function is now using `\"\"` instead of `\\\"`.\n - `utils.Popen` has been patched to properly quote commands."
},
{
"action": "change",
"when": "8a8b54523addf46dfd50ef599761a81bc22362e6",
"short": "[rh:requests] Add handler for `requests` HTTP library (#3668)\n\n\tAdds support for HTTPS proxies and persistent connections (keep-alive)",
"authors": ["bashonly", "coletdjnz", "Grub4K"]
},
{
"action": "add",
"when": "1d03633c5a1621b9f3a756f0a4f9dc61fab3aeaa",
"short": "[priority] **The release channels have been adjusted!**\n\t* [`master`](https://github.com/yt-dlp/yt-dlp-master-builds) builds are made after each push, containing the latest fixes (but also possibly bugs). This was previously the `nightly` channel.\n\t* [`nightly`](https://github.com/yt-dlp/yt-dlp-nightly-builds) builds are now made once a day, if there were any changes."
},
{
"action": "add",
"when": "f04b5bedad7b281bee9814686bba1762bae092eb",
"short": "[priority] Security: [[CVE-2023-46121](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-46121)] Patch [Generic Extractor MITM Vulnerability via Arbitrary Proxy Injection](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-3ch3-jhc6-5r8x)\n\t- Disallow smuggling of arbitrary `http_headers`; extractors now only use specific headers"
},
{
"action": "change",
"when": "15f22b4880b6b3f71f350c64d70976ae65b9f1ca",
"short": "[webvtt] Allow spaces before newlines for CueBlock (#7681)",
"authors": ["TSRBerry"]
},
{
"action": "change",
"when": "4ce57d3b873c2887814cbec03d029533e82f7db5",
"short": "[ie] Support multi-period MPD streams (#6654)",
"authors": ["alard", "pukkandan"]
},
{
"action": "change",
"when": "aa7e9ae4f48276bd5d0173966c77db9484f65a0a",
"short": "[ie/xvideos] Support new URL format (#9502)",
"authors": ["sta1us"]
},
{
"action": "remove",
"when": "22e4dfacb61f62dfbb3eb41b31c7b69ba1059b80"
},
{
"action": "change",
"when": "e3a3ed8a981d9395c4859b6ef56cd02bc3148db2",
"short": "[cleanup:ie] No `from` stdlib imports in extractors",
"authors": ["pukkandan"]
},
{
"action": "add",
"when": "9590cc6b4768e190183d7d071a6c78170889116a",
"short": "[priority] Security: [[CVE-2024-22423](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-22423)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-hjq6-52gw-2g7p)\n - The shell escape function now properly escapes `%`, `\\` and `\\n`.\n - `utils.Popen` has been patched accordingly."
},
{
"action": "change",
"when": "41ba4a808b597a3afed78c89675a30deb6844450",
"short": "[ie/tiktok] Extract via mobile API only if extractor-arg is passed (#9938)",
"authors": ["bashonly"]
},
{
"action": "remove",
"when": "6e36d17f404556f0e3a43f441c477a71a91877d9"
},
{
"action": "change",
"when": "beaf832c7a9d57833f365ce18f6115b88071b296",
"short": "[ie/soundcloud] Add `formats` extractor-arg (#10004)",
"authors": ["bashonly", "Grub4K"]
},
{
"action": "change",
"when": "5c019f6328ad40d66561eac3c4de0b3cd070d0f6",
"short": "[cleanup] Misc (#9765)",
"authors": ["bashonly", "Grub4K", "seproDev"]
}
]

View File

@ -1,96 +0,0 @@
{
"$schema": "http://json-schema.org/draft/2020-12/schema",
"type": "array",
"uniqueItems": true,
"items": {
"type": "object",
"oneOf": [
{
"type": "object",
"properties": {
"action": {
"enum": [
"add"
]
},
"when": {
"type": "string",
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
},
"hash": {
"type": "string",
"pattern": "^[0-9a-f]{40}$"
},
"short": {
"type": "string"
},
"authors": {
"type": "array",
"items": {
"type": "string"
}
}
},
"required": [
"action",
"short"
]
},
{
"type": "object",
"properties": {
"action": {
"enum": [
"remove"
]
},
"when": {
"type": "string",
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
},
"hash": {
"type": "string",
"pattern": "^[0-9a-f]{40}$"
}
},
"required": [
"action",
"hash"
]
},
{
"type": "object",
"properties": {
"action": {
"enum": [
"change"
]
},
"when": {
"type": "string",
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
},
"hash": {
"type": "string",
"pattern": "^[0-9a-f]{40}$"
},
"short": {
"type": "string"
},
"authors": {
"type": "array",
"items": {
"type": "string"
}
}
},
"required": [
"action",
"hash",
"short",
"authors"
]
}
]
}
}
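
The schema above constrains entries like those in the previous override file. A minimal sketch of checking an entry against it with the third-party jsonschema package; the package, the devscripts/ file paths and the entry itself are assumptions, not part of the diff:

#!/usr/bin/env python3
# Sketch: validate one changelog-override entry against the schema.
# Assumes `pip install jsonschema` and the devscripts/ layout of the yt-dlp repository; the entry is made up.
import json
from pathlib import Path

import jsonschema

schema = json.loads(Path('devscripts/changelog_override.schema.json').read_text(encoding='utf-8'))

entry = [{
    'action': 'add',
    'when': '2023.03.04',
    'short': '[priority] Example note shown at the top of the changelog',
}]
jsonschema.validate(instance=entry, schema=schema)  # raises ValidationError if the entry is malformed
print('entry is valid')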

View File

@ -1,4 +1,6 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
"""
This script employs a VERY basic heuristic ('porn' in webpage.lower()) to check
if we are not 'age_limit' tagging some porn site
@ -10,14 +12,11 @@
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import urllib.parse
import urllib.request
from test.helper import gettestcases
from yt_dlp.utils import compat_urllib_parse_urlparse
from yt_dlp.utils import compat_urllib_request
if len(sys.argv) > 1:
METHOD = 'LIST'
@ -28,9 +27,9 @@
for test in gettestcases():
if METHOD == 'EURISTIC':
try:
webpage = urllib.request.urlopen(test['url'], timeout=10).read()
webpage = compat_urllib_request.urlopen(test['url'], timeout=10).read()
except Exception:
print('\nFail: {}'.format(test['name']))
print('\nFail: {0}'.format(test['name']))
continue
webpage = webpage.decode('utf8', 'replace')
@ -38,9 +37,9 @@
RESULT = 'porn' in webpage.lower()
elif METHOD == 'LIST':
domain = urllib.parse.urlparse(test['url']).netloc
domain = compat_urllib_parse_urlparse(test['url']).netloc
if not domain:
print('\nFail: {}'.format(test['name']))
print('\nFail: {0}'.format(test['name']))
continue
domain = '.'.join(domain.split('.')[-2:])
@ -48,11 +47,11 @@
if RESULT and ('info_dict' not in test or 'age_limit' not in test['info_dict']
or test['info_dict']['age_limit'] != 18):
print('\nPotential missing age_limit check: {}'.format(test['name']))
print('\nPotential missing age_limit check: {0}'.format(test['name']))
elif not RESULT and ('info_dict' in test and 'age_limit' in test['info_dict']
and test['info_dict']['age_limit'] == 18):
print('\nPotential false negative: {}'.format(test['name']))
print('\nPotential false negative: {0}'.format(test['name']))
else:
sys.stdout.write('.')
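
The LIST branch above reduces every test URL to its last two domain labels before comparing against the list on stdin. The same normalisation in isolation (the example URL is only an illustration):

# Sketch of the LIST-method domain normalisation used above
import urllib.parse

def normalised_domain(url):
    netloc = urllib.parse.urlparse(url).netloc
    return '.'.join(netloc.split('.')[-2:])

print(normalised_domain('https://www.youtube.com/watch?v=BaW_jenozKc'))  # -> youtube.com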

View File

@ -1,48 +0,0 @@
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import yt_dlp
import yt_dlp.options
create_parser = yt_dlp.options.create_parser
def parse_patched_options(opts):
patched_parser = create_parser()
patched_parser.defaults.update({
'ignoreerrors': False,
'retries': 0,
'fragment_retries': 0,
'extract_flat': False,
'concat_playlist': 'never',
})
yt_dlp.options.create_parser = lambda: patched_parser
try:
return yt_dlp.parse_options(opts)
finally:
yt_dlp.options.create_parser = create_parser
default_opts = parse_patched_options([]).ydl_opts
def cli_to_api(opts, cli_defaults=False):
opts = (yt_dlp.parse_options if cli_defaults else parse_patched_options)(opts).ydl_opts
diff = {k: v for k, v in opts.items() if default_opts[k] != v}
if 'postprocessors' in diff:
diff['postprocessors'] = [pp for pp in diff['postprocessors']
if pp not in default_opts['postprocessors']]
return diff
if __name__ == '__main__':
from pprint import pprint
print('\nThe arguments passed translate to:\n')
pprint(cli_to_api(sys.argv[1:]))
print('\nCombining these with the CLI defaults gives:\n')
pprint(cli_to_api(sys.argv[1:], True))
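
Besides the CLI entry point above, the helper can be reused from Python. A minimal sketch, assuming the repository root is on sys.path so that devscripts.cli_to_api is importable; the flags and URL are illustrative:

#!/usr/bin/env python3
# Sketch: translate CLI flags into YoutubeDL params programmatically.
# Assumes the repo root is on sys.path so `devscripts.cli_to_api` is importable; flags/URL are illustrative.
import yt_dlp
from devscripts.cli_to_api import cli_to_api

ydl_opts = cli_to_api(['-f', 'bestaudio', '--write-subs'])
print(ydl_opts)  # only the options that differ from the library defaults

with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])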

View File

@ -0,0 +1,112 @@
# Unused
#!/usr/bin/env python3
from __future__ import unicode_literals
import io
import json
import mimetypes
import netrc
import optparse
import os
import re
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from yt_dlp.compat import (
compat_basestring,
compat_getpass,
compat_print,
compat_urllib_request,
)
from yt_dlp.utils import (
make_HTTPS_handler,
sanitized_Request,
)
class GitHubReleaser(object):
_API_URL = 'https://api.github.com/repos/ytdl-org/youtube-dl/releases'
_UPLOADS_URL = 'https://uploads.github.com/repos/ytdl-org/youtube-dl/releases/%s/assets?name=%s'
_NETRC_MACHINE = 'github.com'
def __init__(self, debuglevel=0):
self._init_github_account()
https_handler = make_HTTPS_handler({}, debuglevel=debuglevel)
self._opener = compat_urllib_request.build_opener(https_handler)
def _init_github_account(self):
try:
info = netrc.netrc().authenticators(self._NETRC_MACHINE)
if info is not None:
self._token = info[2]
compat_print('Using GitHub credentials found in .netrc...')
return
else:
compat_print('No GitHub credentials found in .netrc')
except (IOError, netrc.NetrcParseError):
compat_print('Unable to parse .netrc')
self._token = compat_getpass(
'Type your GitHub PAT (personal access token) and press [Return]: ')
def _call(self, req):
if isinstance(req, compat_basestring):
req = sanitized_Request(req)
req.add_header('Authorization', 'token %s' % self._token)
response = self._opener.open(req).read().decode('utf-8')
return json.loads(response)
def list_releases(self):
return self._call(self._API_URL)
def create_release(self, tag_name, name=None, body='', draft=False, prerelease=False):
data = {
'tag_name': tag_name,
'target_commitish': 'master',
'name': name,
'body': body,
'draft': draft,
'prerelease': prerelease,
}
req = sanitized_Request(self._API_URL, json.dumps(data).encode('utf-8'))
return self._call(req)
def create_asset(self, release_id, asset):
asset_name = os.path.basename(asset)
url = self._UPLOADS_URL % (release_id, asset_name)
# Our files are small enough to be loaded directly into memory.
data = open(asset, 'rb').read()
req = sanitized_Request(url, data)
mime_type, _ = mimetypes.guess_type(asset_name)
req.add_header('Content-Type', mime_type or 'application/octet-stream')
return self._call(req)
def main():
parser = optparse.OptionParser(usage='%prog CHANGELOG VERSION BUILDPATH')
options, args = parser.parse_args()
if len(args) != 3:
parser.error('Expected a version and a build directory')
changelog_file, version, build_path = args
with io.open(changelog_file, encoding='utf-8') as inf:
changelog = inf.read()
mobj = re.search(r'(?s)version %s\n{2}(.+?)\n{3}' % version, changelog)
body = mobj.group(1) if mobj else ''
releaser = GitHubReleaser()
new_release = releaser.create_release(
version, name='yt-dlp %s' % version, body=body)
release_id = new_release['id']
for asset in os.listdir(build_path):
compat_print('Uploading %s...' % asset)
releaser.create_asset(release_id, os.path.join(build_path, asset))
if __name__ == '__main__':
main()

View File

@ -1,14 +1,12 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from __future__ import unicode_literals
import optparse
import os
from os.path import dirname as dirn
import sys
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
import yt_dlp
from yt_dlp.utils import shell_quote
@ -48,5 +46,5 @@ def build_completion(opt_parser):
f.write(filled_template)
parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
parser = yt_dlp.parseOpts()[0]
build_completion(parser)

View File

@ -1,17 +1,15 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from __future__ import unicode_literals
import codecs
import subprocess
from yt_dlp.aes import aes_encrypt, key_expansion
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from yt_dlp.utils import intlist_to_bytes
from yt_dlp.aes import aes_encrypt, key_expansion
secret_msg = b'Secret message goes here'

View File

@ -0,0 +1,43 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import json
import sys
import hashlib
import os.path
if len(sys.argv) <= 1:
print('Specify the version number as parameter')
sys.exit()
version = sys.argv[1]
with open('update/LATEST_VERSION', 'w') as f:
f.write(version)
versions_info = json.load(open('update/versions.json'))
if 'signature' in versions_info:
del versions_info['signature']
new_version = {}
filenames = {
'bin': 'yt-dlp',
'exe': 'yt-dlp.exe',
'tar': 'yt-dlp-%s.tar.gz' % version}
build_dir = os.path.join('..', '..', 'build', version)
for key, filename in filenames.items():
url = 'https://yt-dl.org/downloads/%s/%s' % (version, filename)
fn = os.path.join(build_dir, filename)
with open(fn, 'rb') as f:
data = f.read()
if not data:
raise ValueError('File %s is empty!' % fn)
sha256sum = hashlib.sha256(data).hexdigest()
new_version[key] = (url, sha256sum)
versions_info['versions'][version] = new_version
versions_info['latest'] = version
with open('update/versions.json', 'w') as jsonf:
json.dump(versions_info, jsonf, indent=4, sort_keys=True)
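
The tuples written above are (url, sha256sum) pairs keyed by artifact type. A minimal sketch of re-checking a downloaded binary against them; the local filename 'yt-dlp' is an assumption:

#!/usr/bin/env python3
# Sketch: verify a downloaded artifact against the sha256 recorded above ('yt-dlp' filename is an assumption).
import hashlib
import json

versions_info = json.load(open('update/versions.json'))
url, expected = versions_info['versions'][versions_info['latest']]['bin']

with open('yt-dlp', 'rb') as f:
    actual = hashlib.sha256(f.read()).hexdigest()

if actual != expected:
    raise SystemExit('checksum mismatch: %s != %s' % (actual, expected))
print('checksum OK for %s' % url)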

View File

@ -0,0 +1,22 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import json
versions_info = json.load(open('update/versions.json'))
version = versions_info['latest']
version_dict = versions_info['versions'][version]
# Read template page
with open('download.html.in', 'r', encoding='utf-8') as tmplf:
template = tmplf.read()
template = template.replace('@PROGRAM_VERSION@', version)
template = template.replace('@PROGRAM_URL@', version_dict['bin'][0])
template = template.replace('@PROGRAM_SHA256SUM@', version_dict['bin'][1])
template = template.replace('@EXE_URL@', version_dict['exe'][0])
template = template.replace('@EXE_SHA256SUM@', version_dict['exe'][1])
template = template.replace('@TAR_URL@', version_dict['tar'][0])
template = template.replace('@TAR_SHA256SUM@', version_dict['tar'][1])
with open('download.html', 'w', encoding='utf-8') as dlf:
dlf.write(template)

View File

@ -0,0 +1,34 @@
#!/usr/bin/env python3
from __future__ import unicode_literals, with_statement
import rsa
import json
from binascii import hexlify
try:
input = raw_input
except NameError:
pass
versions_info = json.load(open('update/versions.json'))
if 'signature' in versions_info:
del versions_info['signature']
print('Enter the PKCS1 private key, followed by a blank line:')
privkey = b''
while True:
try:
line = input()
except EOFError:
break
if line == '':
break
privkey += line.encode('ascii') + b'\n'
privkey = rsa.PrivateKey.load_pkcs1(privkey)
signature = hexlify(rsa.pkcs1.sign(json.dumps(versions_info, sort_keys=True).encode('utf-8'), privkey, 'SHA-256')).decode()
print('signature: ' + signature)
versions_info['signature'] = signature
with open('update/versions.json', 'w') as versionsf:
json.dump(versions_info, versionsf, indent=4, sort_keys=True)
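
The hex signature stored above covers the canonical JSON dump without the 'signature' field and can be checked with the matching public key. A sketch of the verifying side; the update/updates_key.pub path is an assumption (the real consumer may embed the key instead):

#!/usr/bin/env python3
# Sketch: verify the signature produced above; the public-key path is an assumption.
import json
from binascii import unhexlify

import rsa

versions_info = json.load(open('update/versions.json'))
signature = unhexlify(versions_info.pop('signature'))

with open('update/updates_key.pub', 'rb') as f:
    pubkey = rsa.PublicKey.load_pkcs1(f.read())

rsa.verify(json.dumps(versions_info, sort_keys=True).encode('utf-8'), signature, pubkey)
print('signature OK')  # rsa.verify raises VerificationError on failure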

View File

@ -0,0 +1,21 @@
#!/usr/bin/env python3
# coding: utf-8
from __future__ import with_statement, unicode_literals
import datetime
import glob
import io # For Python 2 compatibility
import os
import re
year = str(datetime.datetime.now().year)
for fn in glob.glob('*.html*'):
with io.open(fn, encoding='utf-8') as f:
content = f.read()
newc = re.sub(r'(?P<copyright>Copyright © 2011-)(?P<year>[0-9]{4})', 'Copyright © 2011-' + year, content)
if content != newc:
tmpFn = fn + '.part'
with io.open(tmpFn, 'wt', encoding='utf-8') as outf:
outf.write(newc)
os.rename(tmpFn, fn)

View File

@ -0,0 +1,76 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import datetime
import io
import json
import textwrap
atom_template = textwrap.dedent("""\
<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
<link rel="self" href="http://ytdl-org.github.io/youtube-dl/update/releases.atom" />
<title>yt-dlp releases</title>
<id>https://yt-dl.org/feed/yt-dlp-updates-feed</id>
<updated>@TIMESTAMP@</updated>
@ENTRIES@
</feed>""")
entry_template = textwrap.dedent("""
<entry>
<id>https://yt-dl.org/feed/yt-dlp-updates-feed/yt-dlp-@VERSION@</id>
<title>New version @VERSION@</title>
<link href="http://ytdl-org.github.io/yt-dlp" />
<content type="xhtml">
<div xmlns="http://www.w3.org/1999/xhtml">
Downloads available at <a href="https://yt-dl.org/downloads/@VERSION@/">https://yt-dl.org/downloads/@VERSION@/</a>
</div>
</content>
<author>
<name>The yt-dlp maintainers</name>
</author>
<updated>@TIMESTAMP@</updated>
</entry>
""")
now = datetime.datetime.now()
now_iso = now.isoformat() + 'Z'
atom_template = atom_template.replace('@TIMESTAMP@', now_iso)
versions_info = json.load(open('update/versions.json'))
versions = list(versions_info['versions'].keys())
versions.sort()
entries = []
for v in versions:
fields = v.split('.')
year, month, day = map(int, fields[:3])
faked = 0
patchlevel = 0
while True:
try:
datetime.date(year, month, day)
except ValueError:
day -= 1
faked += 1
assert day > 0
continue
break
if len(fields) >= 4:
try:
patchlevel = int(fields[3])
except ValueError:
patchlevel = 1
timestamp = '%04d-%02d-%02dT00:%02d:%02dZ' % (year, month, day, faked, patchlevel)
entry = entry_template.replace('@TIMESTAMP@', timestamp)
entry = entry.replace('@VERSION@', v)
entries.append(entry)
entries_str = textwrap.indent(''.join(entries), '\t')
atom_template = atom_template.replace('@ENTRIES@', entries_str)
with io.open('update/releases.atom', 'w', encoding='utf-8') as atom_file:
atom_file.write(atom_template)

View File

@ -0,0 +1,37 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import sys
import os
import textwrap
# We must be able to import yt_dlp
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
import yt_dlp
def main():
with open('supportedsites.html.in', 'r', encoding='utf-8') as tmplf:
template = tmplf.read()
ie_htmls = []
for ie in yt_dlp.list_extractors(age_limit=None):
ie_html = '<b>{}</b>'.format(ie.IE_NAME)
ie_desc = getattr(ie, 'IE_DESC', None)
if ie_desc is False:
continue
elif ie_desc is not None:
ie_html += ': {}'.format(ie.IE_DESC)
if not ie.working():
ie_html += ' (Currently broken)'
ie_htmls.append('<li>{}</li>'.format(ie_html))
template = template.replace('@SITES@', textwrap.indent('\n'.join(ie_htmls), '\t'))
with open('supportedsites.html', 'w', encoding='utf-8') as sitesf:
sitesf.write(template)
if __name__ == '__main__':
main()

View File

@ -1,81 +0,0 @@
#!/usr/bin/env python3
# Allow execution from anywhere
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import argparse
import re
import subprocess
from pathlib import Path
from devscripts.tomlparse import parse_toml
from devscripts.utils import read_file
def parse_args():
parser = argparse.ArgumentParser(description='Install dependencies for yt-dlp')
parser.add_argument(
'input', nargs='?', metavar='TOMLFILE', default=Path(__file__).parent.parent / 'pyproject.toml',
help='input file (default: %(default)s)')
parser.add_argument(
'-e', '--exclude', metavar='DEPENDENCY', action='append',
help='exclude a dependency')
parser.add_argument(
'-i', '--include', metavar='GROUP', action='append',
help='include an optional dependency group')
parser.add_argument(
'-o', '--only-optional', action='store_true',
help='only install optional dependencies')
parser.add_argument(
'-p', '--print', action='store_true',
help='only print requirements to stdout')
parser.add_argument(
'-u', '--user', action='store_true',
help='install with pip as --user')
return parser.parse_args()
def main():
args = parse_args()
project_table = parse_toml(read_file(args.input))['project']
recursive_pattern = re.compile(rf'{project_table["name"]}\[(?P<group_name>[\w-]+)\]')
optional_groups = project_table['optional-dependencies']
excludes = args.exclude or []
def yield_deps(group):
for dep in group:
if mobj := recursive_pattern.fullmatch(dep):
yield from optional_groups.get(mobj.group('group_name'), [])
else:
yield dep
targets = []
if not args.only_optional: # `-o` should exclude 'dependencies' and the 'default' group
targets.extend(project_table['dependencies'])
if 'default' not in excludes: # `--exclude default` should exclude entire 'default' group
targets.extend(yield_deps(optional_groups['default']))
for include in filter(None, map(optional_groups.get, args.include or [])):
targets.extend(yield_deps(include))
targets = [t for t in targets if re.match(r'[\w-]+', t).group(0).lower() not in excludes]
if args.print:
for target in targets:
print(target)
return
pip_args = [sys.executable, '-m', 'pip', 'install', '-U']
if args.user:
pip_args.append('--user')
pip_args.extend(targets)
return subprocess.call(pip_args)
if __name__ == '__main__':
sys.exit(main())

View File

@ -1,39 +1,31 @@
import importlib
import random
# coding: utf-8
import re
from ..utils import (
age_restricted,
bug_reports_message,
classproperty,
variadic,
write_string,
)
# These bloat the lazy_extractors, so allow them to passthrough silently
ALLOWED_CLASSMETHODS = {'extract_from_webpage', 'get_testcases', 'get_webpage_testcases'}
_WARNED = False
from ..utils import bug_reports_message, write_string
class LazyLoadMetaClass(type):
def __getattr__(cls, name):
global _WARNED
if ('_real_class' not in cls.__dict__
and name not in ALLOWED_CLASSMETHODS and not _WARNED):
_WARNED = True
write_string('WARNING: Falling back to normal extractor since lazy extractor '
f'{cls.__name__} does not have attribute {name}{bug_reports_message()}\n')
return getattr(cls.real_class, name)
if '_real_class' not in cls.__dict__:
write_string(
f'WARNING: Falling back to normal extractor since lazy extractor '
f'{cls.__name__} does not have attribute {name}{bug_reports_message()}')
return getattr(cls._get_real_class(), name)
class LazyLoadExtractor(metaclass=LazyLoadMetaClass):
@classproperty
def real_class(cls):
_module = None
_WORKING = True
@classmethod
def _get_real_class(cls):
if '_real_class' not in cls.__dict__:
cls._real_class = getattr(importlib.import_module(cls._module), cls.__name__)
mod = __import__(cls._module, fromlist=(cls.__name__,))
cls._real_class = getattr(mod, cls.__name__)
return cls._real_class
def __new__(cls, *args, **kwargs):
instance = cls.real_class.__new__(cls.real_class)
real_cls = cls._get_real_class()
instance = real_cls.__new__(real_cls)
instance.__init__(*args, **kwargs)
return instance
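
Both versions shown rely on the same idea: the metaclass intercepts attribute lookups that the stub class cannot satisfy and forwards them to the real class, importing its module on demand. A self-contained toy version of that pattern; the class and module names here are stand-ins, not real extractors:

# Toy demonstration of the lazy-load pattern above; JSONDecoder/json.decoder are stand-ins, not extractors.
import importlib

class ToyLazyMeta(type):
    def __getattr__(cls, name):
        # Only reached when normal lookup on the stub class fails:
        # import the real module and delegate to the real class of the same name.
        real = getattr(importlib.import_module(cls._module), cls.__name__)
        return getattr(real, name)

class JSONDecoder(metaclass=ToyLazyMeta):
    _module = 'json.decoder'

print(JSONDecoder.decode)  # resolved from the real json.decoder.JSONDecoder on first access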

View File

@ -1,510 +0,0 @@
from __future__ import annotations
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import enum
import itertools
import json
import logging
import re
from collections import defaultdict
from dataclasses import dataclass
from functools import lru_cache
from pathlib import Path
from devscripts.utils import read_file, run_process, write_file
BASE_URL = 'https://github.com'
LOCATION_PATH = Path(__file__).parent
HASH_LENGTH = 7
logger = logging.getLogger(__name__)
class CommitGroup(enum.Enum):
PRIORITY = 'Important'
CORE = 'Core'
EXTRACTOR = 'Extractor'
DOWNLOADER = 'Downloader'
POSTPROCESSOR = 'Postprocessor'
NETWORKING = 'Networking'
MISC = 'Misc.'
@classmethod
@lru_cache
def subgroup_lookup(cls):
return {
name: group
for group, names in {
cls.MISC: {
'build',
'ci',
'cleanup',
'devscripts',
'docs',
'test',
},
cls.NETWORKING: {
'rh',
},
}.items()
for name in names
}
@classmethod
@lru_cache
def group_lookup(cls):
result = {
'fd': cls.DOWNLOADER,
'ie': cls.EXTRACTOR,
'pp': cls.POSTPROCESSOR,
'upstream': cls.CORE,
}
result.update({item.name.lower(): item for item in iter(cls)})
return result
@classmethod
def get(cls, value: str) -> tuple[CommitGroup | None, str | None]:
group, _, subgroup = (group.strip().lower() for group in value.partition('/'))
result = cls.group_lookup().get(group)
if not result:
if subgroup:
return None, value
subgroup = group
result = cls.subgroup_lookup().get(subgroup)
return result, subgroup or None
@dataclass
class Commit:
hash: str | None
short: str
authors: list[str]
def __str__(self):
result = f'{self.short!r}'
if self.hash:
result += f' ({self.hash[:HASH_LENGTH]})'
if self.authors:
authors = ', '.join(self.authors)
result += f' by {authors}'
return result
@dataclass
class CommitInfo:
details: str | None
sub_details: tuple[str, ...]
message: str
issues: list[str]
commit: Commit
fixes: list[Commit]
def key(self):
return ((self.details or '').lower(), self.sub_details, self.message)
def unique(items):
return sorted({item.strip().lower(): item for item in items if item}.values())
class Changelog:
MISC_RE = re.compile(r'(?:^|\b)(?:lint(?:ing)?|misc|format(?:ting)?|fixes)(?:\b|$)', re.IGNORECASE)
ALWAYS_SHOWN = (CommitGroup.PRIORITY,)
def __init__(self, groups, repo, collapsible=False):
self._groups = groups
self._repo = repo
self._collapsible = collapsible
def __str__(self):
return '\n'.join(self._format_groups(self._groups)).replace('\t', ' ')
def _format_groups(self, groups):
first = True
for item in CommitGroup:
if self._collapsible and item not in self.ALWAYS_SHOWN and first:
first = False
yield '\n<details><summary><h3>Changelog</h3></summary>\n'
group = groups[item]
if group:
yield self.format_module(item.value, group)
if self._collapsible:
yield '\n</details>'
def format_module(self, name, group):
result = f'\n#### {name} changes\n' if name else '\n'
return result + '\n'.join(self._format_group(group))
def _format_group(self, group):
sorted_group = sorted(group, key=CommitInfo.key)
detail_groups = itertools.groupby(sorted_group, lambda item: (item.details or '').lower())
for _, items in detail_groups:
items = list(items)
details = items[0].details
if details == 'cleanup':
items = self._prepare_cleanup_misc_items(items)
prefix = '-'
if details:
if len(items) == 1:
prefix = f'- **{details}**:'
else:
yield f'- **{details}**'
prefix = '\t-'
sub_detail_groups = itertools.groupby(items, lambda item: tuple(map(str.lower, item.sub_details)))
for sub_details, entries in sub_detail_groups:
if not sub_details:
for entry in entries:
yield f'{prefix} {self.format_single_change(entry)}'
continue
entries = list(entries)
sub_prefix = f'{prefix} {", ".join(entries[0].sub_details)}'
if len(entries) == 1:
yield f'{sub_prefix}: {self.format_single_change(entries[0])}'
continue
yield sub_prefix
for entry in entries:
yield f'\t{prefix} {self.format_single_change(entry)}'
def _prepare_cleanup_misc_items(self, items):
cleanup_misc_items = defaultdict(list)
sorted_items = []
for item in items:
if self.MISC_RE.search(item.message):
cleanup_misc_items[tuple(item.commit.authors)].append(item)
else:
sorted_items.append(item)
for commit_infos in cleanup_misc_items.values():
sorted_items.append(CommitInfo(
'cleanup', ('Miscellaneous',), ', '.join(
self._format_message_link(None, info.commit.hash)
for info in sorted(commit_infos, key=lambda item: item.commit.hash or '')),
[], Commit(None, '', commit_infos[0].commit.authors), []))
return sorted_items
def format_single_change(self, info: CommitInfo):
message, sep, rest = info.message.partition('\n')
if '[' not in message:
# If the message doesn't already contain markdown links, try to add a link to the commit
message = self._format_message_link(message, info.commit.hash)
if info.issues:
message = f'{message} ({self._format_issues(info.issues)})'
if info.commit.authors:
message = f'{message} by {self._format_authors(info.commit.authors)}'
if info.fixes:
fix_message = ', '.join(f'{self._format_message_link(None, fix.hash)}' for fix in info.fixes)
authors = sorted({author for fix in info.fixes for author in fix.authors}, key=str.casefold)
if authors != info.commit.authors:
fix_message = f'{fix_message} by {self._format_authors(authors)}'
message = f'{message} (With fixes in {fix_message})'
return message if not sep else f'{message}{sep}{rest}'
def _format_message_link(self, message, commit_hash):
assert message or commit_hash, 'Improperly defined commit message or override'
message = message if message else commit_hash[:HASH_LENGTH]
return f'[{message}]({self.repo_url}/commit/{commit_hash})' if commit_hash else message
def _format_issues(self, issues):
return ', '.join(f'[#{issue}]({self.repo_url}/issues/{issue})' for issue in issues)
@staticmethod
def _format_authors(authors):
return ', '.join(f'[{author}]({BASE_URL}/{author})' for author in authors)
@property
def repo_url(self):
return f'{BASE_URL}/{self._repo}'
class CommitRange:
COMMAND = 'git'
COMMIT_SEPARATOR = '-----'
AUTHOR_INDICATOR_RE = re.compile(r'Authored by:? ', re.IGNORECASE)
MESSAGE_RE = re.compile(r'''
(?:\[(?P<prefix>[^\]]+)\]\ )?
(?:(?P<sub_details>`?[\w.-]+`?): )?
(?P<message>.+?)
(?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
''', re.VERBOSE | re.DOTALL)
EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
REVERT_RE = re.compile(r'(?:\[[^\]]+\]\s+)?(?i:Revert)\s+([\da-f]{40})')
FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert|Improve)\s+([\da-f]{40})')
UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')
def __init__(self, start, end, default_author=None):
self._start, self._end = start, end
self._commits, self._fixes = self._get_commits_and_fixes(default_author)
self._commits_added = []
def __iter__(self):
return iter(itertools.chain(self._commits.values(), self._commits_added))
def __len__(self):
return len(self._commits) + len(self._commits_added)
def __contains__(self, commit):
if isinstance(commit, Commit):
if not commit.hash:
return False
commit = commit.hash
return commit in self._commits
def _get_commits_and_fixes(self, default_author):
result = run_process(
self.COMMAND, 'log', f'--format=%H%n%s%n%b%n{self.COMMIT_SEPARATOR}',
f'{self._start}..{self._end}' if self._start else self._end).stdout
commits, reverts = {}, {}
fixes = defaultdict(list)
lines = iter(result.splitlines(False))
for i, commit_hash in enumerate(lines):
short = next(lines)
skip = short.startswith('Release ') or short == '[version] update'
authors = [default_author] if default_author else []
for line in iter(lambda: next(lines), self.COMMIT_SEPARATOR):
match = self.AUTHOR_INDICATOR_RE.match(line)
if match:
authors = sorted(map(str.strip, line[match.end():].split(',')), key=str.casefold)
commit = Commit(commit_hash, short, authors)
if skip and (self._start or not i):
logger.debug(f'Skipped commit: {commit}')
continue
elif skip:
logger.debug(f'Reached Release commit, breaking: {commit}')
break
revert_match = self.REVERT_RE.fullmatch(commit.short)
if revert_match:
reverts[revert_match.group(1)] = commit
continue
fix_match = self.FIXES_RE.search(commit.short)
if fix_match:
commitish = fix_match.group(1)
fixes[commitish].append(commit)
commits[commit.hash] = commit
for commitish, revert_commit in reverts.items():
reverted = commits.pop(commitish, None)
if reverted:
logger.debug(f'{commitish} fully reverted {reverted}')
else:
commits[revert_commit.hash] = revert_commit
for commitish, fix_commits in fixes.items():
if commitish in commits:
hashes = ', '.join(commit.hash[:HASH_LENGTH] for commit in fix_commits)
logger.info(f'Found fix(es) for {commitish[:HASH_LENGTH]}: {hashes}')
for fix_commit in fix_commits:
del commits[fix_commit.hash]
else:
logger.debug(f'Commit with fixes not in changes: {commitish[:HASH_LENGTH]}')
return commits, fixes
def apply_overrides(self, overrides):
for override in overrides:
when = override.get('when')
if when and when not in self and when != self._start:
logger.debug(f'Ignored {when!r} override')
continue
override_hash = override.get('hash') or when
if override['action'] == 'add':
commit = Commit(override.get('hash'), override['short'], override.get('authors') or [])
logger.info(f'ADD {commit}')
self._commits_added.append(commit)
elif override['action'] == 'remove':
if override_hash in self._commits:
logger.info(f'REMOVE {self._commits[override_hash]}')
del self._commits[override_hash]
elif override['action'] == 'change':
if override_hash not in self._commits:
continue
commit = Commit(override_hash, override['short'], override.get('authors') or [])
logger.info(f'CHANGE {self._commits[commit.hash]} -> {commit}')
self._commits[commit.hash] = commit
self._commits = dict(reversed(self._commits.items()))
def groups(self):
group_dict = defaultdict(list)
for commit in self:
upstream_re = self.UPSTREAM_MERGE_RE.search(commit.short)
if upstream_re:
commit.short = f'[upstream] Merged with youtube-dl {upstream_re.group(1)}'
match = self.MESSAGE_RE.fullmatch(commit.short)
if not match:
logger.error(f'Error parsing short commit message: {commit.short!r}')
continue
prefix, sub_details_alt, message, issues = match.groups()
issues = [issue.strip()[1:] for issue in issues.split(',')] if issues else []
if prefix:
groups, details, sub_details = zip(*map(self.details_from_prefix, prefix.split(',')))
group = next(iter(filter(None, groups)), None)
details = ', '.join(unique(details))
sub_details = list(itertools.chain.from_iterable(sub_details))
else:
group = CommitGroup.CORE
details = None
sub_details = []
if sub_details_alt:
sub_details.append(sub_details_alt)
sub_details = tuple(unique(sub_details))
if not group:
if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
group = CommitGroup.EXTRACTOR
logger.error(f'Assuming [ie] group for {commit.short!r}')
else:
group = CommitGroup.CORE
commit_info = CommitInfo(
details, sub_details, message.strip(),
issues, commit, self._fixes[commit.hash])
logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
group_dict[group].append(commit_info)
return group_dict
@staticmethod
def details_from_prefix(prefix):
if not prefix:
return CommitGroup.CORE, None, ()
prefix, *sub_details = prefix.split(':')
group, details = CommitGroup.get(prefix)
if group is CommitGroup.PRIORITY and details:
details = details.partition('/')[2].strip()
if details and '/' in details:
logger.error(f'Prefix is overnested, using first part: {prefix}')
details = details.partition('/')[0].strip()
if details == 'common':
details = None
elif group is CommitGroup.NETWORKING and details == 'rh':
details = 'Request Handler'
return group, details, sub_details
def get_new_contributors(contributors_path, commits):
contributors = set()
if contributors_path.exists():
for line in read_file(contributors_path).splitlines():
author, _, _ = line.strip().partition(' (')
authors = author.split('/')
contributors.update(map(str.casefold, authors))
new_contributors = set()
for commit in commits:
for author in commit.authors:
author_folded = author.casefold()
if author_folded not in contributors:
contributors.add(author_folded)
new_contributors.add(author)
return sorted(new_contributors, key=str.casefold)
def create_changelog(args):
logging.basicConfig(
datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)
commits = CommitRange(None, args.commitish, args.default_author)
if not args.no_override:
if args.override_path.exists():
overrides = json.loads(read_file(args.override_path))
commits.apply_overrides(overrides)
else:
logger.warning(f'File {args.override_path.as_posix()} does not exist')
logger.info(f'Loaded {len(commits)} commits')
new_contributors = get_new_contributors(args.contributors_path, commits)
if new_contributors:
if args.contributors:
write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
logger.info(f'New contributors: {", ".join(new_contributors)}')
return Changelog(commits.groups(), args.repo, args.collapsible)
def create_parser():
import argparse
parser = argparse.ArgumentParser(
description='Create a changelog markdown from a git commit range')
parser.add_argument(
'commitish', default='HEAD', nargs='?',
help='The commitish to create the range from (default: %(default)s)')
parser.add_argument(
'-v', '--verbosity', action='count', default=0,
help='increase verbosity (can be used twice)')
parser.add_argument(
'-c', '--contributors', action='store_true',
help='update CONTRIBUTORS file (default: %(default)s)')
parser.add_argument(
'--contributors-path', type=Path, default=LOCATION_PATH.parent / 'CONTRIBUTORS',
help='path to the CONTRIBUTORS file')
parser.add_argument(
'--no-override', action='store_true',
help='skip override json in commit generation (default: %(default)s)')
parser.add_argument(
'--override-path', type=Path, default=LOCATION_PATH / 'changelog_override.json',
help='path to the changelog_override.json file')
parser.add_argument(
'--default-author', default='pukkandan',
help='the author to use without a author indicator (default: %(default)s)')
parser.add_argument(
'--repo', default='yt-dlp/yt-dlp',
help='the github repository to use for the operations (default: %(default)s)')
parser.add_argument(
'--collapsible', action='store_true',
help='make changelog collapsible (default: %(default)s)')
return parser
if __name__ == '__main__':
print(create_changelog(create_parser().parse_args()))
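
A minimal sketch of how a typical short commit message is decomposed by MESSAGE_RE and routed by CommitGroup.get(), assuming the repo root is on sys.path so the script is importable as devscripts.make_changelog; the commit message itself is made up:

#!/usr/bin/env python3
# Sketch: parse one short commit message the way CommitRange.groups() does.
# Assumes devscripts.make_changelog is importable from the repo root; the message is made up.
from devscripts.make_changelog import CommitGroup, CommitRange

match = CommitRange.MESSAGE_RE.fullmatch('[ie/youtube] Fix nsig extraction (#1234, #5678)')
prefix, sub_details, message, issues = match.groups()

print(prefix)   # ie/youtube
print(message)  # Fix nsig extraction
print(issues)   # #1234, #5678
print(CommitGroup.get('ie/youtube'))  # (<CommitGroup.EXTRACTOR: 'Extractor'>, 'youtube')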

View File

@ -1,5 +1,7 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import io
import optparse
import re
@ -14,7 +16,7 @@ def main():
infile, outfile = args
with open(infile, encoding='utf-8') as inf:
with io.open(infile, encoding='utf-8') as inf:
readme = inf.read()
bug_text = re.search(
@ -24,7 +26,7 @@ def main():
out = bug_text + dev_text
with open(outfile, 'w', encoding='utf-8') as outf:
with io.open(outfile, 'w', encoding='utf-8') as outf:
outf.write(out)

View File

@ -1,72 +1,29 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import re
from devscripts.utils import get_filename_args, read_file, write_file
VERBOSE_TMPL = '''
- type: checkboxes
id: verbose
attributes:
label: Provide verbose output that clearly demonstrates the problem
options:
- label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
required: true
- label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
required: false
- label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
required: true
- type: textarea
id: log
attributes:
label: Complete Verbose Output
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:
required: true
'''.strip()
NO_SKIP = '''
- type: checkboxes
attributes:
label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
description: Fill all fields even if you think it is irrelevant for the issue
options:
- label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field
required: true
'''.strip()
import io
import optparse
def main():
fields = {'no_skip': NO_SKIP}
fields['verbose'] = VERBOSE_TMPL % fields
fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])
parser = optparse.OptionParser(usage='%prog INFILE OUTFILE')
options, args = parser.parse_args()
if len(args) != 2:
parser.error('Expected an input and an output filename')
infile, outfile = get_filename_args(has_infile=True)
write_file(outfile, read_file(infile) % fields)
infile, outfile = args
with io.open(infile, encoding='utf-8') as inf:
issue_template_tmpl = inf.read()
# Get the version from yt_dlp/version.py without importing the package
exec(compile(open('yt_dlp/version.py').read(),
'yt_dlp/version.py', 'exec'))
out = issue_template_tmpl % {'version': locals()['__version__']}
with io.open(outfile, 'w', encoding='utf-8') as outf:
outf.write(out)
if __name__ == '__main__':
main()

View File

@ -1,132 +1,105 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import shutil
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from __future__ import unicode_literals, print_function
from inspect import getsource
import io
import os
from os.path import dirname as dirn
import sys
from devscripts.utils import get_filename_args, read_file, write_file
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
NO_ATTR = object()
STATIC_CLASS_PROPERTIES = [
'IE_NAME', '_ENABLED', '_VALID_URL', # Used for URL matching
'_WORKING', 'IE_DESC', '_NETRC_MACHINE', 'SEARCH_KEY', # Used for --extractor-descriptions
'age_limit', # Used for --age-limit (evaluated)
'_RETURN_TYPE', # Accessed in CLI only with instance (evaluated)
]
CLASS_METHODS = [
'ie_key', 'suitable', '_match_valid_url', # Used for URL matching
'working', 'get_temp_id', '_match_id', # Accessed just before instance creation
'description', # Used for --extractor-descriptions
'is_suitable', # Used for --age-limit
'supports_login', 'is_single_video', # Accessed in CLI only with instance
]
IE_TEMPLATE = '''
lazy_extractors_filename = sys.argv[1] if len(sys.argv) > 1 else 'yt_dlp/extractor/lazy_extractors.py'
if os.path.exists(lazy_extractors_filename):
os.remove(lazy_extractors_filename)
# Block plugins from loading
plugins_dirname = 'ytdlp_plugins'
plugins_blocked_dirname = 'ytdlp_plugins_blocked'
if os.path.exists(plugins_dirname):
os.rename(plugins_dirname, plugins_blocked_dirname)
from yt_dlp.extractor import _ALL_CLASSES
from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
if os.path.exists(plugins_blocked_dirname):
os.rename(plugins_blocked_dirname, plugins_dirname)
with open('devscripts/lazy_load_template.py', 'rt') as f:
module_template = f.read()
CLASS_PROPERTIES = ['ie_key', 'working', '_match_valid_url', 'suitable', '_match_id', 'get_temp_id']
module_contents = [
module_template,
*[getsource(getattr(InfoExtractor, k)) for k in CLASS_PROPERTIES],
'\nclass LazyLoadSearchExtractor(LazyLoadExtractor):\n pass\n']
ie_template = '''
class {name}({bases}):
_module = {module!r}
_module = '{module}'
'''
MODULE_TEMPLATE = read_file('devscripts/lazy_load_template.py')
def main():
lazy_extractors_filename = get_filename_args(default_outfile='yt_dlp/extractor/lazy_extractors.py')
if os.path.exists(lazy_extractors_filename):
os.remove(lazy_extractors_filename)
_ALL_CLASSES = get_all_ies() # Must be before import
import yt_dlp.plugins
from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor
# Filter out plugins
_ALL_CLASSES = [cls for cls in _ALL_CLASSES if not cls.__module__.startswith(f'{yt_dlp.plugins.PACKAGE_NAME}.')]
DummyInfoExtractor = type('InfoExtractor', (InfoExtractor,), {'IE_NAME': NO_ATTR})
module_src = '\n'.join((
MODULE_TEMPLATE,
' _module = None',
*extra_ie_code(DummyInfoExtractor),
'\nclass LazyLoadSearchExtractor(LazyLoadExtractor):\n pass\n',
*build_ies(_ALL_CLASSES, (InfoExtractor, SearchInfoExtractor), DummyInfoExtractor),
))
write_file(lazy_extractors_filename, f'{module_src}\n')
def get_base_name(base):
if base is InfoExtractor:
return 'LazyLoadExtractor'
elif base is SearchInfoExtractor:
return 'LazyLoadSearchExtractor'
else:
return base.__name__
def get_all_ies():
PLUGINS_DIRNAME = 'ytdlp_plugins'
BLOCKED_DIRNAME = f'{PLUGINS_DIRNAME}_blocked'
if os.path.exists(PLUGINS_DIRNAME):
# os.rename cannot be used, e.g. in Docker. See https://github.com/yt-dlp/yt-dlp/pull/4958
shutil.move(PLUGINS_DIRNAME, BLOCKED_DIRNAME)
try:
from yt_dlp.extractor.extractors import _ALL_CLASSES
finally:
if os.path.exists(BLOCKED_DIRNAME):
shutil.move(BLOCKED_DIRNAME, PLUGINS_DIRNAME)
return _ALL_CLASSES
def build_lazy_ie(ie, name):
s = ie_template.format(
name=name,
bases=', '.join(map(get_base_name, ie.__bases__)),
module=ie.__module__)
valid_url = getattr(ie, '_VALID_URL', None)
if not valid_url and hasattr(ie, '_make_valid_url'):
valid_url = ie._make_valid_url()
if valid_url:
s += f' _VALID_URL = {valid_url!r}\n'
if not ie._WORKING:
s += ' _WORKING = False\n'
if ie.suitable.__func__ is not InfoExtractor.suitable.__func__:
s += f'\n{getsource(ie.suitable)}'
return s
def extra_ie_code(ie, base=None):
for var in STATIC_CLASS_PROPERTIES:
val = getattr(ie, var)
if val != (getattr(base, var) if base else NO_ATTR):
yield f' {var} = {val!r}'
yield ''
# find the correct sorting and add the required base classes so that subclasses
# can be correctly created
classes = _ALL_CLASSES[:-1]
ordered_cls = []
while classes:
for c in classes[:]:
bases = set(c.__bases__) - set((object, InfoExtractor, SearchInfoExtractor))
stop = False
for b in bases:
if b not in classes and b not in ordered_cls:
if b.__name__ == 'GenericIE':
exit()
classes.insert(0, b)
stop = True
if stop:
break
if all(b in ordered_cls for b in bases):
ordered_cls.append(c)
classes.remove(c)
break
ordered_cls.append(_ALL_CLASSES[-1])
for name in CLASS_METHODS:
f = getattr(ie, name)
if not base or f.__func__ != getattr(base, name).__func__:
yield getsource(f)
names = []
for ie in ordered_cls:
name = ie.__name__
src = build_lazy_ie(ie, name)
module_contents.append(src)
if ie in _ALL_CLASSES:
names.append(name)
module_contents.append(
'\n_ALL_CLASSES = [{0}]'.format(', '.join(names)))
def build_ies(ies, bases, attr_base):
names = []
for ie in sort_ies(ies, bases):
yield build_lazy_ie(ie, ie.__name__, attr_base)
if ie in ies:
names.append(ie.__name__)
module_src = '\n'.join(module_contents) + '\n'
yield f'\n_ALL_CLASSES = [{", ".join(names)}]'
def sort_ies(ies, ignored_bases):
"""find the correct sorting and add the required base classes so that subclasses can be correctly created"""
classes, returned_classes = ies[:-1], set()
assert ies[-1].__name__ == 'GenericIE', 'Last IE must be GenericIE'
while classes:
for c in classes[:]:
bases = set(c.__bases__) - {object, *ignored_bases}
restart = False
for b in sorted(bases, key=lambda x: x.__name__):
if b not in classes and b not in returned_classes:
assert b.__name__ != 'GenericIE', 'Cannot inherit from GenericIE'
classes.insert(0, b)
restart = True
if restart:
break
if bases <= returned_classes:
yield c
returned_classes.add(c)
classes.remove(c)
break
yield ies[-1]
def build_lazy_ie(ie, name, attr_base):
bases = ', '.join({
'InfoExtractor': 'LazyLoadExtractor',
'SearchInfoExtractor': 'LazyLoadSearchExtractor',
}.get(base.__name__, base.__name__) for base in ie.__bases__)
s = IE_TEMPLATE.format(name=name, module=ie.__module__, bases=bases)
return s + '\n'.join(extra_ie_code(ie, attr_base))
if __name__ == '__main__':
main()
with io.open(lazy_extractors_filename, 'wt', encoding='utf-8') as f:
f.write(module_src)
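For orientation, each entry that the template and extra_ie_code() above emit into lazy_extractors.py has roughly the following shape (an illustrative sketch, not verbatim output; real entries also carry _VALID_URL and any overridden suitable()):

class YoutubeIE(LazyLoadExtractor):
    _module = 'yt_dlp.extractor.youtube'
    IE_NAME = 'youtube'

_ALL_CLASSES = [YoutubeIE]  # the real list names every extractor and ends with GenericIE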

@ -1,93 +1,31 @@
#!/usr/bin/env python3
"""
yt-dlp --help | make_readme.py
This must be run in a console of correct width
"""
# yt-dlp --help | make_readme.py
# This must be run in a console of correct width
# Allow direct execution
import os
from __future__ import unicode_literals
import io
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import functools
import re
from devscripts.utils import read_file, write_file
README_FILE = 'README.md'
helptext = sys.stdin.read()
OPTIONS_START = 'General Options:'
OPTIONS_END = 'CONFIGURATION'
EPILOG_START = 'See full documentation'
ALLOWED_OVERSHOOT = 2
if isinstance(helptext, bytes):
helptext = helptext.decode('utf-8')
DISABLE_PATCH = object()
with io.open(README_FILE, encoding='utf-8') as f:
oldreadme = f.read()
header = oldreadme[:oldreadme.index('## General Options:')]
footer = oldreadme[oldreadme.index('# CONFIGURATION'):]
def take_section(text, start=None, end=None, *, shift=0):
return text[
text.index(start) + shift if start else None:
text.index(end) + shift if end else None
]
options = helptext[helptext.index(' General Options:'):]
options = re.sub(r'(?m)^ (\w.+)$', r'## \1', options)
options = options + '\n'
def apply_patch(text, patch):
return text if patch[0] is DISABLE_PATCH else re.sub(*patch, text)
options = take_section(sys.stdin.read(), f'\n {OPTIONS_START}', f'\n{EPILOG_START}', shift=1)
max_width = max(map(len, options.split('\n')))
switch_col_width = len(re.search(r'(?m)^\s{5,}', options).group())
delim = f'\n{" " * switch_col_width}'
PATCHES = (
( # Standardize `--update` message
r'(?m)^( -U, --update\s+).+(\n \s.+)*$',
r'\1Update this program to the latest version',
),
( # Headings
r'(?m)^ (\w.+\n)( (?=\w))?',
r'## \1',
),
( # Fixup `--date` formatting
rf'(?m)( --date DATE.+({delim}[^\[]+)*)\[.+({delim}.+)*$',
(rf'\1[now|today|yesterday][-N[day|week|month|year]].{delim}'
f'E.g. "--date today-2weeks" downloads only{delim}'
'videos uploaded on the same day two weeks ago'),
),
( # Do not split URLs
rf'({delim[:-1]})? (?P<label>\[\S+\] )?(?P<url>https?({delim})?:({delim})?/({delim})?/(({delim})?\S+)+)\s',
lambda mobj: ''.join((delim, mobj.group('label') or '', re.sub(r'\s+', '', mobj.group('url')), '\n')),
),
( # Do not split "words"
rf'(?m)({delim}\S+)+$',
lambda mobj: ''.join((delim, mobj.group(0).replace(delim, ''))),
),
( # Allow overshooting last line
rf'(?m)^(?P<prev>.+)${delim}(?P<current>.+)$(?!{delim})',
lambda mobj: (mobj.group().replace(delim, ' ')
if len(mobj.group()) - len(delim) + 1 <= max_width + ALLOWED_OVERSHOOT
else mobj.group()),
),
( # Avoid newline when a space is available b/w switch and description
DISABLE_PATCH, # This creates issues with prepare_manpage
r'(?m)^(\s{4}-.{%d})(%s)' % (switch_col_width - 6, delim),
r'\1 ',
),
( # Replace brackets with a Markdown link
r'SponsorBlock API \((http.+)\)',
r'[SponsorBlock API](\1)',
),
)
readme = read_file(README_FILE)
write_file(README_FILE, ''.join((
take_section(readme, end=f'## {OPTIONS_START}'),
functools.reduce(apply_patch, PATCHES, options),
take_section(readme, f'# {OPTIONS_END}'),
)))
with io.open(README_FILE, 'w', encoding='utf-8') as f:
f.write(header)
f.write(options)
f.write(footer)
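As a standalone sketch of how the PATCHES/apply_patch machinery above works (toy pattern and toy input, not the real README text):

import functools
import re

DISABLE_PATCH = object()

def apply_patch(text, patch):
    # A patch whose first element is the DISABLE_PATCH sentinel is skipped
    return text if patch[0] is DISABLE_PATCH else re.sub(*patch, text)

PATCHES = (
    (r'(?m)^  (\w.+)$', r'## \1'),          # turn section headings into markdown headings
    (DISABLE_PATCH, r'unused', r'unused'),  # disabled entry, left in place
)
print(functools.reduce(apply_patch, PATCHES, '  General Options:\n  -h  show help'))
# prints '## General Options:' followed by the unchanged option line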

@ -1,19 +1,48 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
# Allow direct execution
import io
import optparse
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from devscripts.utils import get_filename_args, write_file
from yt_dlp.extractor import list_extractor_classes
# Import yt_dlp
ROOT_DIR = os.path.join(os.path.dirname(__file__), '..')
sys.path.insert(0, ROOT_DIR)
import yt_dlp
def main():
out = '\n'.join(ie.description() for ie in list_extractor_classes() if ie.IE_DESC is not False)
write_file(get_filename_args(), f'# Supported sites\n{out}\n')
parser = optparse.OptionParser(usage='%prog OUTFILE.md')
options, args = parser.parse_args()
if len(args) != 1:
parser.error('Expected an output filename')
outfile, = args
def gen_ies_md(ies):
for ie in ies:
ie_md = '**{0}**'.format(ie.IE_NAME)
ie_desc = getattr(ie, 'IE_DESC', None)
if ie_desc is False:
continue
if ie_desc is not None:
ie_md += ': {0}'.format(ie.IE_DESC)
search_key = getattr(ie, 'SEARCH_KEY', None)
if search_key is not None:
ie_md += f'; "{ie.SEARCH_KEY}:" prefix'
if not ie.working():
ie_md += ' (Currently broken)'
yield ie_md
ies = sorted(yt_dlp.gen_extractors(), key=lambda i: i.IE_NAME.lower())
out = '# Supported sites\n' + ''.join(
' - ' + md + '\n'
for md in gen_ies_md(ies))
with io.open(outfile, 'w', encoding='utf-8') as outf:
outf.write(out)
if __name__ == '__main__':

devscripts/posix-locale.sh Executable file

@ -0,0 +1,6 @@
# source this file in your shell to get a POSIX locale (which will break many programs, but that's kind of the point)
export LC_ALL=POSIX
export LANG=POSIX
export LANGUAGE=POSIX

@ -1,22 +1,11 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import io
import optparse
import os.path
import re
from devscripts.utils import (
compose_functions,
get_filename_args,
read_file,
write_file,
)
ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
README_FILE = os.path.join(ROOT_DIR, 'README.md')
@ -24,7 +13,7 @@
# NAME
yt\-dlp \- A feature\-rich command\-line audio/video downloader
yt\-dlp \- A youtube-dl fork with additional features and patches
# SYNOPSIS
@ -35,6 +24,25 @@
'''
def main():
parser = optparse.OptionParser(usage='%prog OUTFILE.md')
options, args = parser.parse_args()
if len(args) != 1:
parser.error('Expected an output filename')
outfile, = args
with io.open(README_FILE, encoding='utf-8') as f:
readme = f.read()
readme = filter_excluded_sections(readme)
readme = move_sections(readme)
readme = filter_options(readme)
with io.open(outfile, 'w', encoding='utf-8') as outf:
outf.write(PREFIX + readme)
def filter_excluded_sections(readme):
EXCLUDED_SECTION_BEGIN_STRING = re.escape('<!-- MANPAGE: BEGIN EXCLUDED SECTION -->')
EXCLUDED_SECTION_END_STRING = re.escape('<!-- MANPAGE: END EXCLUDED SECTION -->')
@ -43,27 +51,6 @@ def filter_excluded_sections(readme):
'', readme)
def _convert_code_blocks(readme):
current_code_block = None
for line in readme.splitlines(True):
if current_code_block:
if line == current_code_block:
current_code_block = None
yield '\n'
else:
yield f' {line}'
elif line.startswith('```'):
current_code_block = line.count('`') * '`' + '\n'
yield '\n'
else:
yield line
def convert_code_blocks(readme):
return ''.join(_convert_code_blocks(readme))
def move_sections(readme):
MOVE_TAG_TEMPLATE = '<!-- MANPAGE: MOVE "%s" SECTION HERE -->'
sections = re.findall(r'(?m)^%s$' % (
@ -86,35 +73,26 @@ def move_sections(readme):
def filter_options(readme):
section = re.search(r'(?sm)^# USAGE AND OPTIONS\n.+?(?=^# )', readme).group(0)
section_new = section.replace('*', R'\*')
options = '# OPTIONS\n'
for line in section_new.split('\n')[1:]:
mobj = re.fullmatch(r'''(?x)
\s{4}(?P<opt>-(?:,\s|[^\s])+)
(?:\s(?P<meta>(?:[^\s]|\s(?!\s))+))?
(\s{2,}(?P<desc>.+))?
''', line)
if not mobj:
options += f'{line.lstrip()}\n'
continue
option, metavar, description = mobj.group('opt', 'meta', 'desc')
for line in section.split('\n')[1:]:
if line.lstrip().startswith('-'):
split = re.split(r'\s{2,}', line.lstrip())
# Description string may start with `-` as well. If there is
# only one piece then it's a description bit, not an option.
if len(split) > 1:
option, description = split
split_option = option.split(' ')
# Pandoc's definition_lists. See http://pandoc.org/README.html
option = f'{option} *{metavar}*' if metavar else option
description = f'{description}\n' if description else ''
options += f'\n{option}\n: {description}'
continue
if not split_option[-1].startswith('-'): # metavar
option = ' '.join(split_option[:-1] + [f'*{split_option[-1]}*'])
# Pandoc's definition_lists. See http://pandoc.org/README.html
options += f'\n{option}\n: {description}\n'
continue
options += line.lstrip() + '\n'
return readme.replace(section, options, 1)
TRANSFORM = compose_functions(filter_excluded_sections, convert_code_blocks, move_sections, filter_options)
def main():
write_file(get_filename_args(), PREFIX + TRANSFORM(read_file(README_FILE)))
if __name__ == '__main__':
main()
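For reference, filter_options() above turns each option row of the README into a pandoc definition-list entry of roughly this form (illustrative entry):

-f, --format *FORMAT*
: Video format code, see "FORMAT SELECTION" for more details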

devscripts/release.sh Executable file

@ -0,0 +1,143 @@
# Unused
#!/bin/bash
# IMPORTANT: the following assumptions are made
# * the GH repo is on the origin remote
# * the gh-pages branch is named so locally
# * the git config user.signingkey is properly set
# You will need
# pip install coverage nose rsa wheel
# TODO
# release notes
# make hash on local files
set -e
skip_tests=true
gpg_sign_commits=""
buildserver='localhost:8142'
while true
do
case "$1" in
--run-tests)
skip_tests=false
shift
;;
--gpg-sign-commits|-S)
gpg_sign_commits="-S"
shift
;;
--buildserver)
buildserver="$2"
shift 2
;;
--*)
echo "ERROR: unknown option $1"
exit 1
;;
*)
break
;;
esac
done
if [ -z "$1" ]; then echo "ERROR: specify version number like this: $0 1994.09.06"; exit 1; fi
version="$1"
major_version=$(echo "$version" | sed -n 's#^\([0-9]*\.[0-9]*\.[0-9]*\).*#\1#p')
if test "$major_version" '!=' "$(date '+%Y.%m.%d')"; then
echo "$version does not start with today's date!"
exit 1
fi
if [ ! -z "`git tag | grep "$version"`" ]; then echo 'ERROR: version already present'; exit 1; fi
if [ ! -z "`git status --porcelain | grep -v CHANGELOG`" ]; then echo 'ERROR: the working directory is not clean; commit or stash changes'; exit 1; fi
useless_files=$(find yt_dlp -type f -not -name '*.py')
if [ ! -z "$useless_files" ]; then echo "ERROR: Non-.py files in yt_dlp: $useless_files"; exit 1; fi
if [ ! -f "updates_key.pem" ]; then echo 'ERROR: updates_key.pem missing'; exit 1; fi
if ! type pandoc >/dev/null 2>/dev/null; then echo 'ERROR: pandoc is missing'; exit 1; fi
if ! python3 -c 'import rsa' 2>/dev/null; then echo 'ERROR: python3-rsa is missing'; exit 1; fi
if ! python3 -c 'import wheel' 2>/dev/null; then echo 'ERROR: wheel is missing'; exit 1; fi
read -p "Is Changelog up to date? (y/n) " -n 1
if [[ ! $REPLY =~ ^[Yy]$ ]]; then exit 1; fi
/bin/echo -e "\n### First of all, testing..."
make clean
if $skip_tests ; then
echo 'SKIPPING TESTS'
else
nosetests --verbose --with-coverage --cover-package=yt_dlp --cover-html test --stop || exit 1
fi
/bin/echo -e "\n### Changing version in version.py..."
sed -i "s/__version__ = '.*'/__version__ = '$version'/" yt_dlp/version.py
/bin/echo -e "\n### Changing version in Changelog..."
sed -i "s/<unreleased>/$version/" Changelog.md
/bin/echo -e "\n### Committing documentation, templates and yt_dlp/version.py..."
make README.md CONTRIBUTING.md issuetemplates supportedsites
git add README.md CONTRIBUTING.md .github/ISSUE_TEMPLATE/1_broken_site.md .github/ISSUE_TEMPLATE/2_site_support_request.md .github/ISSUE_TEMPLATE/3_site_feature_request.md .github/ISSUE_TEMPLATE/4_bug_report.md .github/ISSUE_TEMPLATE/5_feature_request.md .github/ISSUE_TEMPLATE/6_question.md docs/supportedsites.md yt_dlp/version.py Changelog.md
git commit $gpg_sign_commits -m "release $version"
/bin/echo -e "\n### Now tagging, signing and pushing..."
git tag -s -m "Release $version" "$version"
git show "$version"
read -p "Is it good, can I push? (y/n) " -n 1
if [[ ! $REPLY =~ ^[Yy]$ ]]; then exit 1; fi
echo
MASTER=$(git rev-parse --abbrev-ref HEAD)
git push origin $MASTER:master
git push origin "$version"
/bin/echo -e "\n### OK, now it is time to build the binaries..."
REV=$(git rev-parse HEAD)
make yt-dlp yt-dlp.tar.gz
read -p "VM running? (y/n) " -n 1
wget "http://$buildserver/build/ytdl-org/youtube-dl/yt-dlp.exe?rev=$REV" -O yt-dlp.exe
mkdir -p "build/$version"
mv yt-dlp yt-dlp.exe "build/$version"
mv yt-dlp.tar.gz "build/$version/yt-dlp-$version.tar.gz"
RELEASE_FILES="yt-dlp yt-dlp.exe yt-dlp-$version.tar.gz"
(cd build/$version/ && md5sum $RELEASE_FILES > MD5SUMS)
(cd build/$version/ && sha1sum $RELEASE_FILES > SHA1SUMS)
(cd build/$version/ && sha256sum $RELEASE_FILES > SHA2-256SUMS)
(cd build/$version/ && sha512sum $RELEASE_FILES > SHA2-512SUMS)
/bin/echo -e "\n### Signing and uploading the new binaries to GitHub..."
for f in $RELEASE_FILES; do gpg --passphrase-repeat 5 --detach-sig "build/$version/$f"; done
ROOT=$(pwd)
python devscripts/create-github-release.py Changelog.md $version "$ROOT/build/$version"
ssh ytdl@yt-dl.org "sh html/update_latest.sh $version"
/bin/echo -e "\n### Now switching to gh-pages..."
git clone --branch gh-pages --single-branch . build/gh-pages
(
set -e
ORIGIN_URL=$(git config --get remote.origin.url)
cd build/gh-pages
"$ROOT/devscripts/gh-pages/add-version.py" $version
"$ROOT/devscripts/gh-pages/update-feed.py"
"$ROOT/devscripts/gh-pages/sign-versions.py" < "$ROOT/updates_key.pem"
"$ROOT/devscripts/gh-pages/generate-download.py"
"$ROOT/devscripts/gh-pages/update-copyright.py"
"$ROOT/devscripts/gh-pages/update-sites.py"
git add *.html *.html.in update
git commit $gpg_sign_commits -m "release $version"
git push "$ROOT" gh-pages
git push "$ORIGIN_URL" gh-pages
)
rm -rf build
make pypi-files
echo "Uploading to PyPi ..."
python setup.py sdist bdist_wheel upload
make clean
/bin/echo -e "\n### DONE!"

devscripts/run_tests.bat Normal file

@ -0,0 +1,16 @@
@setlocal
@echo off
cd /d %~dp0..
if ["%~1"]==[""] (
set "test_set="test""
) else if ["%~1"]==["core"] (
set "test_set="-m not download""
) else if ["%~1"]==["download"] (
set "test_set="-m "download""
) else (
echo.Invalid test type "%~1". Use "core" ^| "download"
exit /b 1
)
pytest %test_set%

@ -1,75 +0,0 @@
#!/usr/bin/env python3
import argparse
import functools
import os
import re
import shlex
import subprocess
import sys
from pathlib import Path
fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')
def parse_args():
parser = argparse.ArgumentParser(description='Run selected yt-dlp tests')
parser.add_argument(
'test', help='an extractor test, or one of "core" or "download"', nargs='*')
parser.add_argument(
'-k', help='run a test matching EXPRESSION. Same as "pytest -k"', metavar='EXPRESSION')
parser.add_argument(
'--pytest-args', help='arguments to passthrough to pytest')
return parser.parse_args()
def run_tests(*tests, pattern=None, ci=False):
run_core = 'core' in tests or (not pattern and not tests)
run_download = 'download' in tests
tests = list(map(fix_test_name, tests))
pytest_args = args.pytest_args or os.getenv('HATCH_TEST_ARGS', '')
arguments = ['pytest', '-Werror', '--tb=short', *shlex.split(pytest_args)]
if ci:
arguments.append('--color=yes')
if pattern:
arguments.extend(['-k', pattern])
if run_core:
arguments.extend(['-m', 'not download'])
elif run_download:
arguments.extend(['-m', 'download'])
else:
arguments.extend(
f'test/test_download.py::TestDownload::test_{test}' for test in tests)
print(f'Running {arguments}', flush=True)
try:
return subprocess.call(arguments)
except FileNotFoundError:
pass
arguments = [sys.executable, '-Werror', '-m', 'unittest']
if pattern:
arguments.extend(['-k', pattern])
if run_core:
print('"pytest" needs to be installed to run core tests', file=sys.stderr, flush=True)
return 1
elif run_download:
arguments.append('test.test_download')
else:
arguments.extend(
f'test.test_download.TestDownload.test_{test}' for test in tests)
print(f'Running {arguments}', flush=True)
return subprocess.call(arguments)
if __name__ == '__main__':
try:
args = parse_args()
os.chdir(Path(__file__).parent.parent)
sys.exit(run_tests(*args.test, pattern=args.k, ci=bool(os.getenv('CI'))))
except KeyboardInterrupt:
pass

devscripts/run_tests.sh Executable file

@ -0,0 +1,14 @@
#!/bin/sh
if [ -z $1 ]; then
test_set='test'
elif [ $1 = 'core' ]; then
test_set="-m not download"
elif [ $1 = 'download' ]; then
test_set="-m download"
else
echo 'Invalid test type "'$1'". Use "core" | "download"'
exit 1
fi
python3 -m pytest "$test_set"

@ -1,36 +0,0 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import argparse
import functools
import re
from devscripts.utils import compose_functions, read_file, write_file
VERSION_FILE = 'yt_dlp/version.py'
def parse_options():
parser = argparse.ArgumentParser(description='Set the build variant of the package')
parser.add_argument('variant', help='Name of the variant')
parser.add_argument('-M', '--update-message', default=None, help='Message to show in -U')
return parser.parse_args()
def property_setter(name, value):
return functools.partial(re.sub, rf'(?m)^{name}\s*=\s*.+$', f'{name} = {value!r}')
opts = parse_options()
transform = compose_functions(
property_setter('VARIANT', opts.variant),
property_setter('UPDATE_HINT', opts.update_message),
)
write_file(VERSION_FILE, transform(read_file(VERSION_FILE)))

@ -0,0 +1,49 @@
# Unused
#!/usr/bin/env python3
from __future__ import unicode_literals
import itertools
import json
import os
import re
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from yt_dlp.compat import (
compat_print,
compat_urllib_request,
)
from yt_dlp.utils import format_bytes
def format_size(bytes):
return '%s (%d bytes)' % (format_bytes(bytes), bytes)
total_bytes = 0
for page in itertools.count(1):
releases = json.loads(compat_urllib_request.urlopen(
'https://api.github.com/repos/ytdl-org/youtube-dl/releases?page=%s' % page
).read().decode('utf-8'))
if not releases:
break
for release in releases:
compat_print(release['name'])
for asset in release['assets']:
asset_name = asset['name']
total_bytes += asset['download_count'] * asset['size']
if all(not re.match(p, asset_name) for p in (
r'^yt-dlp$',
r'^yt-dlp-\d{4}\.\d{2}\.\d{2}(?:\.\d+)?\.tar\.gz$',
r'^yt-dlp\.exe$')):
continue
compat_print(
' %s size: %s downloads: %d'
% (asset_name, format_size(asset['size']), asset['download_count']))
compat_print('total downloads traffic: %s' % format_size(total_bytes))

@ -1,189 +0,0 @@
#!/usr/bin/env python3
"""
Simple parser for spec compliant toml files
A simple toml parser for files that comply with the spec.
Should only be used to parse `pyproject.toml` for `install_deps.py`.
IMPORTANT: INVALID FILES OR MULTILINE STRINGS ARE NOT SUPPORTED!
"""
from __future__ import annotations
import datetime as dt
import json
import re
WS = r'(?:[\ \t]*)'
STRING_RE = re.compile(r'"(?:\\.|[^\\"\n])*"|\'[^\'\n]*\'')
SINGLE_KEY_RE = re.compile(rf'{STRING_RE.pattern}|[A-Za-z0-9_-]+')
KEY_RE = re.compile(rf'{WS}(?:{SINGLE_KEY_RE.pattern}){WS}(?:\.{WS}(?:{SINGLE_KEY_RE.pattern}){WS})*')
EQUALS_RE = re.compile(rf'={WS}')
WS_RE = re.compile(WS)
_SUBTABLE = rf'(?P<subtable>^\[(?P<is_list>\[)?(?P<path>{KEY_RE.pattern})\]\]?)'
EXPRESSION_RE = re.compile(rf'^(?:{_SUBTABLE}|{KEY_RE.pattern}=)', re.MULTILINE)
LIST_WS_RE = re.compile(rf'{WS}((#[^\n]*)?\n{WS})*')
LEFTOVER_VALUE_RE = re.compile(r'[^,}\]\t\n#]+')
def parse_key(value: str):
for match in SINGLE_KEY_RE.finditer(value):
if match[0][0] == '"':
yield json.loads(match[0])
elif match[0][0] == '\'':
yield match[0][1:-1]
else:
yield match[0]
def get_target(root: dict, paths: list[str], is_list=False):
target = root
for index, key in enumerate(paths, 1):
use_list = is_list and index == len(paths)
result = target.get(key)
if result is None:
result = [] if use_list else {}
target[key] = result
if isinstance(result, dict):
target = result
elif use_list:
target = {}
result.append(target)
else:
target = result[-1]
assert isinstance(target, dict)
return target
def parse_enclosed(data: str, index: int, end: str, ws_re: re.Pattern):
index += 1
if match := ws_re.match(data, index):
index = match.end()
while data[index] != end:
index = yield True, index
if match := ws_re.match(data, index):
index = match.end()
if data[index] == ',':
index += 1
if match := ws_re.match(data, index):
index = match.end()
assert data[index] == end
yield False, index + 1
def parse_value(data: str, index: int):
if data[index] == '[':
result = []
indices = parse_enclosed(data, index, ']', LIST_WS_RE)
valid, index = next(indices)
while valid:
index, value = parse_value(data, index)
result.append(value)
valid, index = indices.send(index)
return index, result
if data[index] == '{':
result = {}
indices = parse_enclosed(data, index, '}', WS_RE)
valid, index = next(indices)
while valid:
valid, index = indices.send(parse_kv_pair(data, index, result))
return index, result
if match := STRING_RE.match(data, index):
return match.end(), json.loads(match[0]) if match[0][0] == '"' else match[0][1:-1]
match = LEFTOVER_VALUE_RE.match(data, index)
assert match
value = match[0].strip()
for func in [
int,
float,
dt.time.fromisoformat,
dt.date.fromisoformat,
dt.datetime.fromisoformat,
{'true': True, 'false': False}.get,
]:
try:
value = func(value)
break
except Exception:
pass
return match.end(), value
def parse_kv_pair(data: str, index: int, target: dict):
match = KEY_RE.match(data, index)
if not match:
return None
*keys, key = parse_key(match[0])
match = EQUALS_RE.match(data, match.end())
assert match
index = match.end()
index, value = parse_value(data, index)
get_target(target, keys)[key] = value
return index
def parse_toml(data: str):
root = {}
target = root
index = 0
while True:
match = EXPRESSION_RE.search(data, index)
if not match:
break
if match.group('subtable'):
index = match.end()
path, is_list = match.group('path', 'is_list')
target = get_target(root, list(parse_key(path)), bool(is_list))
continue
index = parse_kv_pair(data, match.start(), target)
assert index is not None
return root
def main():
import argparse
from pathlib import Path
parser = argparse.ArgumentParser()
parser.add_argument('infile', type=Path, help='The TOML file to read as input')
args = parser.parse_args()
with args.infile.open('r', encoding='utf-8') as file:
data = file.read()
def default(obj):
if isinstance(obj, (dt.date, dt.time, dt.datetime)):
return obj.isoformat()
print(json.dumps(parse_toml(data), default=default))
if __name__ == '__main__':
main()
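A minimal usage sketch for the parser above (assuming the script is importable as devscripts.tomlparse from the repository root; the TOML snippet is made up):

from devscripts.tomlparse import parse_toml

SAMPLE = '''
[project]
name = "yt-dlp"
dependencies = ["mutagen", "websockets"]
'''

data = parse_toml(SAMPLE)
assert data['project']['name'] == 'yt-dlp'
print(data['project']['dependencies'])  # ['mutagen', 'websockets']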

@ -0,0 +1,37 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
import json
import os
import re
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from yt_dlp.compat import compat_urllib_request
# usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
# version can be either 0-aligned (yt-dlp version) or normalized (PyPI version)
filename, version = sys.argv[1:]
normalized_version = '.'.join(str(int(x)) for x in version.split('.'))
pypi_release = json.loads(compat_urllib_request.urlopen(
'https://pypi.org/pypi/yt-dlp/%s/json' % normalized_version
).read().decode('utf-8'))
tarball_file = next(x for x in pypi_release['urls'] if x['filename'].endswith('.tar.gz'))
sha256sum = tarball_file['digests']['sha256']
url = tarball_file['url']
with open(filename, 'r') as r:
formulae_text = r.read()
formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text)
formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text)
with open(filename, 'w') as w:
w.write(formulae_text)

@ -1,82 +1,42 @@
#!/usr/bin/env python3
# Allow direct execution
import os
from datetime import datetime
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import subprocess
import argparse
import contextlib
import datetime as dt
import sys
with open('yt_dlp/version.py', 'rt') as f:
exec(compile(f.read(), 'yt_dlp/version.py', 'exec'))
old_version = locals()['__version__']
from devscripts.utils import read_version, run_process, write_file
old_version_list = old_version.split('.')
old_ver = '.'.join(old_version_list[:3])
old_rev = old_version_list[3] if len(old_version_list) > 3 else ''
def get_new_version(version, revision):
if not version:
version = dt.datetime.now(dt.timezone.utc).strftime('%Y.%m.%d')
ver = datetime.utcnow().strftime("%Y.%m.%d")
if revision:
assert revision.isdecimal(), 'Revision must be a number'
else:
old_version = read_version().split('.')
if version.split('.') == old_version[:3]:
revision = str(int(([*old_version, 0])[3]) + 1)
rev = (sys.argv[1:] or [''])[0] # Use first argument, if present as revision number
if not rev:
rev = str(int(old_rev or 0) + 1) if old_ver == ver else ''
return f'{version}.{revision}' if revision else version
VERSION = '.'.join((ver, rev)) if rev else ver
try:
sp = subprocess.Popen(['git', 'rev-parse', '--short', 'HEAD'], stdout=subprocess.PIPE)
GIT_HEAD = sp.communicate()[0].decode().strip() or None
except Exception:
GIT_HEAD = None
def get_git_head():
with contextlib.suppress(Exception):
return run_process('git', 'rev-parse', 'HEAD').stdout.strip()
VERSION_TEMPLATE = '''\
VERSION_FILE = f'''\
# Autogenerated by devscripts/update-version.py
__version__ = {version!r}
__version__ = {VERSION!r}
RELEASE_GIT_HEAD = {git_head!r}
VARIANT = None
UPDATE_HINT = None
CHANNEL = {channel!r}
ORIGIN = {origin!r}
_pkg_version = {package_version!r}
RELEASE_GIT_HEAD = {GIT_HEAD!r}
'''
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Update the version.py file')
parser.add_argument(
'-c', '--channel', default='stable',
help='Select update channel (default: %(default)s)')
parser.add_argument(
'-r', '--origin', default='local',
help='Select origin/repository (default: %(default)s)')
parser.add_argument(
'-s', '--suffix', default='',
help='Add an alphanumeric suffix to the package version, e.g. "dev"')
parser.add_argument(
'-o', '--output', default='yt_dlp/version.py',
help='The output file to write to (default: %(default)s)')
parser.add_argument(
'version', nargs='?', default=None,
help='A version or revision to use instead of generating one')
args = parser.parse_args()
with open('yt_dlp/version.py', 'wt') as f:
f.write(VERSION_FILE)
git_head = get_git_head()
version = (
args.version if args.version and '.' in args.version
else get_new_version(None, args.version))
write_file(args.output, VERSION_TEMPLATE.format(
version=version, git_head=git_head, channel=args.channel, origin=args.origin,
package_version=f'{version}{args.suffix}'))
print(f'version={version} ({args.channel}), head={git_head}')
print('::set-output name=ytdlp_version::' + VERSION)
print(f'\nVersion = {VERSION}, Git HEAD = {GIT_HEAD}')
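To make the revision logic above concrete, assume a hypothetical checkout whose yt_dlp/version.py currently reads __version__ = '2023.06.21':

get_new_version('2023.06.21', None)  # -> '2023.06.21.1' (same date as the current version: bump the revision)
get_new_version('2023.06.21', '5')   # -> '2023.06.21.5' (explicit revision)
get_new_version('2023.06.22', None)  # -> '2023.06.22'   (new date: no revision suffix)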

@ -1,26 +0,0 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from pathlib import Path
from devscripts.make_changelog import create_changelog, create_parser
from devscripts.utils import read_file, read_version, write_file
# Always run after devscripts/update-version.py, and run before `make doc|pypi-files|tar|all`
if __name__ == '__main__':
parser = create_parser()
parser.description = 'Update an existing changelog file with an entry for a new release'
parser.add_argument(
'--changelog-path', type=Path, default=Path(__file__).parent.parent / 'Changelog.md',
help='path to the Changelog file')
args = parser.parse_args()
new_entry = create_changelog(args)
header, sep, changelog = read_file(args.changelog_path).partition('\n### ')
write_file(args.changelog_path, f'{header}{sep}{read_version()}\n{new_entry}\n{sep}{changelog}')

@ -1,47 +0,0 @@
import argparse
import functools
import subprocess
def read_file(fname):
with open(fname, encoding='utf-8') as f:
return f.read()
def write_file(fname, content, mode='w'):
with open(fname, mode, encoding='utf-8') as f:
return f.write(content)
def read_version(fname='yt_dlp/version.py', varname='__version__'):
"""Get the version without importing the package"""
items = {}
exec(compile(read_file(fname), fname, 'exec'), items)
return items[varname]
def get_filename_args(has_infile=False, default_outfile=None):
parser = argparse.ArgumentParser()
if has_infile:
parser.add_argument('infile', help='Input file')
kwargs = {'nargs': '?', 'default': default_outfile} if default_outfile else {}
parser.add_argument('outfile', **kwargs, help='Output file')
opts = parser.parse_args()
if has_infile:
return opts.infile, opts.outfile
return opts.outfile
def compose_functions(*functions):
return lambda x: functools.reduce(lambda y, f: f(y), functions, x)
def run_process(*args, **kwargs):
kwargs.setdefault('text', True)
kwargs.setdefault('check', True)
kwargs.setdefault('capture_output', True)
if kwargs['text']:
kwargs.setdefault('encoding', 'utf-8')
kwargs.setdefault('errors', 'replace')
return subprocess.run(args, **kwargs)
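A short sketch of how these helpers are meant to be combined (the toy transform is illustrative; read_version's result naturally depends on the checkout):

from devscripts.utils import compose_functions, read_version

# compose_functions chains its arguments left to right
shout = compose_functions(str.strip, str.upper)
assert shout('  yt-dlp  ') == 'YT-DLP'

# read_version executes yt_dlp/version.py in an isolated namespace,
# so the package itself is never imported during a build
print(read_version())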

devscripts/wine-py2exe.sh Executable file

@ -0,0 +1,58 @@
# UNUSED
#!/bin/bash
# Run with a setup.py that works in the current directory as its parameter,
# e.g. no os.chdir()
# It will run twice, the first time will crash
set -e
SCRIPT_DIR="$( cd "$( dirname "$0" )" && pwd )"
if [ ! -d wine-py2exe ]; then
sudo apt-get install wine1.3 axel bsdiff
mkdir wine-py2exe
cd wine-py2exe
export WINEPREFIX=`pwd`
axel -a "http://www.python.org/ftp/python/2.7/python-2.7.msi"
axel -a "http://downloads.sourceforge.net/project/py2exe/py2exe/0.6.9/py2exe-0.6.9.win32-py2.7.exe"
#axel -a "http://winetricks.org/winetricks"
# http://appdb.winehq.org/objectManager.php?sClass=version&iId=21957
echo "Follow python setup on screen"
wine msiexec /i python-2.7.msi
echo "Follow py2exe setup on screen"
wine py2exe-0.6.9.win32-py2.7.exe
#echo "Follow Microsoft Visual C++ 2008 Redistributable Package setup on screen"
#bash winetricks vcrun2008
rm py2exe-0.6.9.win32-py2.7.exe
rm python-2.7.msi
#rm winetricks
# http://bugs.winehq.org/show_bug.cgi?id=3591
mv drive_c/Python27/Lib/site-packages/py2exe/run.exe drive_c/Python27/Lib/site-packages/py2exe/run.exe.backup
bspatch drive_c/Python27/Lib/site-packages/py2exe/run.exe.backup drive_c/Python27/Lib/site-packages/py2exe/run.exe "$SCRIPT_DIR/SizeOfImage.patch"
mv drive_c/Python27/Lib/site-packages/py2exe/run_w.exe drive_c/Python27/Lib/site-packages/py2exe/run_w.exe.backup
bspatch drive_c/Python27/Lib/site-packages/py2exe/run_w.exe.backup drive_c/Python27/Lib/site-packages/py2exe/run_w.exe "$SCRIPT_DIR/SizeOfImage_w.patch"
cd -
else
export WINEPREFIX="$( cd wine-py2exe && pwd )"
fi
wine "C:\\Python27\\python.exe" "$1" py2exe > "py2exe.log" 2>&1 || true
echo '# Copying python27.dll' >> "py2exe.log"
cp "$WINEPREFIX/drive_c/windows/system32/python27.dll" build/bdist.win32/winexe/bundle-2.7/
wine "C:\\Python27\\python.exe" "$1" py2exe >> "py2exe.log" 2>&1

@ -1,23 +1,22 @@
#!/usr/bin/env python3
from __future__ import unicode_literals
# Allow direct execution
import os
from os.path import dirname as dirn
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
sys.path.insert(0, dirn(dirn((os.path.abspath(__file__)))))
import yt_dlp
ZSH_COMPLETION_FILE = 'completions/zsh/_yt-dlp'
ZSH_COMPLETION_TEMPLATE = 'devscripts/zsh-completion.in'
ZSH_COMPLETION_FILE = "completions/zsh/_yt-dlp"
ZSH_COMPLETION_TEMPLATE = "devscripts/zsh-completion.in"
def build_completion(opt_parser):
opts = [opt for group in opt_parser.option_groups
for opt in group.option_list]
opts_file = [opt for opt in opts if opt.metavar == 'FILE']
opts_dir = [opt for opt in opts if opt.metavar == 'DIR']
opts_file = [opt for opt in opts if opt.metavar == "FILE"]
opts_dir = [opt for opt in opts if opt.metavar == "DIR"]
fileopts = []
for opt in opts_file:
@ -38,13 +37,13 @@ def build_completion(opt_parser):
with open(ZSH_COMPLETION_TEMPLATE) as f:
template = f.read()
template = template.replace('{{fileopts}}', '|'.join(fileopts))
template = template.replace('{{diropts}}', '|'.join(diropts))
template = template.replace('{{flags}}', ' '.join(flags))
template = template.replace("{{fileopts}}", "|".join(fileopts))
template = template.replace("{{diropts}}", "|".join(diropts))
template = template.replace("{{flags}}", " ".join(flags))
with open(ZSH_COMPLETION_FILE, 'w') as f:
with open(ZSH_COMPLETION_FILE, "w") as f:
f.write(template)
parser = yt_dlp.parseOpts(ignore_config_files=True)[0]
parser = yt_dlp.parseOpts()[0]
build_completion(parser)

docs/.gitignore vendored Normal file

@ -0,0 +1 @@
_build/

docs/Changelog.md Normal file

@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Changelog.md
```

docs/Collaborators.md Normal file

@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Collaborators.md
```

docs/Contributing.md Normal file

@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../Contributing.md
```

docs/LICENSE.md Normal file

@ -0,0 +1,6 @@
---
orphan: true
---
# LICENSE
```{include} ../LICENSE
```

docs/Makefile Normal file

@ -0,0 +1,177 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
clean:
rm -rf $(BUILDDIR)/*
html:
$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/yt-dlp.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/yt-dlp.qhc"
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/yt-dlp"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/yt-dlp"
@echo "# devhelp"
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

docs/README.md Normal file

@ -0,0 +1,2 @@
```{include} ../README.md
```

docs/conf.py Normal file

@ -0,0 +1,68 @@
# coding: utf-8
#
# yt-dlp documentation build configuration file
import sys
import os
# Allow importing yt-dlp
sys.path.insert(0, os.path.abspath('..'))
# -- General configuration ------------------------------------------------
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'myst_parser',
]
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The master toctree document.
master_doc = 'README'
# General information about the project.
project = u'yt-dlp'
author = u'yt-dlp'
copyright = u'UNLICENSE'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
from yt_dlp.version import __version__
version = __version__
# The full version, including alpha/beta/rc tags.
release = version
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build']
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Disable highlights
highlight_language = 'none'
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
# html_static_path = ['_static']
# Enable heading anchors
myst_heading_anchors = 4
# Suppress heading warnings
suppress_warnings = [
'myst.header',
]

docs/requirements.txt Normal file

@ -0,0 +1 @@
myst-parser

docs/supportedsites.md Normal file

@ -0,0 +1,5 @@
---
orphan: true
---
```{include} ../supportedsites.md
```

docs/ytdlp_plugins.md Normal file

@ -0,0 +1,6 @@
---
orphan: true
---
# ytdlp_plugins
See [https://github.com/yt-dlp/yt-dlp/tree/master/ytdlp_plugins](https://github.com/yt-dlp/yt-dlp/tree/master/ytdlp_plugins).

@ -1,29 +0,0 @@
-----BEGIN PGP PUBLIC KEY BLOCK-----
mQINBGP78C4BEAD0rF9zjGPAt0thlt5C1ebzccAVX7Nb1v+eqQjk+WEZdTETVCg3
WAM5ngArlHdm/fZqzUgO+pAYrB60GKeg7ffUDf+S0XFKEZdeRLYeAaqqKhSibVal
DjvOBOztu3W607HLETQAqA7wTPuIt2WqmpL60NIcyr27LxqmgdN3mNvZ2iLO+bP0
nKR/C+PgE9H4ytywDa12zMx6PmZCnVOOOu6XZEFmdUxxdQ9fFDqd9LcBKY2LDOcS
Yo1saY0YWiZWHtzVoZu1kOzjnS5Fjq/yBHJLImDH7pNxHm7s/PnaurpmQFtDFruk
t+2lhDnpKUmGr/I/3IHqH/X+9nPoS4uiqQ5HpblB8BK+4WfpaiEg75LnvuOPfZIP
KYyXa/0A7QojMwgOrD88ozT+VCkKkkJ+ijXZ7gHNjmcBaUdKK7fDIEOYI63Lyc6Q
WkGQTigFffSUXWHDCO9aXNhP3ejqFWgGMtCUsrbkcJkWuWY7q5ARy/05HbSM3K4D
U9eqtnxmiV1WQ8nXuI9JgJQRvh5PTkny5LtxqzcmqvWO9TjHBbrs14BPEO9fcXxK
L/CFBbzXDSvvAgArdqqlMoncQ/yicTlfL6qzJ8EKFiqW14QMTdAn6SuuZTodXCTi
InwoT7WjjuFPKKdvfH1GP4bnqdzTnzLxCSDIEtfyfPsIX+9GI7Jkk/zZjQARAQAB
tDdTaW1vbiBTYXdpY2tpICh5dC1kbHAgc2lnbmluZyBrZXkpIDxjb250YWN0QGdy
dWI0ay54eXo+iQJOBBMBCgA4FiEErAy75oSNaoc0ZK9OV89lkztadYEFAmP78C4C
GwMFCwkIBwIGFQoJCAsCBBYCAwECHgECF4AACgkQV89lkztadYEVqQ//cW7TxhXg
7Xbh2EZQzXml0egn6j8QaV9KzGragMiShrlvTO2zXfLXqyizrFP4AspgjSn/4NrI
8mluom+Yi+qr7DXT4BjQqIM9y3AjwZPdywe912Lxcw52NNoPZCm24I9T7ySc8lmR
FQvZC0w4H/VTNj/2lgJ1dwMflpwvNRiWa5YzcFGlCUeDIPskLx9++AJE+xwU3LYm
jQQsPBqpHHiTBEJzMLl+rfd9Fg4N+QNzpFkTDW3EPerLuvJniSBBwZthqxeAtw4M
UiAXh6JvCc2hJkKCoygRfM281MeolvmsGNyQm+axlB0vyldiPP6BnaRgZlx+l6MU
cPqgHblb7RW5j9lfr6OYL7SceBIHNv0CFrt1OnkGo/tVMwcs8LH3Ae4a7UJlIceL
V54aRxSsZU7w4iX+PB79BWkEsQzwKrUuJVOeL4UDwWajp75OFaUqbS/slDDVXvK5
OIeuth3mA/adjdvgjPxhRQjA3l69rRWIJDrqBSHldmRsnX6cvXTDy8wSXZgy51lP
m4IVLHnCy9m4SaGGoAsfTZS0cC9FgjUIyTyrq9M67wOMpUxnuB0aRZgJE1DsI23E
qdvcSNVlO+39xM/KPWUEh6b83wMn88QeW+DCVGWACQq5N3YdPnAJa50617fGbY6I
gXIoRHXkDqe23PZ/jURYCv0sjVtjPoVC+bg=
=bJkn
-----END PGP PUBLIC KEY BLOCK-----

bundle/pyinstaller.py → pyinst.py Executable file → Normal file

@ -1,31 +1,34 @@
#!/usr/bin/env python3
# Allow direct execution
# coding: utf-8
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import platform
import sys
from PyInstaller.utils.hooks import collect_submodules
from PyInstaller.__main__ import run as run_pyinstaller
from devscripts.utils import read_version
OS_NAME = platform.system()
if OS_NAME == 'Windows':
from PyInstaller.utils.win32.versioninfo import (
VarStruct, VarFileInfo, StringStruct, StringTable,
StringFileInfo, FixedFileInfo, VSVersionInfo, SetVersion,
)
elif OS_NAME == 'Darwin':
pass
else:
raise Exception(f'{OS_NAME} is not supported')
OS_NAME, MACHINE, ARCH = sys.platform, platform.machine().lower(), platform.architecture()[0][:2]
if MACHINE in ('x86', 'x86_64', 'amd64', 'i386', 'i686'):
MACHINE = 'x86' if ARCH == '32' else ''
ARCH = platform.architecture()[0][:2]
def main():
opts, version = parse_options(), read_version()
opts = parse_options()
version = read_version()
onedir = '--onedir' in opts or '-D' in opts
if not onedir and '-F' not in opts and '--onefile' not in opts:
opts.append('--onefile')
suffix = '_macos' if OS_NAME == 'Darwin' else '_x86' if ARCH == '32' else ''
final_file = 'dist/%syt-dlp%s%s' % (
'yt-dlp/' if '--onedir' in opts else '', suffix, '.exe' if OS_NAME == 'Windows' else '')
name, final_file = exe(onedir)
print(f'Building yt-dlp v{version} for {OS_NAME} {platform.machine()} with options {opts}')
print(f'Building yt-dlp v{version} {ARCH}bit for {OS_NAME} with options {opts}')
print('Remember to update the version using "devscripts/update-version.py"')
if not os.path.isfile('yt_dlp/extractor/lazy_extractors.py'):
print('WARNING: Building without lazy_extractors. Run '
@ -33,43 +36,36 @@ def main():
print(f'Destination: {final_file}\n')
opts = [
f'--name={name}',
f'--name=yt-dlp{suffix}',
'--icon=devscripts/logo.ico',
'--upx-exclude=vcruntime140.dll',
'--noconfirm',
'--additional-hooks-dir=yt_dlp/__pyinstaller',
*dependency_options(),
*opts,
'yt_dlp/__main__.py',
]
print(f'Running PyInstaller with {opts}')
run_pyinstaller(opts)
import PyInstaller.__main__
PyInstaller.__main__.run(opts)
set_version_info(final_file, version)
def parse_options():
# Compatibility with older arguments
# Compatability with older arguments
opts = sys.argv[1:]
if opts[0:1] in (['32'], ['64']):
if ARCH != opts[0]:
raise Exception(f'{opts[0]}bit executable cannot be built on a {ARCH}bit system')
opts = opts[1:]
return opts
return opts or ['--onefile']
def exe(onedir):
"""@returns (name, path)"""
name = '_'.join(filter(None, (
'yt-dlp',
{'win32': '', 'darwin': 'macos'}.get(OS_NAME, OS_NAME),
MACHINE,
)))
return name, ''.join(filter(None, (
'dist/',
onedir and f'{name}/',
name,
OS_NAME == 'win32' and '.exe',
)))
def read_version():
exec(compile(open('yt_dlp/version.py').read(), 'yt_dlp/version.py', 'exec'))
return locals()['__version__']
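The (name, path) pairs returned by exe() above work out to values like the following (illustrative platforms; recall that MACHINE is blanked for 64-bit x86):

# (sys.platform, platform.machine(), onedir) -> (name, final path)
# ('win32',  'AMD64',   False) -> ('yt-dlp',               'dist/yt-dlp.exe')
# ('darwin', 'x86_64',  False) -> ('yt-dlp_macos',         'dist/yt-dlp_macos')
# ('linux',  'aarch64', True)  -> ('yt-dlp_linux_aarch64', 'dist/yt-dlp_linux_aarch64/yt-dlp_linux_aarch64')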
def version_to_list(version):
@ -77,29 +73,36 @@ def version_to_list(version):
return list(map(int, version_list)) + [0] * (4 - len(version_list))
def dependency_options():
dependencies = [pycryptodome_module(), 'mutagen'] + collect_submodules('websockets')
excluded_modules = ['test', 'ytdlp_plugins', 'youtube-dl', 'youtube-dlc']
yield from (f'--hidden-import={module}' for module in dependencies)
yield from (f'--exclude-module={module}' for module in excluded_modules)
def pycryptodome_module():
try:
import Cryptodome # noqa: F401
except ImportError:
try:
import Crypto # noqa: F401
print('WARNING: Using Crypto since Cryptodome is not available. '
'Install with: pip install pycryptodomex', file=sys.stderr)
return 'Crypto'
except ImportError:
pass
return 'Cryptodome'
def set_version_info(exe, version):
if OS_NAME == 'win32':
if OS_NAME == 'Windows':
windows_set_version(exe, version)
def windows_set_version(exe, version):
from PyInstaller.utils.win32.versioninfo import (
FixedFileInfo,
StringFileInfo,
StringStruct,
StringTable,
VarFileInfo,
VarStruct,
VSVersionInfo,
)
try:
from PyInstaller.utils.win32.versioninfo import SetVersion
except ImportError: # Pyinstaller >= 5.8
from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion
version_list = version_to_list(version)
suffix = MACHINE and f'_{MACHINE}'
suffix = '_x86' if ARCH == '32' else ''
SetVersion(exe, VSVersionInfo(
ffi=FixedFileInfo(
filevers=version_list,
@ -113,9 +116,9 @@ def windows_set_version(exe, version):
),
kids=[
StringFileInfo([StringTable('040904B0', [
StringStruct('Comments', f'yt-dlp{suffix} Command Line Interface'),
StringStruct('Comments', 'yt-dlp%s Command Line Interface.' % suffix),
StringStruct('CompanyName', 'https://github.com/yt-dlp'),
StringStruct('FileDescription', 'yt-dlp%s' % (MACHINE and f' ({MACHINE})')),
StringStruct('FileDescription', 'yt-dlp%s' % (' (32 Bit)' if ARCH == '32' else '')),
StringStruct('FileVersion', version),
StringStruct('InternalName', f'yt-dlp{suffix}'),
StringStruct('LegalCopyright', 'pukkandan.ytdlp@gmail.com | UNLICENSE'),
@ -123,8 +126,8 @@ def windows_set_version(exe, version):
StringStruct('ProductName', f'yt-dlp{suffix}'),
StringStruct(
'ProductVersion', f'{version}{suffix} on Python {platform.python_version()}'),
])]), VarFileInfo([VarStruct('Translation', [0, 1200])]),
],
])]), VarFileInfo([VarStruct('Translation', [0, 1200])])
]
))

@ -1,383 +0,0 @@
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "yt-dlp"
maintainers = [
{name = "pukkandan", email = "pukkandan.ytdlp@gmail.com"},
{name = "Grub4K", email = "contact@grub4k.xyz"},
{name = "bashonly", email = "bashonly@protonmail.com"},
{name = "coletdjnz", email = "coletdjnz@protonmail.com"},
]
description = "A feature-rich command-line audio/video downloader"
readme = "README.md"
requires-python = ">=3.8"
keywords = [
"youtube-dl",
"video-downloader",
"youtube-downloader",
"sponsorblock",
"youtube-dlc",
"yt-dlp",
]
license = {file = "LICENSE"}
classifiers = [
"Topic :: Multimedia :: Video",
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Programming Language :: Python",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: Implementation",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"License :: OSI Approved :: The Unlicense (Unlicense)",
"Operating System :: OS Independent",
]
dynamic = ["version"]
dependencies = [
"brotli; implementation_name=='cpython'",
"brotlicffi; implementation_name!='cpython'",
"certifi",
"mutagen",
"pycryptodomex",
"requests>=2.32.2,<3",
"urllib3>=1.26.17,<3",
"websockets>=12.0",
]
[project.optional-dependencies]
default = []
curl-cffi = ["curl-cffi==0.5.10; implementation_name=='cpython'"]
secretstorage = [
"cffi",
"secretstorage",
]
build = [
"build",
"hatchling",
"pip",
"setuptools",
"wheel",
]
dev = [
"pre-commit",
"yt-dlp[static-analysis]",
"yt-dlp[test]",
]
static-analysis = [
"autopep8~=2.0",
"ruff~=0.4.4",
]
test = [
"pytest~=8.1",
]
pyinstaller = [
"pyinstaller>=6.7.0", # for compat with setuptools>=70
]
py2exe = [
"py2exe>=0.12",
]
[project.urls]
Documentation = "https://github.com/yt-dlp/yt-dlp#readme"
Repository = "https://github.com/yt-dlp/yt-dlp"
Tracker = "https://github.com/yt-dlp/yt-dlp/issues"
Funding = "https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators"
[project.scripts]
yt-dlp = "yt_dlp:main"
[project.entry-points.pyinstaller40]
hook-dirs = "yt_dlp.__pyinstaller:get_hook_dirs"
[tool.hatch.build.targets.sdist]
include = [
"/yt_dlp",
"/devscripts",
"/test",
"/.gitignore", # included by default, needed for auto-excludes
"/Changelog.md",
"/LICENSE", # included as license
"/pyproject.toml", # included by default
"/README.md", # included as readme
"/setup.cfg",
"/supportedsites.md",
]
artifacts = [
"/yt_dlp/extractor/lazy_extractors.py",
"/completions",
"/AUTHORS", # included by default
"/README.txt",
"/yt-dlp.1",
]
[tool.hatch.build.targets.wheel]
packages = ["yt_dlp"]
artifacts = ["/yt_dlp/extractor/lazy_extractors.py"]
[tool.hatch.build.targets.wheel.shared-data]
"completions/bash/yt-dlp" = "share/bash-completion/completions/yt-dlp"
"completions/zsh/_yt-dlp" = "share/zsh/site-functions/_yt-dlp"
"completions/fish/yt-dlp.fish" = "share/fish/vendor_completions.d/yt-dlp.fish"
"README.txt" = "share/doc/yt_dlp/README.txt"
"yt-dlp.1" = "share/man/man1/yt-dlp.1"
[tool.hatch.version]
path = "yt_dlp/version.py"
pattern = "_pkg_version = '(?P<version>[^']+)'"
[tool.hatch.envs.default]
features = ["curl-cffi", "default"]
dependencies = ["pre-commit"]
path = ".venv"
installer = "uv"
[tool.hatch.envs.default.scripts]
setup = "pre-commit install --config .pre-commit-hatch.yaml"
yt-dlp = "python -Werror -Xdev -m yt_dlp {args}"
[tool.hatch.envs.hatch-static-analysis]
detached = true
features = ["static-analysis"]
dependencies = [] # override hatch ruff version
config-path = "pyproject.toml"
[tool.hatch.envs.hatch-static-analysis.scripts]
format-check = "autopep8 --diff {args:.}"
format-fix = "autopep8 --in-place {args:.}"
lint-check = "ruff check {args:.}"
lint-fix = "ruff check --fix {args:.}"
[tool.hatch.envs.hatch-test]
features = ["test"]
dependencies = [
"pytest-randomly~=3.15",
"pytest-rerunfailures~=14.0",
"pytest-xdist[psutil]~=3.5",
]
[tool.hatch.envs.hatch-test.scripts]
run = "python -m devscripts.run_tests {args}"
run-cov = "echo Code coverage not implemented && exit 1"
[[tool.hatch.envs.hatch-test.matrix]]
python = [
"3.8",
"3.9",
"3.10",
"3.11",
"3.12",
"pypy3.8",
"pypy3.9",
"pypy3.10",
]
[tool.ruff]
line-length = 120
[tool.ruff.lint]
ignore = [
"E402", # module-import-not-at-top-of-file
"E501", # line-too-long
"E731", # lambda-assignment
"E741", # ambiguous-variable-name
"UP036", # outdated-version-block
"B006", # mutable-argument-default
"B008", # function-call-in-default-argument
"B011", # assert-false
"B017", # assert-raises-exception
"B023", # function-uses-loop-variable (false positives)
"B028", # no-explicit-stacklevel
"B904", # raise-without-from-inside-except
"C401", # unnecessary-generator-set
"C402", # unnecessary-generator-dict
"PIE790", # unnecessary-placeholder
"SIM102", # collapsible-if
"SIM108", # if-else-block-instead-of-if-exp
"SIM112", # uncapitalized-environment-variables
"SIM113", # enumerate-for-loop
"SIM114", # if-with-same-arms
"SIM115", # open-file-with-context-handler
"SIM117", # multiple-with-statements
"SIM223", # expr-and-false
"SIM300", # yoda-conditions
"TD001", # invalid-todo-tag
"TD002", # missing-todo-author
"TD003", # missing-todo-link
"PLE0604", # invalid-all-object (false positives)
"PLW0603", # global-statement
"PLW1510", # subprocess-run-without-check
"PLW2901", # redefined-loop-name
"RUF001", # ambiguous-unicode-character-string
"RUF012", # mutable-class-default
"RUF100", # unused-noqa (flake8 has slightly different behavior)
]
select = [
"E", # pycodestyle Error
"W", # pycodestyle Warning
"F", # Pyflakes
"I", # isort
"Q", # flake8-quotes
"N803", # invalid-argument-name
"N804", # invalid-first-argument-name-for-class-method
"UP", # pyupgrade
"B", # flake8-bugbear
"A", # flake8-builtins
"COM", # flake8-commas
"C4", # flake8-comprehensions
"FA", # flake8-future-annotations
"ISC", # flake8-implicit-str-concat
"ICN003", # banned-import-from
"PIE", # flake8-pie
"T20", # flake8-print
"RSE", # flake8-raise
"RET504", # unnecessary-assign
"SIM", # flake8-simplify
"TID251", # banned-api
"TD", # flake8-todos
"PLC", # Pylint Convention
"PLE", # Pylint Error
"PLW", # Pylint Warning
"RUF", # Ruff-specific rules
]
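The ignore and select lists above pin down the rule families ruff enforces. As one small, illustrative example of a selected rule, RET504 (unnecessary-assign) flags a temporary assignment that is immediately returned; the sketch below shows the preferred form (the function name and URL are made up):

def video_id(url):
    # An unnecessary temporary assignment such as
    #     result = url.rsplit('/', 1)[-1]
    #     return result
    # would be flagged by RET504; returning the expression directly is preferred.
    return url.rsplit('/', 1)[-1]


print(video_id('https://example.com/videos/abc123'))  # hypothetical URL -> abc123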
[tool.ruff.lint.per-file-ignores]
"devscripts/lazy_load_template.py" = [
"F401", # unused-import
]
"!yt_dlp/extractor/**.py" = [
"I", # isort
"ICN003", # banned-import-from
"T20", # flake8-print
"A002", # builtin-argument-shadowing
"C408", # unnecessary-collection-call
]
"yt_dlp/jsinterp.py" = [
"UP031", # printf-string-formatting
]
[tool.ruff.lint.isort]
known-first-party = [
"bundle",
"devscripts",
"test",
]
relative-imports-order = "closest-to-furthest"
[tool.ruff.lint.flake8-quotes]
docstring-quotes = "double"
multiline-quotes = "single"
inline-quotes = "single"
avoid-escape = false
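In other words: double quotes for docstrings, single quotes for ordinary and multi-line strings, and (with avoid-escape = false) no switching of quote style just to dodge an escape. A tiny illustration:

"""Docstrings use double quotes."""

title = 'ordinary strings use single quotes'
lines = '''
multi-line, non-docstring strings use triple single quotes
'''
escaped = 'with avoid-escape = false, single quotes stay even when \' must be escaped'
print(title, lines, escaped)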
[tool.ruff.lint.pep8-naming]
classmethod-decorators = [
"yt_dlp.utils.classproperty",
]
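Registering yt_dlp.utils.classproperty here makes the pep8-naming rules treat functions decorated with it like classmethods, so their first argument is expected to be cls rather than self. A simplified, self-contained stand-in (not the real yt_dlp implementation) showing the shape being allowed:

class classproperty:
    """Minimal descriptor: a property computed on the class (illustrative only)."""

    def __init__(self, func):
        self.func = func

    def __get__(self, _obj, cls):
        return self.func(cls)


class Extractor:
    _NAME = 'demo'

    @classproperty
    def ie_key(cls):  # first argument is cls; accepted because of the decorator list above
        return cls._NAME


print(Extractor.ie_key)  # -> demo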
[tool.ruff.lint.flake8-import-conventions]
banned-from = [
"base64",
"datetime",
"functools",
"glob",
"hashlib",
"itertools",
"json",
"math",
"os",
"pathlib",
"random",
"re",
"string",
"sys",
"time",
"urllib",
"uuid",
"xml",
]
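banned-from (rule ICN003) forbids from-imports of these standard-library modules, so their members must be accessed through the module namespace. A one-line illustration:

import base64

# from base64 import b64decode   # would be flagged by ICN003 (banned-import-from)
print(base64.b64decode('eXQtZGxw'))  # -> b'yt-dlp'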
[tool.ruff.lint.flake8-tidy-imports.banned-api]
"yt_dlp.compat.compat_str".msg = "Use `str` instead."
"yt_dlp.compat.compat_b64decode".msg = "Use `base64.b64decode` instead."
"yt_dlp.compat.compat_urlparse".msg = "Use `urllib.parse` instead."
"yt_dlp.compat.compat_parse_qs".msg = "Use `urllib.parse.parse_qs` instead."
"yt_dlp.compat.compat_urllib_parse_unquote".msg = "Use `urllib.parse.unquote` instead."
"yt_dlp.compat.compat_urllib_parse_urlencode".msg = "Use `urllib.parse.urlencode` instead."
"yt_dlp.compat.compat_urllib_parse_urlparse".msg = "Use `urllib.parse.urlparse` instead."
"yt_dlp.compat.compat_shlex_quote".msg = "Use `yt_dlp.utils.shell_quote` instead."
"yt_dlp.utils.error_to_compat_str".msg = "Use `str` instead."
[tool.autopep8]
max_line_length = 120
recursive = true
exit-code = true
jobs = 0
select = [
"E101",
"E112",
"E113",
"E115",
"E116",
"E117",
"E121",
"E122",
"E123",
"E124",
"E125",
"E126",
"E127",
"E128",
"E129",
"E131",
"E201",
"E202",
"E203",
"E211",
"E221",
"E222",
"E223",
"E224",
"E225",
"E226",
"E227",
"E228",
"E231",
"E241",
"E242",
"E251",
"E252",
"E261",
"E262",
"E265",
"E266",
"E271",
"E272",
"E273",
"E274",
"E275",
"E301",
"E302",
"E303",
"E304",
"E305",
"E306",
"E502",
"E701",
"E702",
"E704",
"W391",
"W504",
]
[tool.pytest.ini_options]
addopts = "-ra -v --strict-markers"
markers = [
"download",
]
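With --strict-markers, only markers registered here may be used; the download marker lets network-heavy tests be selected or deselected with -m (e.g. pytest -m "not download"). A hedged sketch of a test using it (the test body is a placeholder, not a real yt-dlp test):

import pytest


@pytest.mark.download  # registered above, so --strict-markers accepts it
def test_fetch_video():
    # Illustrative placeholder for a test that downloads real media.
    assert True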

4
pytest.ini Normal file
View File

@@ -0,0 +1,4 @@
[pytest]
addopts = -ra -v --strict-markers
markers =
download

3
requirements.txt Normal file
View File

@@ -0,0 +1,3 @@
mutagen
pycryptodomex
websockets

setup.cfg
View File

@@ -1,39 +1,6 @@
[wheel]
universal = True
[flake8]
exclude = build,venv,.tox,.git,.pytest_cache
ignore = E402,E501,E731,E741,W503
max_line_length = 120
per_file_ignores =
devscripts/lazy_load_template.py: F401
[autoflake]
ignore-init-module-imports = true
ignore-pass-after-docstring = true
remove-all-unused-imports = true
remove-duplicate-keys = true
remove-unused-variables = true
[tox:tox]
skipsdist = true
envlist = py{38,39,310,311,312},pypy{38,39,310}
skip_missing_interpreters = true
[testenv] # tox
deps =
pytest
commands = pytest {posargs:"-m not download"}
passenv = HOME # For test_compat_expanduser
setenv =
# PYTHONWARNINGS = error # Catches PIP's warnings too
[isort]
py_version = 38
multi_line_output = VERTICAL_HANGING_INDENT
line_length = 80
reverse_relative = true
ensure_newline_before_comments = true
include_trailing_comma = true
known_first_party =
test
exclude = yt_dlp/extractor/__init__.py,devscripts/buildserver.py,devscripts/lazy_load_template.py,devscripts/make_issue_template.py,setup.py,build,.git,venv,devscripts/create-github-release.py,devscripts/release.sh,devscripts/show-downloads-statistics.py
ignore = E402,E501,E731,E741,W503

142
setup.py Normal file
View File

@@ -0,0 +1,142 @@
#!/usr/bin/env python3
# coding: utf-8
import os.path
import warnings
import sys
try:
from setuptools import setup, Command, find_packages
setuptools_available = True
except ImportError:
from distutils.core import setup, Command
setuptools_available = False
from distutils.spawn import spawn
# Get the version from yt_dlp/version.py without importing the package
exec(compile(open('yt_dlp/version.py').read(), 'yt_dlp/version.py', 'exec'))
DESCRIPTION = 'A youtube-dl fork with additional features and patches'
LONG_DESCRIPTION = '\n\n'.join((
'Official repository: <https://github.com/yt-dlp/yt-dlp>',
'**PS**: Some links in this document will not work since this is a copy of the README.md from Github',
open('README.md', 'r', encoding='utf-8').read()))
REQUIREMENTS = ['mutagen', 'pycryptodomex', 'websockets']
if sys.argv[1:2] == ['py2exe']:
import py2exe
warnings.warn(
'py2exe builds do not support pycryptodomex and need VC++14 to run. '
'The recommended way is to use "pyinst.py" to build using pyinstaller')
params = {
'console': [{
'script': './yt_dlp/__main__.py',
'dest_base': 'yt-dlp',
'version': __version__,
'description': DESCRIPTION,
'comments': LONG_DESCRIPTION.split('\n')[0],
'product_name': 'yt-dlp',
'product_version': __version__,
}],
'options': {
'py2exe': {
'bundle_files': 0,
'compressed': 1,
'optimize': 2,
'dist_dir': './dist',
'excludes': ['Crypto', 'Cryptodome'], # py2exe cannot import Crypto
'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
}
},
'zipfile': None
}
else:
files_spec = [
('share/bash-completion/completions', ['completions/bash/yt-dlp']),
('share/zsh/site-functions', ['completions/zsh/_yt-dlp']),
('share/fish/vendor_completions.d', ['completions/fish/yt-dlp.fish']),
('share/doc/yt_dlp', ['README.txt']),
('share/man/man1', ['yt-dlp.1'])
]
root = os.path.dirname(os.path.abspath(__file__))
data_files = []
for dirname, files in files_spec:
resfiles = []
for fn in files:
if not os.path.exists(fn):
warnings.warn('Skipping file %s since it is not present. Try running `make pypi-files` first' % fn)
else:
resfiles.append(fn)
data_files.append((dirname, resfiles))
params = {
'data_files': data_files,
}
if setuptools_available:
params['entry_points'] = {'console_scripts': ['yt-dlp = yt_dlp:main']}
else:
params['scripts'] = ['yt-dlp']
class build_lazy_extractors(Command):
description = 'Build the extractor lazy loading module'
user_options = []
def initialize_options(self):
pass
def finalize_options(self):
pass
def run(self):
spawn([sys.executable, 'devscripts/make_lazy_extractors.py', 'yt_dlp/extractor/lazy_extractors.py'],
dry_run=self.dry_run)
if setuptools_available:
packages = find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins'))
else:
packages = ['yt_dlp', 'yt_dlp.downloader', 'yt_dlp.extractor', 'yt_dlp.postprocessor']
setup(
name='yt-dlp',
version=__version__,
maintainer='pukkandan',
maintainer_email='pukkandan.ytdlp@gmail.com',
description=DESCRIPTION,
long_description=LONG_DESCRIPTION,
long_description_content_type='text/markdown',
url='https://github.com/yt-dlp/yt-dlp',
packages=packages,
install_requires=REQUIREMENTS,
project_urls={
'Documentation': 'https://yt-dlp.readthedocs.io',
'Source': 'https://github.com/yt-dlp/yt-dlp',
'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
},
classifiers=[
'Topic :: Multimedia :: Video',
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Programming Language :: Python',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: Implementation',
'Programming Language :: Python :: Implementation :: CPython',
'Programming Language :: Python :: Implementation :: PyPy',
'License :: Public Domain',
'Operating System :: OS Independent',
],
python_requires='>=3.6',
cmdclass={'build_lazy_extractors': build_lazy_extractors},
**params
)

Some files were not shown because too many files have changed in this diff.