Compare commits

...

162 commits
v3.7.1 ... main

Author SHA1 Message Date
anthony sottile
8416413a0e
Merge pull request #3599 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-12-22 16:55:46 -05:00
pre-commit-ci[bot]
37a879e65e
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/setup-cfg-fmt: v3.1.0 → v3.2.0](https://github.com/asottile/setup-cfg-fmt/compare/v3.1.0...v3.2.0)
2025-12-22 20:26:26 +00:00
Anthony Sottile
8a0630ca1a v4.5.1
2025-12-16 16:13:56 -05:00
anthony sottile
fcbc745744
Merge pull request #3597 from pre-commit/empty-setup-py
fix python local template when artifact dirs are present
2025-12-16 14:56:40 -06:00
Anthony Sottile
51592eecec fix python local template when artifact dirs are present 2025-12-16 15:45:01 -05:00
anthony sottile
67e8faf80b
Merge pull request #3596 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-12-15 16:04:01 -06:00
pre-commit-ci[bot]
c251e6b6d0
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.19.0 → v1.19.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.19.0...v1.19.1)
2025-12-15 20:48:45 +00:00
anthony sottile
98ccafa3ce
Merge pull request #3593 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-12-01 16:13:49 -05:00
pre-commit-ci[bot]
48953556d0
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.18.2 → v1.19.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.18.2...v1.19.0)
2025-12-01 21:05:15 +00:00
anthony sottile
2cedd58e69
Merge pull request #3588 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-11-25 10:52:12 -05:00
pre-commit-ci[bot]
465192d7de
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.21.1 → v3.21.2](https://github.com/asottile/pyupgrade/compare/v3.21.1...v3.21.2)
2025-11-24 20:53:38 +00:00
anthony sottile
fd42f96874
Merge pull request #3586 from pre-commit/zipapp-sha256-file-not-needed
remove sha256 file from zipapp script
2025-11-22 16:15:39 -05:00
anthony sottile
8ea2b790d8 remove sha256 file from zipapp script
github displays the checksum for us now!
2025-11-22 16:06:27 -05:00
anthony sottile
1af6c8fa95 v4.5.0 2025-11-22 16:02:16 -05:00
anthony sottile
3358a3b540
Merge pull request #3585 from pre-commit/hazmat
add pre-commit hazmat
2025-11-22 14:03:09 -05:00
anthony sottile
bdf68790b7 add pre-commit hazmat 2025-11-22 13:53:53 -05:00
anthony sottile
e436690f14
Merge pull request #3584 from pre-commit/exitstack
use ExitStack instead of start + stop
2025-11-21 15:19:53 -05:00
anthony sottile
8d34f95308 use ExitStack instead of start + stop 2025-11-21 15:09:41 -05:00
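For readers unfamiliar with the pattern named in that commit, here is a generic sketch of replacing paired `start()`/`stop()` calls with a `contextlib.ExitStack`; this is standard-library behaviour only, not pre-commit's code, and the `Thing` class is purely illustrative.

```python
# Generic illustration of the ExitStack pattern; not pre-commit code.
import contextlib


class Thing:
    def start(self) -> None:
        print('start')

    def stop(self) -> None:
        print('stop')


with contextlib.ExitStack() as stack:
    thing = Thing()
    thing.start()
    stack.callback(thing.stop)  # stop() runs automatically when the stack unwinds
    print('do work')
```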
anthony sottile
9c7ea88ab9
Merge pull request #3583 from pre-commit/forward-compat-map-manifest
add forward-compat error message
2025-11-19 15:10:28 -05:00
Anthony Sottile
844dacc168 add forward-compat error message 2025-11-19 14:57:01 -05:00
anthony sottile
6a1d543e52
Merge pull request #3582 from pre-commit/move-gc-back
move logic for gc back to commands.gc
2025-11-19 14:44:46 -05:00
Anthony Sottile
66278a9a0b move logic for gc back to commands.gc 2025-11-19 14:32:09 -05:00
anthony sottile
1b32c50bc7
Merge pull request #3579 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-11-10 16:53:56 -05:00
pre-commit-ci[bot]
063229aee7
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.21.0 → v3.21.1](https://github.com/asottile/pyupgrade/compare/v3.21.0...v3.21.1)
2025-11-10 20:59:54 +00:00
anthony sottile
49e28eea48
Merge pull request #3578 from pre-commit/store-gc-refactor
refactor gc into store
2025-11-09 17:16:27 -05:00
Anthony Sottile
d5c273a2ba refactor gc into store
this will make refactoring this easier later and limits the api surface of Store
2025-11-09 17:03:43 -05:00
anthony sottile
17cf886473 v4.4.0
2025-11-08 16:11:43 -05:00
anthony sottile
cb63a5cb9a
Merge pull request #3535 from br-rhrbacek/fix-cgroups
Fix docker-in-docker detection for cgroups v2
2025-11-08 15:45:53 -05:00
Radek Hrbacek
f80801d75a Fix docker-in-docker detection for cgroups v2 2025-11-08 15:37:32 -05:00
anthony sottile
9143fc3545
Merge pull request #3577 from pre-commit/language-unsupported
rename system and script languages to unsupported / unsupported_script
2025-11-08 15:21:50 -05:00
anthony sottile
725acc969a rename system and script languages to unsupported / unsupported_script 2025-11-08 15:09:16 -05:00
anthony sottile
3815e2e6d8
Merge pull request #3576 from pre-commit/fix-stages-config-error
fix missing context in error for stages
2025-11-08 14:44:32 -05:00
anthony sottile
aa2961c122 fix missing context in error for stages 2025-11-08 14:31:15 -05:00
anthony sottile
46297f7cd6
Merge pull request #3575 from pre-commit/rm-python3-hooks-repo
rm python3_hooks_repo
2025-11-08 13:45:42 -05:00
anthony sottile
95eec75004 rm python3_hooks_repo 2025-11-08 13:34:44 -05:00
anthony sottile
5e4b3546f3
Merge pull request #3574 from pre-commit/rm-hook-with-spaces-test
remove redundant system spaces test
2025-11-08 13:24:42 -05:00
anthony sottile
8bbfcf1f82 remove redundant system spaces test
`test_args_with_spaces_and_quotes` also covers this behaviour
2025-11-08 13:16:38 -05:00
anthony sottile
65175f3cf3
Merge pull request #3566 from pre-commit/upgrade-rbenv
upgrade rbenv / ruby-build
2025-10-24 12:34:53 -07:00
anthony sottile
fc33a62f3c upgrade rbenv / ruby-build 2025-10-24 15:18:07 -04:00
anthony sottile
2db924eb98
Merge pull request #3561 from pre-commit/warn-to-warning
fix deprecated call
2025-10-16 10:34:47 -04:00
Anthony Sottile
ddfcf4034b fix deprecated call 2025-10-16 10:23:30 -04:00
anthony sottile
1b424ccfa2
Merge pull request #3558 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-10-13 16:54:01 -04:00
pre-commit-ci[bot]
221637b0cb
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/setup-cfg-fmt: v2.8.0 → v3.1.0](https://github.com/asottile/setup-cfg-fmt/compare/v2.8.0...v3.1.0)
- [github.com/asottile/reorder-python-imports: v3.15.0 → v3.16.0](https://github.com/asottile/reorder-python-imports/compare/v3.15.0...v3.16.0)
- [github.com/asottile/add-trailing-comma: v3.2.0 → v4.0.0](https://github.com/asottile/add-trailing-comma/compare/v3.2.0...v4.0.0)
- [github.com/asottile/pyupgrade: v3.20.0 → v3.21.0](https://github.com/asottile/pyupgrade/compare/v3.20.0...v3.21.0)
2025-10-13 20:38:45 +00:00
anthony sottile
7ad23528d0
Merge pull request #3554 from pre-commit/all-repos_autofix_all-repos-manual
py310+
2025-10-10 12:08:36 -04:00
anthony sottile
f415f6c4d7 py310+
Committed via https://github.com/asottile/all-repos
2025-10-09 17:44:05 -04:00
Anthony Sottile
99fa9ba5ef
Merge pull request #3544 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-09-23 09:36:35 -04:00
pre-commit-ci[bot]
ad0d4cd427
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.17.1 → v1.18.2](https://github.com/pre-commit/mirrors-mypy/compare/v1.17.1...v1.18.2)
2025-09-22 20:44:04 +00:00
Anthony Sottile
924680e974
Merge pull request #3537 from pre-commit/security-options-null
handle `SecurityOptions: null` in docker response
2025-09-06 14:47:53 -04:00
anthony sottile
2930ea0fcd handle SecurityOptions: null in docker response 2025-09-06 14:40:20 -04:00
Anthony Sottile
b96127c485
Merge pull request #3536 from pre-commit/store-true-default
store_true does not need default=...
2025-09-06 14:28:02 -04:00
Anthony Sottile
954cc3b3b3
Merge pull request #3528 from JulianMaurin/feat/fail-fast
Add fail-fast argument for run command
2025-09-06 14:22:57 -04:00
anthony sottile
e671830402 store_true does not need default=... 2025-09-06 14:20:01 -04:00
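A quick illustration of the claim in that commit, using plain `argparse` (this snippet is not from the pre-commit codebase): `action='store_true'` already defaults the option to `False`, so an explicit `default=False` is redundant.

```python
# Standard-library behaviour only; not pre-commit code.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--fail-fast', action='store_true')  # no default= needed

print(parser.parse_args([]).fail_fast)               # False
print(parser.parse_args(['--fail-fast']).fail_fast)  # True
```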
JulianMaurin
c78f248c60 Add fail-fast argument for run command 2025-09-06 14:14:23 -04:00
Anthony Sottile
e70b313c80
Merge pull request #3510 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-08-13 10:21:38 -04:00
pre-commit-ci[bot]
87a681f866
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/pre-commit-hooks: v5.0.0 → v6.0.0](https://github.com/pre-commit/pre-commit-hooks/compare/v5.0.0...v6.0.0)
2025-08-11 20:46:13 +00:00
anthony sottile
b74a22d96c v4.3.0
2025-08-09 14:54:49 -04:00
Anthony Sottile
cc899de192
Merge pull request #3507 from bc-lee/dart-fix
Make Dart pre-commit hook compatible with the latest Dart SDKs
2025-08-09 14:03:07 -04:00
Byoungchan Lee
2a0bcea757 Downgrade Dart SDK version installed in the CI 2025-08-08 17:40:30 +09:00
Byoungchan Lee
f1cc7a445f Make Dart pre-commit hook compatible with the latest Dart SDKs
Dart introduced sound null safety in version 2.12.0, and as of Dart 3,
null safety is mandatory. Older Dart SDKs allowed both pre-null safety
and null-safe packages, but modern Dart SDKs, where most source code is
null-safe, now reject pre-null safety packages.

The current `pubspec.yaml` template with `sdk: '>=2.10.0'` is
incompatible with recent Dart SDKs, leading to the following error:

An unexpected error has occurred: CalledProcessError: command: ('/path/to/dart-sdk/bin/dart', 'pub', 'get')
return code: 65
stdout:
    Resolving dependencies...
stderr:
    The lower bound of "sdk: '>=2.10.0'" must be 2.12.0'
    or higher to enable null safety.

    The current Dart SDK (3.8.3) only supports null safety.

    For details, see https://dart.dev/null-safety

To ensure compatibility with the modern Dart ecosystem, this commit
updates the minimum Dart SDK version to 2.12.0 or higher,
which implicitly supports null safety.
Additionally, `testing/get-dart.sh` has been updated to verify that
the pre-commit hook functions correctly with the latest Dart versions.
2025-08-08 17:14:59 +09:00
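As a rough sketch of the constraint change described above (not pre-commit's actual template generation; the helper function, package name, and file layout here are hypothetical), the generated `pubspec.yaml` just needs its SDK lower bound raised to 2.12.0 so that null-safety-only Dart SDKs accept it.

```python
# Hypothetical helper for illustration only: writes a minimal pubspec.yaml
# whose SDK lower bound satisfies modern (null-safety-only) Dart SDKs.
import os


def write_pubspec(dest_dir: str, package_name: str = 'pre_commit_env') -> str:
    path = os.path.join(dest_dir, 'pubspec.yaml')
    with open(path, 'w') as f:
        f.write(
            f'name: {package_name}\n'
            'environment:\n'
            "  sdk: '>=2.12.0'\n",
        )
    return path
```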
Anthony Sottile
72a3b71f0e
Merge pull request #3504 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-08-04 18:09:19 -04:00
pre-commit-ci[bot]
c8925a457a
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.17.0 → v1.17.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.17.0...v1.17.1)
2025-08-04 20:31:31 +00:00
Anthony Sottile
a5fe6c500c
Merge pull request #3496 from ericphanson/eph/jl-startup
Julia language: skip startup.jl file
2025-08-02 14:43:29 -04:00
Eric Hanson
6f1f433a9c Julia language: skip startup.jl file 2025-08-02 14:35:27 -04:00
Anthony Sottile
c6817210b1
Merge pull request #3499 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-07-24 11:21:39 +02:00
pre-commit-ci[bot]
4fd4537bc6
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.16.1 → v1.17.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.16.1...v1.17.0)
2025-07-21 20:02:20 +00:00
Anthony Sottile
a1d7bed86f
Merge pull request #3485 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-06-24 15:37:11 -04:00
pre-commit-ci[bot]
d1d5b3d564
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/PyCQA/flake8: 7.2.0 → 7.3.0](https://github.com/PyCQA/flake8/compare/7.2.0...7.3.0)
- [github.com/pre-commit/mirrors-mypy: v1.16.0 → v1.16.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.16.0...v1.16.1)
2025-06-23 19:55:22 +00:00
Anthony Sottile
9c228a0bd8
Merge pull request #3477 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-06-02 17:29:56 -07:00
pre-commit-ci[bot]
d4f0c6e8a7
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.15.0 → v1.16.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.15.0...v1.16.0)
2025-06-02 19:57:10 +00:00
Anthony Sottile
5f0c773e74
Merge pull request #3470 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-05-27 21:36:14 -07:00
pre-commit-ci[bot]
43b426a501
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/reorder-python-imports: v3.14.0 → v3.15.0](https://github.com/asottile/reorder-python-imports/compare/v3.14.0...v3.15.0)
- [github.com/asottile/add-trailing-comma: v3.1.0 → v3.2.0](https://github.com/asottile/add-trailing-comma/compare/v3.1.0...v3.2.0)
- [github.com/asottile/pyupgrade: v3.19.1 → v3.20.0](https://github.com/asottile/pyupgrade/compare/v3.19.1...v3.20.0)
2025-05-26 19:45:48 +00:00
Anthony Sottile
8a4af027a1
Merge pull request #3446 from matthewhughes934/fix-docker-rootless-permission-mounts
Fix permission errors for mounts in rootless docker
2025-05-23 17:11:14 -04:00
Matthew Hughes
466f6c4a39 Fix permission errors for mounts in rootless docker
By running containers in a rootless docker context as root. This is
because user and group IDs are remapped in the user namespaces used by
rootless docker, and it's unlikely that the current user ID will map to
the same ID under this remap (see docs[1] for some more details).
Specifically, it means ownership of mounted volumes will not be for the
current user and trying to write can result in permission errors.

This change borrows heavily from an existing PR[2].

I don't think the output format of `docker system info` is
documented/guaranteed anywhere, but it should correspond to the
format of a `/info` API request to Docker[3].

The added test _hopes_ to avoid regressions in this behaviour, but since
tests aren't run in a rootless docker context on the PR checks (and I
couldn't find an easy way to make it the case) there's still a risk of
regressions sneaking in.

Link: https://docs.docker.com/engine/security/rootless/ [1]
Link: https://github.com/pre-commit/pre-commit/pull/1484/ [2]
Link: https://docs.docker.com/reference/api/engine/version/v1.48/#tag/System/operation/SystemAuth [3]
Co-authored-by: Kurt von Laven <Kurt-von-Laven@users.noreply.github.com>
Co-authored-by: Fabrice Flore-Thébault <ffloreth@redhat.com>
2025-05-23 17:01:10 -04:00
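A minimal sketch of that detection idea, assuming only that `docker system info --format '{{json .}}'` reports `name=rootless` among `SecurityOptions` when the daemon is rootless; the function name is illustrative and this is not pre-commit's actual code. It also tolerates `SecurityOptions: null`, the case fixed later in #3537.

```python
# Illustrative sketch, not pre-commit's implementation.
import json
import subprocess


def is_rootless_docker() -> bool:
    out = subprocess.check_output(
        ('docker', 'system', 'info', '--format', '{{json .}}'),
    )
    info = json.loads(out)
    # SecurityOptions may be null, hence the `or ()` fallback.
    options = info.get('SecurityOptions') or ()
    return any('name=rootless' in opt for opt in options)
```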
Anthony Sottile
d2b61d0ef2
Merge pull request #3439 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-03-31 15:55:15 -04:00
pre-commit-ci[bot]
43592c2a29 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2025-03-31 19:44:12 +00:00
pre-commit-ci[bot]
6d47b8d52b
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/setup-cfg-fmt: v2.7.0 → v2.8.0](https://github.com/asottile/setup-cfg-fmt/compare/v2.7.0...v2.8.0)
- [github.com/PyCQA/flake8: 7.1.2 → 7.2.0](https://github.com/PyCQA/flake8/compare/7.1.2...7.2.0)
2025-03-31 19:43:51 +00:00
Anthony Sottile
aa48766b88 v4.2.0
2025-03-18 17:34:49 -04:00
Anthony Sottile
bf6f11dc6c
Merge pull request #3430 from pre-commit/preferential-sys-impl
adjust python default_language_version to prefer versioned exe
2025-03-18 17:26:41 -04:00
Anthony Sottile
3e8d0f5e1c adjust python default_language_version to prefer versioned exe 2025-03-18 14:55:24 -04:00
Anthony Sottile
ff7256cedf
Merge pull request #3425 from tusharsadhwani/ambiguous-ref
fix: crash on ambiguous ref 'HEAD'
2025-03-15 15:29:29 -04:00
Tushar Sadhwani
b7eb412c79 fix: crash on ambiguous ref 'HEAD' 2025-03-15 15:23:15 -04:00
Anthony Sottile
7b88c63ae6
Merge pull request #3404 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-02-17 16:20:33 -05:00
pre-commit-ci[bot]
94b97e28f7
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/PyCQA/flake8: 7.1.1 → 7.1.2](https://github.com/PyCQA/flake8/compare/7.1.1...7.1.2)
2025-02-17 21:07:26 +00:00
Anthony Sottile
2f93b80484
Merge pull request #3401 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-02-12 09:34:15 -05:00
pre-commit-ci[bot]
4f90a1e88a
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.14.1 → v1.15.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.14.1...v1.15.0)
2025-02-10 22:44:01 +00:00
Anthony Sottile
aba1ce04e7
Merge pull request #3396 from pre-commit/all-repos_autofix_all-repos-sed
upgrade asottile/workflows
2025-01-30 15:05:49 -05:00
Anthony Sottile
e2210c97e2 upgrade asottile/workflows
Committed via https://github.com/asottile/all-repos
2025-01-30 14:58:50 -05:00
Anthony Sottile
804c853d8f
Merge pull request #3390 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-01-20 20:43:00 -05:00
pre-commit-ci[bot]
edd0002e43
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/hhatto/autopep8: v2.3.1 → v2.3.2](https://github.com/hhatto/autopep8/compare/v2.3.1...v2.3.2)
2025-01-20 22:30:07 +00:00
Anthony Sottile
b152e922ef v4.1.0 2025-01-20 13:35:33 -05:00
Anthony Sottile
c3125a4d36
Merge pull request #3389 from lorenzwalthert/dev-always-unset-renv
Fix language:r hook installation when initiated in RStudio
2025-01-20 13:18:20 -05:00
Lorenz Walthert
c2c061cf63 fix: ensure env patch is applied for vanilla emulation
otherwise, installing the hooks when the RENV_USER env variable is set (e.g. in RStudio with an renv project) will result in executing the installation script in the wrong renv
2025-01-20 13:13:36 -05:00
Anthony Sottile
cd429db5e2
Merge pull request #3382 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2025-01-07 20:21:38 -05:00
pre-commit-ci[bot]
9b9f8e254d
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.14.0 → v1.14.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.14.0...v1.14.1)
2025-01-06 23:30:19 +00:00
Anthony Sottile
86300a4a7e
Merge pull request #3376 from pre-commit/r-gone
install r on ubuntu runners
2024-12-28 16:16:53 -05:00
Anthony Sottile
77edad8455 install r on ubuntu runners
this was removed in ubuntu-24.04 github actions runner
2024-12-28 16:06:00 -05:00
Anthony Sottile
18b393905e
Merge pull request #3375 from pre-commit/dotnet-tests-ubuntu-latest
update .net tests to use .net 8
2024-12-28 16:03:04 -05:00
Anthony Sottile
31cb945ffb
Merge pull request #3374 from pre-commit/docker-image-tests-ubuntu-22-not-present
fix docker_image test when ubuntu:22.04 image is not pre-pulled
2024-12-28 15:52:16 -05:00
Anthony Sottile
28c3d81bd2 update .net tests to use .net 8
.net 6 and 7 were removed from github actions runners
2024-12-28 15:50:58 -05:00
Anthony Sottile
aa85be9340 fix docker_image test when ubuntu:22.04 image is not pre-pulled 2024-12-28 15:45:05 -05:00
Anthony Sottile
1027596280
Merge pull request #3373 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-12-23 21:10:03 -05:00
pre-commit-ci[bot]
db85eeed2d
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.19.0 → v3.19.1](https://github.com/asottile/pyupgrade/compare/v3.19.0...v3.19.1)
- [github.com/pre-commit/mirrors-mypy: v1.13.0 → v1.14.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.13.0...v1.14.0)
2024-12-23 22:45:24 +00:00
Anthony Sottile
cb14bc2d9c
Merge pull request #3304 from AleksaC/go-toolchain
disable automatic toolchain switching for golang hooks
2024-11-25 18:54:37 -05:00
AleksaC
109628c505 disable automatic toolchain switching for golang hooks 2024-11-25 18:47:18 -05:00
Anthony Sottile
74233a125a
Merge pull request #3348 from fredrikekre/fe/julia
Add support for julia hooks
2024-11-25 18:38:49 -05:00
Fredrik Ekre
85783bdc0b Add support for julia hooks
This patch adds 2nd class support for hooks using julia as the language.
pre-commit will install any dependencies defined in the hooks repo
`Project.toml` file, with support for `additional_dependencies` as well.
Julia doesn't (yet) have a way to install binaries/scripts so for julia
hooks the `entry` value is a (relative) path to a julia script within
the hooks repository. When executing a julia hook the (globally
installed) julia interpreter is prepended to the entry.

Example `.pre-commit-hooks.yaml`:

```yaml
- id: foo
  name: ...
  language: julia
  entry: bin/foo.jl --arg1
```

Example hooks repo: https://github.com/fredrikekre/runic-pre-commit/tree/fe/julia
Accompanying pre-commit.com PR: https://github.com/pre-commit/pre-commit.com/pull/998

Fixes #2689.
2024-11-25 18:31:25 -05:00
Anthony Sottile
9da45a686a
Merge pull request #3345 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-10-28 21:23:33 -04:00
pre-commit-ci[bot]
708ca3b581
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.18.0 → v3.19.0](https://github.com/asottile/pyupgrade/compare/v3.18.0...v3.19.0)
- [github.com/pre-commit/mirrors-mypy: v1.12.1 → v1.13.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.12.1...v1.13.0)
2024-10-28 22:56:52 +00:00
Anthony Sottile
611195a088
Merge pull request #3333 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-10-21 20:42:52 -04:00
Anthony Sottile
0de4c8028a remove unused type ignore 2024-10-21 20:35:56 -04:00
pre-commit-ci[bot]
46de4da34e
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/setup-cfg-fmt: v2.5.0 → v2.7.0](https://github.com/asottile/setup-cfg-fmt/compare/v2.5.0...v2.7.0)
- [github.com/asottile/reorder-python-imports: v3.13.0 → v3.14.0](https://github.com/asottile/reorder-python-imports/compare/v3.13.0...v3.14.0)
- [github.com/asottile/pyupgrade: v3.17.0 → v3.18.0](https://github.com/asottile/pyupgrade/compare/v3.17.0...v3.18.0)
- [github.com/pre-commit/mirrors-mypy: v1.11.2 → v1.12.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.11.2...v1.12.1)
2024-10-21 22:30:38 +00:00
Anthony Sottile
cc4a522415 v4.0.1 2024-10-08 12:08:49 -04:00
Anthony Sottile
772d7d45d3
Merge pull request #3324 from pre-commit/migrate-config-purelib
fix migrate-config for purelib yaml
2024-10-08 12:01:05 -04:00
Anthony Sottile
222c62bc5d fix migrate-config for purelib yaml 2024-10-08 11:46:48 -04:00
Anthony Sottile
3d5548b487
Merge pull request #3323 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-10-08 06:48:13 -04:00
pre-commit-ci[bot]
4235a877f3
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/pre-commit-hooks: v4.6.0 → v5.0.0](https://github.com/pre-commit/pre-commit-hooks/compare/v4.6.0...v5.0.0)
2024-10-08 00:02:26 +00:00
Anthony Sottile
dbccd57db0 v4.0.0 2024-10-05 14:58:22 -04:00
Anthony Sottile
d07e52901c
Merge pull request #3320 from pre-commit/remove-python-venv
remove deprecated python_venv alias
2024-10-05 13:58:57 -04:00
Anthony Sottile
801b956304 remove deprecated python_venv alias 2024-10-05 13:30:25 -04:00
Anthony Sottile
a2f7b80e89
Merge pull request #3315 from pre-commit/warn-deprecated-stage-names-on-init
add warning for deprecated stages for remote repos on init
2024-09-30 20:48:47 -04:00
Anthony Sottile
d31722386e add warning for deprecated stages for remote repos on init 2024-09-30 20:41:50 -04:00
Anthony Sottile
7555e11098
Merge pull request #3314 from pre-commit/remove-log-info-mock
replace log_info_mock with pytest's caplog
2024-09-30 20:07:00 -04:00
Anthony Sottile
05e365fe08
Merge pull request #3313 from pre-commit/default-stages-warning
add warning for deprecated stages values in `default_stages`
2024-09-30 20:02:15 -04:00
Anthony Sottile
1d2f1c0cce replace log_info_mock with pytest's caplog 2024-09-30 19:58:16 -04:00
Anthony Sottile
33e020f315 add warning for deprecated stages values in default_stages 2024-09-30 19:22:14 -04:00
Anthony Sottile
e7cfc0d2cb
Merge pull request #3312 from pre-commit/warning-for-old-stage-names
add warning for deprecated stages names
2024-09-30 18:48:39 -04:00
Anthony Sottile
7441a62eb1 add warning for deprecated stages names 2024-09-30 18:41:13 -04:00
Anthony Sottile
eec11bd124
Merge pull request #3311 from pre-commit/sensible-regex-for-meta
also apply sensible regex warning for `repo: meta`
2024-09-30 18:16:05 -04:00
Anthony Sottile
fa08d1d637 also apply sensible regex warning for repo: meta 2024-09-30 18:09:04 -04:00
Anthony Sottile
6c068a78d6
Merge pull request #3199 from ThisGuyCodes/thisguycodes/upgrade-ruby-build
Upgrade to ruby-build v20240501
2024-09-28 13:16:09 -04:00
Anthony Sottile
c9454e2ec3 regenerate ruby-build archive 2024-09-28 13:07:55 -04:00
Anthony Sottile
e687548842 regenerate archives with python3.12 2024-09-28 13:07:38 -04:00
Travis Johnson
a4e4cef335 Upgrade to ruby-build v20240917 2024-09-28 13:07:38 -04:00
Anthony Sottile
de8590064e
Merge pull request #3302 from pre-commit/migrate-config-stages
migrate-config rewrites deprecated stages
2024-09-16 20:45:21 -04:00
Anthony Sottile
5679399d90 migrate-config rewrites deprecated stages 2024-09-16 20:36:33 -04:00
Anthony Sottile
a7b671a758
Merge pull request #3301 from pre-commit/yaml-rewrite
change migrate-config to use yaml parse tree instead
2024-09-16 20:24:35 -04:00
Anthony Sottile
364e6d77f0 change migrate-config to use yaml parse tree instead 2024-09-16 20:16:16 -04:00
Anthony Sottile
504149d2ca
Merge pull request #3286 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-08-26 19:03:22 -04:00
pre-commit-ci[bot]
c2c68d985c
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.11.1 → v1.11.2](https://github.com/pre-commit/mirrors-mypy/compare/v1.11.1...v1.11.2)
2024-08-26 22:18:35 +00:00
Anthony Sottile
0f8f383d53
Merge pull request #3275 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-08-05 18:46:27 -04:00
pre-commit-ci[bot]
d5c21926ab
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/PyCQA/flake8: 7.1.0 → 7.1.1](https://github.com/PyCQA/flake8/compare/7.1.0...7.1.1)
- [github.com/pre-commit/mirrors-mypy: v1.11.0 → v1.11.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.11.0...v1.11.1)
2024-08-05 22:39:33 +00:00
Anthony Sottile
8a3ee454a2
Merge pull request #3270 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-07-29 21:36:13 -04:00
pre-commit-ci[bot]
917e2102be [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-07-29 21:59:19 +00:00
pre-commit-ci[bot]
9d4ab670d1
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.16.0 → v3.17.0](https://github.com/asottile/pyupgrade/compare/v3.16.0...v3.17.0)
2024-07-29 21:59:01 +00:00
Anthony Sottile
d46423ffe1 v3.8.0 2024-07-28 15:58:29 -04:00
Anthony Sottile
8133abd730
Merge pull request #3265 from lorenzwalthert/issue-3206
Support health check for `language: r`
2024-07-28 15:54:27 -04:00
Lorenz Walthert
da0c1d0cfa implement health check for language:r 2024-07-28 15:44:07 -04:00
Anthony Sottile
f641f6a157
Merge pull request #3264 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-07-28 15:07:33 -04:00
Anthony Sottile
a68a19d217 fixes for mypy 1.11 2024-07-28 14:57:13 -04:00
pre-commit-ci[bot]
88317ddb34
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.10.1 → v1.11.0](https://github.com/pre-commit/mirrors-mypy/compare/v1.10.1...v1.11.0)
2024-07-22 22:04:19 +00:00
Anthony Sottile
faa6f8c70c
Merge pull request #3244 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-07-05 18:34:07 -04:00
pre-commit-ci[bot]
f632459bc6
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/pre-commit/mirrors-mypy: v1.10.0 → v1.10.1](https://github.com/pre-commit/mirrors-mypy/compare/v1.10.0...v1.10.1)
2024-07-01 23:34:14 +00:00
Anthony Sottile
0252908c27
Merge pull request #3240 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-06-25 14:16:25 -04:00
pre-commit-ci[bot]
69b5dce12a
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/hhatto/autopep8: v2.3.0 → v2.3.1](https://github.com/hhatto/autopep8/compare/v2.3.0...v2.3.1)
2024-06-24 21:49:02 +00:00
Anthony Sottile
d56502acab
Merge pull request #3237 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-06-18 20:14:03 -04:00
pre-commit-ci[bot]
49a9664cd0
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/hhatto/autopep8: v2.2.0 → v2.3.0](https://github.com/hhatto/autopep8/compare/v2.2.0...v2.3.0)
- [github.com/PyCQA/flake8: 7.0.0 → 7.1.0](https://github.com/PyCQA/flake8/compare/7.0.0...7.1.0)
2024-06-17 21:57:20 +00:00
Anthony Sottile
60db5d78d1
Merge pull request #3227 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-06-13 10:28:30 -04:00
pre-commit-ci[bot]
9dd247898c
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/pyupgrade: v3.15.2 → v3.16.0](https://github.com/asottile/pyupgrade/compare/v3.15.2...v3.16.0)
2024-06-10 21:56:51 +00:00
Anthony Sottile
15d9f7f61e
Merge pull request #3217 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-06-03 21:46:10 -04:00
pre-commit-ci[bot]
1f128556e4
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/asottile/reorder-python-imports: v3.12.0 → v3.13.0](https://github.com/asottile/reorder-python-imports/compare/v3.12.0...v3.13.0)
- [github.com/hhatto/autopep8: v2.1.1 → v2.2.0](https://github.com/hhatto/autopep8/compare/v2.1.1...v2.2.0)
2024-06-03 21:47:18 +00:00
Anthony Sottile
dd144c95f6
Merge pull request #3207 from pre-commit/pre-commit-ci-update-config
[pre-commit.ci] pre-commit autoupdate
2024-05-27 17:46:19 -04:00
pre-commit-ci[bot]
5526bb2137
[pre-commit.ci] pre-commit autoupdate
updates:
- [github.com/hhatto/autopep8: v2.1.0 → v2.1.1](https://github.com/hhatto/autopep8/compare/v2.1.0...v2.1.1)
2024-05-27 21:34:15 +00:00
81 changed files with 2112 additions and 417 deletions


@@ -21,7 +21,7 @@ jobs:
         fetch-depth: 0
     - uses: actions/setup-python@v4
       with:
-        python-version: 3.9
+        python-version: '3.10'
     - name: install deps
      run: python -mpip install -e . -r requirements-dev.txt
     - name: vars
@@ -36,10 +36,10 @@ jobs:
       matrix:
         include: ${{ fromJSON(needs.vars.outputs.languages) }}
     steps:
-    - uses: asottile/workflows/.github/actions/fast-checkout@v1.4.0
+    - uses: asottile/workflows/.github/actions/fast-checkout@v1.8.1
     - uses: actions/setup-python@v4
       with:
-        python-version: 3.9
+        python-version: '3.10'
     - run: echo "$CONDA\Scripts" >> "$GITHUB_PATH"
       shell: bash
@@ -65,6 +65,8 @@ jobs:
       if: matrix.os == 'windows-latest' && matrix.language == 'perl'
     - uses: haskell/actions/setup@v2
       if: matrix.language == 'haskell'
+    - uses: r-lib/actions/setup-r@v2
+      if: matrix.os == 'ubuntu-latest' && matrix.language == 'r'
     - name: install deps
       run: python -mpip install -e . -r requirements-dev.txt


@@ -12,12 +12,12 @@ concurrency:
 jobs:
   main-windows:
-    uses: asottile/workflows/.github/workflows/tox.yml@v1.6.0
+    uses: asottile/workflows/.github/workflows/tox.yml@v1.8.1
     with:
-      env: '["py39"]'
+      env: '["py310"]'
       os: windows-latest
   main-linux:
-    uses: asottile/workflows/.github/workflows/tox.yml@v1.6.0
+    uses: asottile/workflows/.github/workflows/tox.yml@v1.8.1
     with:
-      env: '["py39", "py310", "py311", "py312"]'
+      env: '["py310", "py311", "py312", "py313"]'
       os: ubuntu-latest


@@ -1,6 +1,6 @@
 repos:
 - repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v4.6.0
+  rev: v6.0.0
   hooks:
   - id: trailing-whitespace
   - id: end-of-file-fixer
@@ -10,35 +10,35 @@ repos:
   - id: name-tests-test
   - id: requirements-txt-fixer
 - repo: https://github.com/asottile/setup-cfg-fmt
-  rev: v2.5.0
+  rev: v3.2.0
   hooks:
   - id: setup-cfg-fmt
 - repo: https://github.com/asottile/reorder-python-imports
-  rev: v3.12.0
+  rev: v3.16.0
   hooks:
   - id: reorder-python-imports
-    exclude: ^(pre_commit/resources/|testing/resources/python3_hooks_repo/)
-    args: [--py39-plus, --add-import, 'from __future__ import annotations']
+    exclude: ^pre_commit/resources/
+    args: [--py310-plus, --add-import, 'from __future__ import annotations']
 - repo: https://github.com/asottile/add-trailing-comma
-  rev: v3.1.0
+  rev: v4.0.0
   hooks:
   - id: add-trailing-comma
 - repo: https://github.com/asottile/pyupgrade
-  rev: v3.15.2
+  rev: v3.21.2
   hooks:
   - id: pyupgrade
-    args: [--py39-plus]
+    args: [--py310-plus]
 - repo: https://github.com/hhatto/autopep8
-  rev: v2.1.0
+  rev: v2.3.2
   hooks:
   - id: autopep8
 - repo: https://github.com/PyCQA/flake8
-  rev: 7.0.0
+  rev: 7.3.0
   hooks:
   - id: flake8
 - repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v1.10.0
+  rev: v1.19.1
   hooks:
   - id: mypy
-    additional_dependencies: [types-all]
+    additional_dependencies: [types-pyyaml]
     exclude: ^testing/resources/


@@ -1,3 +1,128 @@
4.5.1 - 2025-12-16
==================
### Fixes
- Fix `language: python` with `repo: local` without `additional_dependencies`.
- #3597 PR by @asottile.
4.5.0 - 2025-11-22
==================
### Features
- Add `pre-commit hazmat`.
- #3585 PR by @asottile.
4.4.0 - 2025-11-08
==================
### Features
- Add `--fail-fast` option to `pre-commit run`.
- #3528 PR by @JulianMaurin.
- Upgrade `ruby-build` / `rbenv`.
- #3566 PR by @asottile.
- #3565 issue by @MRigal.
- Add `language: unsupported` / `language: unsupported_script` as aliases
for `language: system` / `language: script` (which will eventually be
deprecated).
- #3577 PR by @asottile.
- Add support for docker-in-docker detection with cgroups v2.
- #3535 PR by @br-rhrbacek.
- #3360 issue by @JasonAlt.
### Fixes
- Handle when docker gives `SecurityOptions: null`.
- #3537 PR by @asottile.
- #3514 issue by @jenstroeger.
- Fix error context for invalid `stages` in `.pre-commit-config.yaml`.
- #3576 PR by @asottile.
4.3.0 - 2025-08-09
==================
### Features
- `language: docker` / `language: docker_image`: detect rootless docker.
- #3446 PR by @matthewhughes934.
- #1243 issue by @dkolepp.
- `language: julia`: avoid `startup.jl` when executing hooks.
- #3496 PR by @ericphanson.
- `language: dart`: support latest dart versions which require a higher sdk
lower bound.
- #3507 PR by @bc-lee.
4.2.0 - 2025-03-18
==================
### Features
- For `language: python` first attempt a versioned python executable for
the default language version before consulting a potentially unversioned
`sys.executable`.
- #3430 PR by @asottile.
### Fixes
- Handle error during conflict detection when a file is named "HEAD"
- #3425 PR by @tusharsadhwani.
4.1.0 - 2025-01-20
==================
### Features
- Add `language: julia`.
- #3348 PR by @fredrikekre.
- #2689 issue @jmuchovej.
### Fixes
- Disable automatic toolchain switching for `language: golang`.
- #3304 PR by @AleksaC.
- #3300 issue by @AleksaC.
- #3149 issue by @nijel.
- Fix `language: r` installation when initiated by RStudio.
- #3389 PR by @lorenzwalthert.
- #3385 issue by @lorenzwalthert.
4.0.1 - 2024-10-08
==================
### Fixes
- Fix `pre-commit migrate-config` for unquoted deprecated stages names with
purelib `pyyaml`.
- #3324 PR by @asottile.
- pre-commit-ci/issues#234 issue by @lorenzwalthert.
4.0.0 - 2024-10-05
==================
### Features
- Improve `pre-commit migrate-config` to handle more yaml formats.
- #3301 PR by @asottile.
- Handle `stages` deprecation in `pre-commit migrate-config`.
- #3302 PR by @asottile.
- #2732 issue by @asottile.
- Upgrade `ruby-build`.
- #3199 PR by @ThisGuyCodes.
- Add "sensible regex" warnings to `repo: meta`.
- #3311 PR by @asottile.
- Add warnings for deprecated `stages` (`commit` -> `pre-commit`, `push` ->
`pre-push`, `merge-commit` -> `pre-merge-commit`).
- #3312 PR by @asottile.
- #3313 PR by @asottile.
- #3315 PR by @asottile.
- #2732 issue by @asottile.
### Updating
- `language: python_venv` has been removed -- use `language: python` instead.
- #3320 PR by @asottile.
- #2734 issue by @asottile.
3.8.0 - 2024-07-28
==================
### Features
- Implement health checks for `language: r` so environments are recreated if
the system version of R changes.
- #3206 issue by @lorenzwalthert.
- #3265 PR by @lorenzwalthert.
3.7.1 - 2024-05-10
==================
@@ -72,7 +197,7 @@
- Use `time.monotonic()` for more accurate hook timing.
    - #3024 PR by @adamchainz.
-### Migrating
+### Updating
- Require npm 6.x+ for `language: node` hooks.
    - #2996 PR by @RoelAdriaans.
    - #1983 issue by @henryiii.


@@ -10,6 +10,7 @@ from pre_commit.languages import dotnet
 from pre_commit.languages import fail
 from pre_commit.languages import golang
 from pre_commit.languages import haskell
+from pre_commit.languages import julia
 from pre_commit.languages import lua
 from pre_commit.languages import node
 from pre_commit.languages import perl
@@ -18,9 +19,9 @@ from pre_commit.languages import python
 from pre_commit.languages import r
 from pre_commit.languages import ruby
 from pre_commit.languages import rust
-from pre_commit.languages import script
 from pre_commit.languages import swift
-from pre_commit.languages import system
+from pre_commit.languages import unsupported
+from pre_commit.languages import unsupported_script
 languages: dict[str, Language] = {
@@ -33,6 +34,7 @@ languages: dict[str, Language] = {
     'fail': fail,
     'golang': golang,
     'haskell': haskell,
+    'julia': julia,
     'lua': lua,
     'node': node,
     'perl': perl,
@@ -41,10 +43,8 @@ languages: dict[str, Language] = {
     'r': r,
     'ruby': ruby,
     'rust': rust,
-    'script': script,
     'swift': swift,
-    'system': system,
-    # TODO: fully deprecate `python_venv`
-    'python_venv': python,
+    'unsupported': unsupported,
+    'unsupported_script': unsupported_script,
 }
 language_names = sorted(languages)


@@ -2,9 +2,11 @@ from __future__ import annotations
 import functools
 import logging
+import os.path
 import re
 import shlex
 import sys
+from collections.abc import Callable
 from collections.abc import Sequence
 from typing import Any
 from typing import NamedTuple
@@ -70,6 +72,43 @@ def transform_stage(stage: str) -> str:
     return _STAGES.get(stage, stage)
MINIMAL_MANIFEST_SCHEMA = cfgv.Array(
cfgv.Map(
'Hook', 'id',
cfgv.Required('id', cfgv.check_string),
cfgv.Optional('stages', cfgv.check_array(cfgv.check_string), []),
),
)
def warn_for_stages_on_repo_init(repo: str, directory: str) -> None:
try:
manifest = cfgv.load_from_filename(
os.path.join(directory, C.MANIFEST_FILE),
schema=MINIMAL_MANIFEST_SCHEMA,
load_strategy=yaml_load,
exc_tp=InvalidManifestError,
)
except InvalidManifestError:
return # they'll get a better error message when it actually loads!
legacy_stages = {} # sorted set
for hook in manifest:
for stage in hook.get('stages', ()):
if stage in _STAGES:
legacy_stages[stage] = True
if legacy_stages:
logger.warning(
f'repo `{repo}` uses deprecated stage names '
f'({", ".join(legacy_stages)}) which will be removed in a '
f'future version. '
f'Hint: often `pre-commit autoupdate --repo {shlex.quote(repo)}` '
f'will fix this. '
f'if it does not -- consider reporting an issue to that repo.',
)
 class StagesMigrationNoDefault(NamedTuple):
     key: str
     default: Sequence[str]
@@ -78,6 +117,7 @@ class StagesMigrationNoDefault(NamedTuple):
         if self.key not in dct:
             return
+        with cfgv.validate_context(f'At key: {self.key}'):
             val = dct[self.key]
             cfgv.check_array(cfgv.check_any)(val)
@@ -99,6 +139,94 @@ class StagesMigration(StagesMigrationNoDefault):
         super().apply_default(dct)
class DeprecatedStagesWarning(NamedTuple):
key: str
def check(self, dct: dict[str, Any]) -> None:
if self.key not in dct:
return
val = dct[self.key]
cfgv.check_array(cfgv.check_any)(val)
legacy_stages = [stage for stage in val if stage in _STAGES]
if legacy_stages:
logger.warning(
f'hook id `{dct["id"]}` uses deprecated stage names '
f'({", ".join(legacy_stages)}) which will be removed in a '
f'future version. '
f'run: `pre-commit migrate-config` to automatically fix this.',
)
def apply_default(self, dct: dict[str, Any]) -> None:
pass
def remove_default(self, dct: dict[str, Any]) -> None:
raise NotImplementedError
class DeprecatedDefaultStagesWarning(NamedTuple):
key: str
def check(self, dct: dict[str, Any]) -> None:
if self.key not in dct:
return
val = dct[self.key]
cfgv.check_array(cfgv.check_any)(val)
legacy_stages = [stage for stage in val if stage in _STAGES]
if legacy_stages:
logger.warning(
f'top-level `default_stages` uses deprecated stage names '
f'({", ".join(legacy_stages)}) which will be removed in a '
f'future version. '
f'run: `pre-commit migrate-config` to automatically fix this.',
)
def apply_default(self, dct: dict[str, Any]) -> None:
pass
def remove_default(self, dct: dict[str, Any]) -> None:
raise NotImplementedError
def _translate_language(name: str) -> str:
return {
'system': 'unsupported',
'script': 'unsupported_script',
}.get(name, name)
class LanguageMigration(NamedTuple): # remove
key: str
check_fn: Callable[[object], None]
def check(self, dct: dict[str, Any]) -> None:
if self.key not in dct:
return
with cfgv.validate_context(f'At key: {self.key}'):
self.check_fn(_translate_language(dct[self.key]))
def apply_default(self, dct: dict[str, Any]) -> None:
if self.key not in dct:
return
dct[self.key] = _translate_language(dct[self.key])
def remove_default(self, dct: dict[str, Any]) -> None:
raise NotImplementedError
class LanguageMigrationRequired(LanguageMigration): # replace with Required
def check(self, dct: dict[str, Any]) -> None:
if self.key not in dct:
raise cfgv.ValidationError(f'Missing required key: {self.key}')
super().check(dct)
 MANIFEST_HOOK_DICT = cfgv.Map(
     'Hook', 'id',
@@ -112,7 +240,7 @@ MANIFEST_HOOK_DICT = cfgv.Map(
     cfgv.Required('id', cfgv.check_string),
     cfgv.Required('name', cfgv.check_string),
     cfgv.Required('entry', cfgv.check_string),
-    cfgv.Required('language', cfgv.check_one_of(language_names)),
+    LanguageMigrationRequired('language', cfgv.check_one_of(language_names)),
     cfgv.Optional('alias', cfgv.check_string, ''),
     cfgv.Optional('files', check_string_regex, ''),
@@ -142,10 +270,19 @@ class InvalidManifestError(FatalError):
     pass
def _load_manifest_forward_compat(contents: str) -> object:
obj = yaml_load(contents)
if isinstance(obj, dict):
check_min_version('5')
raise AssertionError('unreachable')
else:
return obj
 load_manifest = functools.partial(
     cfgv.load_from_filename,
     schema=MANIFEST_SCHEMA,
-    load_strategy=yaml_load,
+    load_strategy=_load_manifest_forward_compat,
     exc_tp=InvalidManifestError,
 )
@@ -267,12 +404,20 @@ class NotAllowed(cfgv.OptionalNoDefault):
         raise cfgv.ValidationError(f'{self.key!r} cannot be overridden')
_COMMON_HOOK_WARNINGS = (
OptionalSensibleRegexAtHook('files', cfgv.check_string),
OptionalSensibleRegexAtHook('exclude', cfgv.check_string),
DeprecatedStagesWarning('stages'),
)
 META_HOOK_DICT = cfgv.Map(
     'Hook', 'id',
     cfgv.Required('id', cfgv.check_string),
     cfgv.Required('id', cfgv.check_one_of(tuple(k for k, _ in _meta))),
-    # language must be system
-    cfgv.Optional('language', cfgv.check_one_of({'system'}), 'system'),
+    # language must be `unsupported`
+    cfgv.Optional(
+        'language', cfgv.check_one_of({'unsupported'}), 'unsupported',
+    ),
     # entry cannot be overridden
     NotAllowed('entry', cfgv.check_any),
     *(
@@ -289,6 +434,7 @@ META_HOOK_DICT = cfgv.Map(
         item
         for item in MANIFEST_HOOK_DICT.items
     ),
+    *_COMMON_HOOK_WARNINGS,
 )
 CONFIG_HOOK_DICT = cfgv.Map(
     'Hook', 'id',
@@ -304,18 +450,17 @@ CONFIG_HOOK_DICT = cfgv.Map(
         for item in MANIFEST_HOOK_DICT.items
         if item.key != 'id'
         if item.key != 'stages'
+        if item.key != 'language'  # remove
     ),
     StagesMigrationNoDefault('stages', []),
-    OptionalSensibleRegexAtHook('files', cfgv.check_string),
-    OptionalSensibleRegexAtHook('exclude', cfgv.check_string),
+    LanguageMigration('language', cfgv.check_one_of(language_names)),  # remove
+    *_COMMON_HOOK_WARNINGS,
 )
 LOCAL_HOOK_DICT = cfgv.Map(
     'Hook', 'id',
     *MANIFEST_HOOK_DICT.items,
-    OptionalSensibleRegexAtHook('files', cfgv.check_string),
-    OptionalSensibleRegexAtHook('exclude', cfgv.check_string),
+    *_COMMON_HOOK_WARNINGS,
 )
 CONFIG_REPO_DICT = cfgv.Map(
     'Repository', 'repo',
@@ -368,6 +513,7 @@ CONFIG_SCHEMA = cfgv.Map(
         'default_language_version', DEFAULT_LANGUAGE_VERSION, {},
     ),
     StagesMigration('default_stages', STAGES),
+    DeprecatedDefaultStagesWarning('default_stages'),
     cfgv.Optional('files', check_string_regex, ''),
     cfgv.Optional('exclude', check_string_regex, '^$'),
     cfgv.Optional('fail_fast', cfgv.check_bool, False),


@@ -12,6 +12,7 @@ from pre_commit.clientlib import load_manifest
 from pre_commit.clientlib import LOCAL
 from pre_commit.clientlib import META
 from pre_commit.store import Store
+from pre_commit.util import rmtree
 def _mark_used_repos(
@@ -26,7 +27,8 @@ def _mark_used_repos(
         for hook in repo['hooks']:
             deps = hook.get('additional_dependencies')
             unused_repos.discard((
-                store.db_repo_name(repo['repo'], deps), C.LOCAL_REPO_VERSION,
+                store.db_repo_name(repo['repo'], deps),
+                C.LOCAL_REPO_VERSION,
             ))
     else:
         key = (repo['repo'], repo['rev'])
@@ -56,17 +58,19 @@ def _mark_used_repos(
         ))
-def _gc_repos(store: Store) -> int:
-    configs = store.select_all_configs()
-    repos = store.select_all_repos()
-    # delete config paths which do not exist
-    dead_configs = [p for p in configs if not os.path.exists(p)]
-    live_configs = [p for p in configs if os.path.exists(p)]
+def _gc(store: Store) -> int:
+    with store.exclusive_lock(), store.connect() as db:
+        store._create_configs_table(db)
+        repos = db.execute('SELECT repo, ref, path FROM repos').fetchall()
         all_repos = {(repo, ref): path for repo, ref, path in repos}
         unused_repos = set(all_repos)
-    for config_path in live_configs:
+        configs_rows = db.execute('SELECT path FROM configs').fetchall()
+        configs = [path for path, in configs_rows]
+        dead_configs = []
+        for config_path in configs:
             try:
                 config = load_config(config_path)
             except InvalidConfigError:
@@ -76,14 +80,19 @@ def _gc_repos(store: Store) -> int:
             for repo in config['repos']:
                 _mark_used_repos(store, all_repos, unused_repos, repo)
-    store.delete_configs(dead_configs)
-    for db_repo_name, ref in unused_repos:
-        store.delete_repo(db_repo_name, ref, all_repos[(db_repo_name, ref)])
+        paths = [(path,) for path in dead_configs]
+        db.executemany('DELETE FROM configs WHERE path = ?', paths)
+        db.executemany(
+            'DELETE FROM repos WHERE repo = ? and ref = ?',
+            sorted(unused_repos),
+        )
+        for k in unused_repos:
+            rmtree(all_repos[k])
         return len(unused_repos)
 def gc(store: Store) -> int:
-    with store.exclusive_lock():
-        repos_removed = _gc_repos(store)
-        output.write_line(f'{repos_removed} repo(s) removed.')
+    output.write_line(f'{_gc(store)} repo(s) removed.')
     return 0
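The rewritten `_gc` above drops the `Store.select_all_*` / `delete_*` helpers and works the sqlite cache directly under one exclusive lock. A rough, self-contained sketch of the same mark-and-sweep pattern against an in-memory database (the repo URLs and paths are invented):

import sqlite3

with sqlite3.connect(':memory:') as db:
    db.execute('CREATE TABLE repos (repo TEXT, ref TEXT, path TEXT)')
    db.executemany(
        'INSERT INTO repos VALUES (?, ?, ?)',
        [
            ('https://example.com/a', 'v1', '/cache/repo-a'),
            ('https://example.com/b', 'v2', '/cache/repo-b'),
        ],
    )
    repos = db.execute('SELECT repo, ref, path FROM repos').fetchall()
    all_repos = {(repo, ref): path for repo, ref, path in repos}
    unused_repos = set(all_repos)
    # pretend a loaded config still references repo "a" at v1
    unused_repos.discard(('https://example.com/a', 'v1'))
    db.executemany(
        'DELETE FROM repos WHERE repo = ? and ref = ?',
        sorted(unused_repos),
    )
    # the real implementation also rmtree()s all_repos[k] for every deleted key
    print(db.execute('SELECT repo, ref FROM repos').fetchall())
    # -> [('https://example.com/a', 'v1')]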

View file

@ -0,0 +1,95 @@
from __future__ import annotations
import argparse
import subprocess
from collections.abc import Sequence
from pre_commit.parse_shebang import normalize_cmd
def add_parsers(parser: argparse.ArgumentParser) -> None:
subparsers = parser.add_subparsers(dest='tool')
cd_parser = subparsers.add_parser(
'cd', help='cd to a subdir and run the command',
)
cd_parser.add_argument('subdir')
cd_parser.add_argument('cmd', nargs=argparse.REMAINDER)
ignore_exit_code_parser = subparsers.add_parser(
'ignore-exit-code', help='run the command but ignore the exit code',
)
ignore_exit_code_parser.add_argument('cmd', nargs=argparse.REMAINDER)
n1_parser = subparsers.add_parser(
'n1', help='run the command once per filename',
)
n1_parser.add_argument('cmd', nargs=argparse.REMAINDER)
def _cmd_filenames(cmd: tuple[str, ...]) -> tuple[
tuple[str, ...],
tuple[str, ...],
]:
for idx, val in enumerate(reversed(cmd)):
if val == '--':
split = len(cmd) - idx
break
else:
raise SystemExit('hazmat entry must end with `--`')
return cmd[:split - 1], cmd[split:]
def cd(subdir: str, cmd: tuple[str, ...]) -> int:
cmd, filenames = _cmd_filenames(cmd)
prefix = f'{subdir}/'
new_filenames = []
for filename in filenames:
if not filename.startswith(prefix):
raise SystemExit(f'unexpected file without {prefix=}: {filename}')
else:
new_filenames.append(filename.removeprefix(prefix))
cmd = normalize_cmd(cmd)
return subprocess.call((*cmd, *new_filenames), cwd=subdir)
def ignore_exit_code(cmd: tuple[str, ...]) -> int:
cmd = normalize_cmd(cmd)
subprocess.call(cmd)
return 0
def n1(cmd: tuple[str, ...]) -> int:
cmd, filenames = _cmd_filenames(cmd)
cmd = normalize_cmd(cmd)
ret = 0
for filename in filenames:
ret |= subprocess.call((*cmd, filename))
return ret
def impl(args: argparse.Namespace) -> int:
args.cmd = tuple(args.cmd)
if args.tool == 'cd':
return cd(args.subdir, args.cmd)
elif args.tool == 'ignore-exit-code':
return ignore_exit_code(args.cmd)
elif args.tool == 'n1':
return n1(args.cmd)
else:
raise NotImplementedError(f'unexpected tool: {args.tool}')
def main(argv: Sequence[str] | None = None) -> int:
parser = argparse.ArgumentParser()
add_parsers(parser)
args = parser.parse_args(argv)
return impl(args)
if __name__ == '__main__':
raise SystemExit(main())
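The hazmat tools are built to be composed in a hook's `entry`, which must end with `--` so the filenames pre-commit appends can be split back off. A rough sketch of driving the module directly (assumes a POSIX system with `echo` and `false` on PATH; the filenames are made up):

from pre_commit.commands import hazmat

# roughly what an entry of `pre-commit hazmat n1 echo --` does once pre-commit
# has appended two filenames: run `echo` once per file, OR-ing the exit codes
ret = hazmat.main(['n1', 'echo', '--', 'a.txt', 'b.txt'])

# roughly `pre-commit hazmat ignore-exit-code false --`: the command runs,
# its non-zero exit code is swallowed and 0 is returned
ret |= hazmat.main(['ignore-exit-code', 'false', '--'])
print(ret)  # 0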

View file

@ -106,6 +106,7 @@ def _ns(
hook=None, hook=None,
verbose=False, verbose=False,
show_diff_on_failure=False, show_diff_on_failure=False,
fail_fast=False,
) )

View file

@@ -1,13 +1,21 @@
 from __future__ import annotations
-import re
+import functools
+import itertools
 import textwrap
+from collections.abc import Callable
 import cfgv
 import yaml
+from yaml.nodes import ScalarNode
 from pre_commit.clientlib import InvalidConfigError
+from pre_commit.yaml import yaml_compose
 from pre_commit.yaml import yaml_load
+from pre_commit.yaml_rewrite import MappingKey
+from pre_commit.yaml_rewrite import MappingValue
+from pre_commit.yaml_rewrite import match
+from pre_commit.yaml_rewrite import SequenceItem
 def _is_header_line(line: str) -> bool:
@@ -38,16 +46,69 @@ def _migrate_map(contents: str) -> str:
     return contents
-def _migrate_sha_to_rev(contents: str) -> str:
-    return re.sub(r'(\n\s+)sha:', r'\1rev:', contents)
-def _migrate_python_venv(contents: str) -> str:
-    return re.sub(
-        r'(\n\s+)language: python_venv\b',
-        r'\1language: python',
-        contents,
-    )
+def _preserve_style(n: ScalarNode, *, s: str) -> str:
+    style = n.style or ''
+    return f'{style}{s}{style}'
+def _fix_stage(n: ScalarNode) -> str:
+    return _preserve_style(n, s=f'pre-{n.value}')
+def _migrate_composed(contents: str) -> str:
+    tree = yaml_compose(contents)
+    rewrites: list[tuple[ScalarNode, Callable[[ScalarNode], str]]] = []
+    # sha -> rev
+    sha_to_rev_replace = functools.partial(_preserve_style, s='rev')
+    sha_to_rev_matcher = (
+        MappingValue('repos'),
+        SequenceItem(),
+        MappingKey('sha'),
+    )
+    for node in match(tree, sha_to_rev_matcher):
+        rewrites.append((node, sha_to_rev_replace))
+    # python_venv -> python
+    language_matcher = (
+        MappingValue('repos'),
+        SequenceItem(),
+        MappingValue('hooks'),
+        SequenceItem(),
+        MappingValue('language'),
+    )
+    python_venv_replace = functools.partial(_preserve_style, s='python')
+    for node in match(tree, language_matcher):
+        if node.value == 'python_venv':
+            rewrites.append((node, python_venv_replace))
+    # stages rewrites
+    default_stages_matcher = (MappingValue('default_stages'), SequenceItem())
+    default_stages_match = match(tree, default_stages_matcher)
+    hook_stages_matcher = (
+        MappingValue('repos'),
+        SequenceItem(),
+        MappingValue('hooks'),
+        SequenceItem(),
+        MappingValue('stages'),
+        SequenceItem(),
+    )
+    hook_stages_match = match(tree, hook_stages_matcher)
+    for node in itertools.chain(default_stages_match, hook_stages_match):
+        if node.value in {'commit', 'push', 'merge-commit'}:
+            rewrites.append((node, _fix_stage))
+    rewrites.sort(reverse=True, key=lambda nf: nf[0].start_mark.index)
+    src_parts = []
+    end: int | None = None
+    for node, func in rewrites:
+        src_parts.append(contents[node.end_mark.index:end])
+        src_parts.append(func(node))
+        end = node.start_mark.index
+    src_parts.append(contents[:end])
+    src_parts.reverse()
+    return ''.join(src_parts)
 def migrate_config(config_file: str, quiet: bool = False) -> int:
@@ -62,8 +123,7 @@ def migrate_config(config_file: str, quiet: bool = False) -> int:
         raise cfgv.ValidationError(str(e))
     contents = _migrate_map(contents)
-    contents = _migrate_sha_to_rev(contents)
-    contents = _migrate_python_venv(contents)
+    contents = _migrate_composed(contents)
     if contents != orig_contents:
         with open(config_file, 'w') as f:
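Unlike the old regex substitutions, `_migrate_composed` rewrites the original text by character offset, so comments, quoting style and layout survive. Sorting the planned rewrites by descending `start_mark.index` and splicing from the back keeps earlier offsets valid. A toy version of that splice with plain integer offsets instead of yaml marks (the sample string and spans are made up):

contents = 'sha: v1\nstages: [commit, push]\n'
# (start, end, replacement) spans, standing in for node.start_mark.index /
# node.end_mark.index and the _preserve_style replacement text
rewrites = [(0, 3, 'rev'), (17, 23, 'pre-commit'), (25, 29, 'pre-push')]

src_parts = []
end = None
for start, stop, new in sorted(rewrites, reverse=True):
    src_parts.append(contents[stop:end])
    src_parts.append(new)
    end = start
src_parts.append(contents[:end])
src_parts.reverse()
print(''.join(src_parts))  # rev: v1\nstages: [pre-commit, pre-push]\n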

View file

@@ -61,7 +61,7 @@ def filter_by_include_exclude(
     names: Iterable[str],
     include: str,
     exclude: str,
-) -> Generator[str, None, None]:
+) -> Generator[str]:
     include_re, exclude_re = re.compile(include), re.compile(exclude)
     return (
         filename for filename in names
@@ -84,7 +84,7 @@ class Classifier:
         types: Iterable[str],
         types_or: Iterable[str],
         exclude_types: Iterable[str],
-    ) -> Generator[str, None, None]:
+    ) -> Generator[str]:
         types = frozenset(types)
         types_or = frozenset(types_or)
         exclude_types = frozenset(exclude_types)
@@ -97,7 +97,7 @@ class Classifier:
         ):
             yield filename
-    def filenames_for_hook(self, hook: Hook) -> Generator[str, None, None]:
+    def filenames_for_hook(self, hook: Hook) -> Generator[str]:
         return self.by_types(
             filter_by_include_exclude(
                 self.filenames,
@@ -298,7 +298,8 @@ def _run_hooks(
             verbose=args.verbose, use_color=args.color,
         )
         retval |= current_retval
-        if current_retval and (config['fail_fast'] or hook.fail_fast):
+        fail_fast = (config['fail_fast'] or hook.fail_fast or args.fail_fast)
+        if current_retval and fail_fast:
             break
     if retval and args.show_diff_on_failure and prior_diff:
         if args.all_files:

View file

@@ -33,7 +33,7 @@ def format_env(parts: SubstitutionT, env: MutableMapping[str, str]) -> str:
 def envcontext(
     patch: PatchesT,
     _env: MutableMapping[str, str] | None = None,
-) -> Generator[None, None, None]:
+) -> Generator[None]:
     """In this context, `os.environ` is modified according to `patch`.
     `patch` is an iterable of 2-tuples (key, value):
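This signature change, and the many like it in the files below, drops the explicit send/return parameters: with type-parameter defaults (PEP 696), `collections.abc.Generator`'s send and return types default to `None`, so current type checkers read `Generator[None]` as `Generator[None, None, None]`; under `from __future__ import annotations` the shorter form also costs nothing at runtime. A small sketch of the two equivalent spellings (names are illustrative):

from __future__ import annotations

import contextlib
from collections.abc import Generator


@contextlib.contextmanager
def old_style() -> Generator[None, None, None]:
    yield


@contextlib.contextmanager
def new_style() -> Generator[None]:  # same meaning, send/return default to None
    yield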

View file

@ -68,7 +68,7 @@ def _log_and_exit(
@contextlib.contextmanager @contextlib.contextmanager
def error_handler() -> Generator[None, None, None]: def error_handler() -> Generator[None]:
try: try:
yield yield
except (Exception, KeyboardInterrupt) as e: except (Exception, KeyboardInterrupt) as e:

View file

@ -3,8 +3,8 @@ from __future__ import annotations
import contextlib import contextlib
import errno import errno
import sys import sys
from collections.abc import Callable
from collections.abc import Generator from collections.abc import Generator
from typing import Callable
if sys.platform == 'win32': # pragma: no cover (windows) if sys.platform == 'win32': # pragma: no cover (windows)
@ -20,7 +20,7 @@ if sys.platform == 'win32': # pragma: no cover (windows)
def _locked( def _locked(
fileno: int, fileno: int,
blocked_cb: Callable[[], None], blocked_cb: Callable[[], None],
) -> Generator[None, None, None]: ) -> Generator[None]:
try: try:
msvcrt.locking(fileno, msvcrt.LK_NBLCK, _region) msvcrt.locking(fileno, msvcrt.LK_NBLCK, _region)
except OSError: except OSError:
@ -53,7 +53,7 @@ else: # pragma: win32 no cover
def _locked( def _locked(
fileno: int, fileno: int,
blocked_cb: Callable[[], None], blocked_cb: Callable[[], None],
) -> Generator[None, None, None]: ) -> Generator[None]:
try: try:
fcntl.flock(fileno, fcntl.LOCK_EX | fcntl.LOCK_NB) fcntl.flock(fileno, fcntl.LOCK_EX | fcntl.LOCK_NB)
except OSError: # pragma: no cover (tests are single-threaded) except OSError: # pragma: no cover (tests are single-threaded)
@ -69,7 +69,7 @@ else: # pragma: win32 no cover
def lock( def lock(
path: str, path: str,
blocked_cb: Callable[[], None], blocked_cb: Callable[[], None],
) -> Generator[None, None, None]: ) -> Generator[None]:
with open(path, 'a+') as f: with open(path, 'a+') as f:
with _locked(f.fileno(), blocked_cb): with _locked(f.fileno(), blocked_cb):
yield yield

View file

@ -126,7 +126,7 @@ def get_conflicted_files() -> set[str]:
merge_diff_filenames = zsplit( merge_diff_filenames = zsplit(
cmd_output( cmd_output(
'git', 'diff', '--name-only', '--no-ext-diff', '-z', 'git', 'diff', '--name-only', '--no-ext-diff', '-z',
-            '-m', tree_hash, 'HEAD', 'MERGE_HEAD',
+            '-m', tree_hash, 'HEAD', 'MERGE_HEAD', '--',
)[1], )[1],
) )
return set(merge_conflict_filenames) | set(merge_diff_filenames) return set(merge_conflict_filenames) | set(merge_diff_filenames)
@ -219,7 +219,7 @@ def check_for_cygwin_mismatch() -> None:
if is_cygwin_python ^ is_cygwin_git: if is_cygwin_python ^ is_cygwin_git:
exe_type = {True: '(cygwin)', False: '(windows)'} exe_type = {True: '(cygwin)', False: '(windows)'}
-        logger.warn(
+        logger.warning(
f'pre-commit has detected a mix of cygwin python / git\n' f'pre-commit has detected a mix of cygwin python / git\n'
f'This combination is not supported, it is likely you will ' f'This combination is not supported, it is likely you will '
f'receive an error later in the program.\n' f'receive an error later in the program.\n'

View file

@@ -5,6 +5,7 @@ import os
 import random
 import re
 import shlex
+import sys
 from collections.abc import Generator
 from collections.abc import Sequence
 from typing import Any
@@ -127,7 +128,7 @@ def no_install(
 @contextlib.contextmanager
-def no_env(prefix: Prefix, version: str) -> Generator[None, None, None]:
+def no_env(prefix: Prefix, version: str) -> Generator[None]:
     yield
@@ -171,7 +172,10 @@ def run_xargs(
 def hook_cmd(entry: str, args: Sequence[str]) -> tuple[str, ...]:
-    return (*shlex.split(entry), *args)
+    cmd = shlex.split(entry)
+    if cmd[:2] == ['pre-commit', 'hazmat']:
+        cmd = [sys.executable, '-m', 'pre_commit.commands.hazmat', *cmd[2:]]
+    return (*cmd, *args)
 def basic_run_hook(
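With the `hook_cmd` change above, an entry that begins with `pre-commit hazmat` is rewritten to run the hazmat module with the interpreter pre-commit itself is running under, rather than whatever `pre-commit` happens to be first on PATH. A rough illustration (the entry strings and filename are invented):

from pre_commit.lang_base import hook_cmd

print(hook_cmd('flake8 --max-line-length=100', ('a.py',)))
# ('flake8', '--max-line-length=100', 'a.py')

print(hook_cmd('pre-commit hazmat n1 flake8 --', ('a.py',)))
# (sys.executable, '-m', 'pre_commit.commands.hazmat', 'n1', 'flake8', '--', 'a.py')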

View file

@ -41,7 +41,7 @@ def get_env_patch(env: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -70,7 +70,7 @@ def get_env_patch(target_dir: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -29,7 +29,7 @@ def get_env_patch(venv: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@@ -1,8 +1,11 @@
 from __future__ import annotations
+import contextlib
+import functools
 import hashlib
 import json
 import os
+import re
 from collections.abc import Sequence
 from pre_commit import lang_base
@@ -16,31 +19,33 @@ get_default_version = lang_base.basic_get_default_version
 health_check = lang_base.basic_health_check
 in_env = lang_base.no_env  # no special environment for docker
-def _is_in_docker() -> bool:
-    try:
-        with open('/proc/1/cgroup', 'rb') as f:
-            return b'docker' in f.read()
-    except FileNotFoundError:
-        return False
-def _get_container_id() -> str:
-    # It's assumed that we already check /proc/1/cgroup in _is_in_docker. The
-    # cpuset cgroup controller existed since cgroups were introduced so this
-    # way of getting the container ID is pretty reliable.
-    with open('/proc/1/cgroup', 'rb') as f:
-        for line in f.readlines():
-            if line.split(b':')[1] == b'cpuset':
-                return os.path.basename(line.split(b':')[2]).strip().decode()
-    raise RuntimeError('Failed to find the container ID in /proc/1/cgroup.')
+_HOSTNAME_MOUNT_RE = re.compile(
+    rb"""
+    /containers
+    (?:/overlay-containers)?
+    /([a-z0-9]{64})
+    (?:/userdata)?
+    /hostname
+    """,
+    re.VERBOSE,
+)
+def _get_container_id() -> str | None:
+    with contextlib.suppress(FileNotFoundError):
+        with open('/proc/1/mountinfo', 'rb') as f:
+            for line in f:
+                m = _HOSTNAME_MOUNT_RE.search(line)
+                if m:
+                    return m[1].decode()
+    return None
 def _get_docker_path(path: str) -> str:
-    if not _is_in_docker():
-        return path
     container_id = _get_container_id()
+    if container_id is None:
+        return path
     try:
         _, out, _ = cmd_output_b('docker', 'inspect', container_id)
@@ -101,7 +106,32 @@ def install_environment(
         os.mkdir(directory)
+@functools.lru_cache(maxsize=1)
+def _is_rootless() -> bool:  # pragma: win32 no cover
+    retcode, out, _ = cmd_output_b(
+        'docker', 'system', 'info', '--format', '{{ json . }}',
+    )
+    if retcode != 0:
+        return False
+    info = json.loads(out)
+    try:
+        return (
+            # docker:
+            # https://docs.docker.com/reference/api/engine/version/v1.48/#tag/System/operation/SystemInfo
+            'name=rootless' in (info.get('SecurityOptions') or ()) or
+            # podman:
+            # https://docs.podman.io/en/latest/_static/api.html?version=v5.4#tag/system/operation/SystemInfoLibpod
+            info['host']['security']['rootless']
+        )
+    except KeyError:
+        return False
 def get_docker_user() -> tuple[str, ...]:  # pragma: win32 no cover
+    if _is_rootless():
+        return ()
     try:
         return ('-u', f'{os.getuid()}:{os.getgid()}')
     except AttributeError:
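Container detection above now keys off the container's `hostname` bind-mount in `/proc/1/mountinfo`, and the regex accepts both docker's `containers/<id>/hostname` and podman's `overlay-containers/<id>/userdata/hostname` layouts. A sketch of the match against a fabricated mountinfo line (every field in the sample line is made up):

import re

_HOSTNAME_MOUNT_RE = re.compile(
    rb"""
    /containers
    (?:/overlay-containers)?
    /([a-z0-9]{64})
    (?:/userdata)?
    /hostname
    """,
    re.VERBOSE,
)

line = (
    b'533 530 254:1 /var/lib/docker/containers/' + b'a' * 64 +
    b'/hostname /etc/hostname rw,relatime - ext4 /dev/vda1 rw\n'
)
m = _HOSTNAME_MOUNT_RE.search(line)
assert m is not None
print(m[1].decode())  # the 64-character container id ('aaaa...')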

View file

@ -30,14 +30,14 @@ def get_env_patch(venv: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield
@contextlib.contextmanager @contextlib.contextmanager
def _nuget_config_no_sources() -> Generator[str, None, None]: def _nuget_config_no_sources() -> Generator[str]:
with tempfile.TemporaryDirectory() as tmpdir: with tempfile.TemporaryDirectory() as tmpdir:
nuget_config = os.path.join(tmpdir, 'nuget.config') nuget_config = os.path.join(tmpdir, 'nuget.config')
with open(nuget_config, 'w') as f: with open(nuget_config, 'w') as f:

View file

@ -75,6 +75,7 @@ def get_env_patch(venv: str, version: str) -> PatchesT:
return ( return (
('GOROOT', os.path.join(venv, '.go')), ('GOROOT', os.path.join(venv, '.go')),
('GOTOOLCHAIN', 'local'),
( (
'PATH', ( 'PATH', (
os.path.join(venv, 'bin'), os.pathsep, os.path.join(venv, 'bin'), os.pathsep,
@ -89,8 +90,7 @@ def _infer_go_version(version: str) -> str:
if version != C.DEFAULT: if version != C.DEFAULT:
return version return version
     resp = urllib.request.urlopen('https://go.dev/dl/?mode=json')
-    # TODO: 3.9+ .removeprefix('go')
-    return json.load(resp)[0]['version'][2:]
+    return json.load(resp)[0]['version'].removeprefix('go')
def _get_url(version: str) -> str: def _get_url(version: str) -> str:
@ -121,7 +121,7 @@ def _install_go(version: str, dest: str) -> None:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir, version)): with envcontext(get_env_patch(envdir, version)):
yield yield
@ -145,6 +145,7 @@ def install_environment(
env = no_git_env(dict(os.environ, GOPATH=gopath)) env = no_git_env(dict(os.environ, GOPATH=gopath))
env.pop('GOBIN', None) env.pop('GOBIN', None)
if version != 'system': if version != 'system':
env['GOTOOLCHAIN'] = 'local'
env['GOROOT'] = os.path.join(env_dir, '.go') env['GOROOT'] = os.path.join(env_dir, '.go')
env['PATH'] = os.pathsep.join(( env['PATH'] = os.pathsep.join((
os.path.join(env_dir, '.go', 'bin'), os.environ['PATH'], os.path.join(env_dir, '.go', 'bin'), os.environ['PATH'],

View file

@ -24,7 +24,7 @@ def get_env_patch(target_dir: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -0,0 +1,133 @@
from __future__ import annotations
import contextlib
import os
import shutil
from collections.abc import Generator
from collections.abc import Sequence
from pre_commit import lang_base
from pre_commit.envcontext import envcontext
from pre_commit.envcontext import PatchesT
from pre_commit.envcontext import UNSET
from pre_commit.prefix import Prefix
from pre_commit.util import cmd_output_b
ENVIRONMENT_DIR = 'juliaenv'
health_check = lang_base.basic_health_check
get_default_version = lang_base.basic_get_default_version
def run_hook(
prefix: Prefix,
entry: str,
args: Sequence[str],
file_args: Sequence[str],
*,
is_local: bool,
require_serial: bool,
color: bool,
) -> tuple[int, bytes]:
# `entry` is a (hook-repo relative) file followed by (optional) args, e.g.
# `bin/id.jl` or `bin/hook.jl --arg1 --arg2` so we
# 1) shell parse it and join with args with hook_cmd
# 2) prepend the hooks prefix path to the first argument (the file), unless
# it is a local script
# 3) prepend `julia` as the interpreter
cmd = lang_base.hook_cmd(entry, args)
script = cmd[0] if is_local else prefix.path(cmd[0])
cmd = ('julia', '--startup-file=no', script, *cmd[1:])
return lang_base.run_xargs(
cmd,
file_args,
require_serial=require_serial,
color=color,
)
def get_env_patch(target_dir: str, version: str) -> PatchesT:
return (
('JULIA_LOAD_PATH', target_dir),
# May be set, remove it to not interfere with LOAD_PATH
('JULIA_PROJECT', UNSET),
)
@contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir, version)):
yield
def install_environment(
prefix: Prefix,
version: str,
additional_dependencies: Sequence[str],
) -> None:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with in_env(prefix, version):
# TODO: Support language_version with juliaup similar to rust via
# rustup
# if version != 'system':
# ...
# Copy Project.toml to hook env if it exists
os.makedirs(envdir, exist_ok=True)
project_names = ('JuliaProject.toml', 'Project.toml')
project_found = False
for project_name in project_names:
project_file = prefix.path(project_name)
if not os.path.isfile(project_file):
continue
shutil.copy(project_file, envdir)
project_found = True
break
# If no project file was found we create an empty one so that the
# package manager doesn't error
if not project_found:
open(os.path.join(envdir, 'Project.toml'), 'a').close()
# Copy Manifest.toml to hook env if it exists
manifest_names = ('JuliaManifest.toml', 'Manifest.toml')
for manifest_name in manifest_names:
manifest_file = prefix.path(manifest_name)
if not os.path.isfile(manifest_file):
continue
shutil.copy(manifest_file, envdir)
break
# Julia code to instantiate the hook environment
julia_code = """
@assert length(ARGS) > 0
hook_env = ARGS[1]
deps = join(ARGS[2:end], " ")
# We prepend @stdlib here so that we can load the package manager even
# though `get_env_patch` limits `JULIA_LOAD_PATH` to just the hook env.
pushfirst!(LOAD_PATH, "@stdlib")
using Pkg
popfirst!(LOAD_PATH)
# Instantiate the environment shipped with the hook repo. If we have
# additional dependencies we disable precompilation in this step to
# avoid double work.
precompile = isempty(deps) ? "1" : "0"
withenv("JULIA_PKG_PRECOMPILE_AUTO" => precompile) do
Pkg.instantiate()
end
# Add additional dependencies (with precompilation)
if !isempty(deps)
withenv("JULIA_PKG_PRECOMPILE_AUTO" => "1") do
Pkg.REPLMode.pkgstr("add " * deps)
end
end
"""
cmd_output_b(
'julia', '--startup-file=no', '-e', julia_code, '--', envdir,
*additional_dependencies,
cwd=prefix.prefix_dir,
)
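For the new Julia language, `in_env` points `JULIA_LOAD_PATH` at the hook's environment directory and clears any ambient `JULIA_PROJECT` so it cannot shadow it. A small sketch of what that environment patch does, using `envcontext` directly (the directory path is invented):

import os

from pre_commit.envcontext import envcontext
from pre_commit.envcontext import UNSET

patches = (
    ('JULIA_LOAD_PATH', '/cache/repo/juliaenv-default'),
    ('JULIA_PROJECT', UNSET),
)
with envcontext(patches):
    print(os.environ['JULIA_LOAD_PATH'])   # /cache/repo/juliaenv-default
    print('JULIA_PROJECT' in os.environ)   # False -- removed inside the context
# both variables are restored to their previous state on exit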

View file

@ -44,7 +44,7 @@ def get_env_patch(d: str) -> PatchesT: # pragma: win32 no cover
@contextlib.contextmanager # pragma: win32 no cover @contextlib.contextmanager # pragma: win32 no cover
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -59,7 +59,7 @@ def get_env_patch(venv: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -33,7 +33,7 @@ def get_env_patch(venv: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@@ -75,6 +75,13 @@ def _find_by_py_launcher(
     return None
+def _impl_exe_name() -> str:
+    if sys.implementation.name == 'cpython':  # pragma: cpython cover
+        return 'python'
+    else:  # pragma: cpython no cover
+        return sys.implementation.name  # pypy mostly
 def _find_by_sys_executable() -> str | None:
     def _norm(path: str) -> str | None:
         _, exe = os.path.split(path.lower())
@@ -100,16 +107,23 @@ def _find_by_sys_executable() -> str | None:
 @functools.lru_cache(maxsize=1)
 def get_default_version() -> str:  # pragma: no cover (platform dependent)
-    # First attempt from `sys.executable` (or the realpath)
-    exe = _find_by_sys_executable()
-    if exe:
-        return exe
-    # Next try the `pythonX.X` executable
-    exe = f'python{sys.version_info[0]}.{sys.version_info[1]}'
+    v_major = f'{sys.version_info[0]}'
+    v_minor = f'{sys.version_info[0]}.{sys.version_info[1]}'
+    # attempt the likely implementation exe
+    for potential in (v_minor, v_major):
+        exe = f'{_impl_exe_name()}{potential}'
         if find_executable(exe):
             return exe
+    # next try `sys.executable` (or the realpath)
+    maybe_exe = _find_by_sys_executable()
+    if maybe_exe:
+        return maybe_exe
+    # maybe on windows we can find it via py launcher?
+    if sys.platform == 'win32':  # pragma: win32 cover
+        exe = f'python{v_minor}'
         if _find_by_py_launcher(exe):
             return exe
@@ -152,7 +166,7 @@ def norm_version(version: str) -> str | None:
 @contextlib.contextmanager
-def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]:
+def in_env(prefix: Prefix, version: str) -> Generator[None]:
     envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
     with envcontext(get_env_patch(envdir)):
         yield
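The probe above now tries an implementation-specific executable name first, then `sys.executable`, then (on Windows) the py launcher. A tiny sketch of the first-choice candidate names it looks for (output depends on the running interpreter):

import sys


def _impl_exe_name() -> str:
    if sys.implementation.name == 'cpython':
        return 'python'
    else:
        return sys.implementation.name  # pypy mostly


v_major = f'{sys.version_info[0]}'
v_minor = f'{sys.version_info[0]}.{sys.version_info[1]}'
for potential in (v_minor, v_major):
    # e.g. python3.12 then python3 on CPython; pypy3.10 then pypy3 on PyPy
    print(f'{_impl_exe_name()}{potential}')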

View file

@ -14,17 +14,101 @@ from pre_commit.envcontext import envcontext
from pre_commit.envcontext import PatchesT from pre_commit.envcontext import PatchesT
from pre_commit.envcontext import UNSET from pre_commit.envcontext import UNSET
from pre_commit.prefix import Prefix from pre_commit.prefix import Prefix
-from pre_commit.util import cmd_output_b
+from pre_commit.util import cmd_output
from pre_commit.util import win_exe from pre_commit.util import win_exe
ENVIRONMENT_DIR = 'renv' ENVIRONMENT_DIR = 'renv'
RSCRIPT_OPTS = ('--no-save', '--no-restore', '--no-site-file', '--no-environ')
get_default_version = lang_base.basic_get_default_version get_default_version = lang_base.basic_get_default_version
health_check = lang_base.basic_health_check
_RENV_ACTIVATED_OPTS = (
'--no-save', '--no-restore', '--no-site-file', '--no-environ',
)
def _execute_r(
code: str, *,
prefix: Prefix, version: str, args: Sequence[str] = (), cwd: str,
cli_opts: Sequence[str],
) -> str:
with in_env(prefix, version), _r_code_in_tempfile(code) as f:
_, out, _ = cmd_output(
_rscript_exec(), *cli_opts, f, *args, cwd=cwd,
)
return out.rstrip('\n')
def _execute_r_in_renv(
code: str, *,
prefix: Prefix, version: str, args: Sequence[str] = (), cwd: str,
) -> str:
return _execute_r(
code=code, prefix=prefix, version=version, args=args, cwd=cwd,
cli_opts=_RENV_ACTIVATED_OPTS,
)
def _execute_vanilla_r(
code: str, *,
prefix: Prefix, version: str, args: Sequence[str] = (), cwd: str,
) -> str:
return _execute_r(
code=code, prefix=prefix, version=version, args=args, cwd=cwd,
cli_opts=('--vanilla',),
)
def _read_installed_version(envdir: str, prefix: Prefix, version: str) -> str:
return _execute_r_in_renv(
'cat(renv::settings$r.version())',
prefix=prefix, version=version,
cwd=envdir,
)
def _read_executable_version(envdir: str, prefix: Prefix, version: str) -> str:
return _execute_r_in_renv(
'cat(as.character(getRversion()))',
prefix=prefix, version=version,
cwd=envdir,
)
def _write_current_r_version(
envdir: str, prefix: Prefix, version: str,
) -> None:
_execute_r_in_renv(
'renv::settings$r.version(as.character(getRversion()))',
prefix=prefix, version=version,
cwd=envdir,
)
def health_check(prefix: Prefix, version: str) -> str | None:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
r_version_installation = _read_installed_version(
envdir=envdir, prefix=prefix, version=version,
)
r_version_current_executable = _read_executable_version(
envdir=envdir, prefix=prefix, version=version,
)
if r_version_installation in {'NULL', ''}:
return (
f'Hooks were installed with an unknown R version. R version for '
f'hook repo now set to {r_version_current_executable}'
)
elif r_version_installation != r_version_current_executable:
return (
f'Hooks were installed for R version {r_version_installation}, '
f'but current R executable has version '
f'{r_version_current_executable}'
)
return None
@contextlib.contextmanager @contextlib.contextmanager
def _r_code_in_tempfile(code: str) -> Generator[str, None, None]: def _r_code_in_tempfile(code: str) -> Generator[str]:
""" """
To avoid quoting and escaping issues, avoid `Rscript [options] -e {expr}` To avoid quoting and escaping issues, avoid `Rscript [options] -e {expr}`
but use `Rscript [options] path/to/file_with_expr.R` but use `Rscript [options] path/to/file_with_expr.R`
@ -44,7 +128,7 @@ def get_env_patch(venv: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield
@ -100,7 +184,7 @@ def _cmd_from_hook(
_entry_validate(cmd) _entry_validate(cmd)
cmd_part = _prefix_if_file_entry(cmd, prefix, is_local=is_local) cmd_part = _prefix_if_file_entry(cmd, prefix, is_local=is_local)
-    return (cmd[0], *RSCRIPT_OPTS, *cmd_part, *args)
+    return (cmd[0], *_RENV_ACTIVATED_OPTS, *cmd_part, *args)
def install_environment( def install_environment(
@ -143,18 +227,17 @@ def install_environment(
renv::install(prefix_dir) renv::install(prefix_dir)
}} }}
""" """
_execute_vanilla_r(
r_code_inst_environment,
prefix=prefix, version=version, cwd=env_dir,
)
with _r_code_in_tempfile(r_code_inst_environment) as f: _write_current_r_version(envdir=env_dir, prefix=prefix, version=version)
cmd_output_b(_rscript_exec(), '--vanilla', f, cwd=env_dir)
if additional_dependencies: if additional_dependencies:
r_code_inst_add = 'renv::install(commandArgs(trailingOnly = TRUE))' r_code_inst_add = 'renv::install(commandArgs(trailingOnly = TRUE))'
with in_env(prefix, version): _execute_r_in_renv(
with _r_code_in_tempfile(r_code_inst_add) as f: code=r_code_inst_add, prefix=prefix, version=version,
cmd_output_b( args=additional_dependencies,
_rscript_exec(), *RSCRIPT_OPTS,
f,
*additional_dependencies,
cwd=env_dir, cwd=env_dir,
) )

View file

@ -73,7 +73,7 @@ def get_env_patch(
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir, version)): with envcontext(get_env_patch(envdir, version)):
yield yield

View file

@ -61,7 +61,7 @@ def get_env_patch(target_dir: str, version: str) -> PatchesT:
@contextlib.contextmanager @contextlib.contextmanager
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir, version)): with envcontext(get_env_patch(envdir, version)):
yield yield

View file

@ -27,7 +27,7 @@ def get_env_patch(venv: str) -> PatchesT: # pragma: win32 no cover
@contextlib.contextmanager # pragma: win32 no cover @contextlib.contextmanager # pragma: win32 no cover
def in_env(prefix: Prefix, version: str) -> Generator[None, None, None]: def in_env(prefix: Prefix, version: str) -> Generator[None]:
envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version) envdir = lang_base.environment_dir(prefix, ENVIRONMENT_DIR, version)
with envcontext(get_env_patch(envdir)): with envcontext(get_env_patch(envdir)):
yield yield

View file

@ -32,7 +32,7 @@ class LoggingHandler(logging.Handler):
@contextlib.contextmanager @contextlib.contextmanager
def logging_handler(use_color: bool) -> Generator[None, None, None]: def logging_handler(use_color: bool) -> Generator[None]:
handler = LoggingHandler(use_color) handler = LoggingHandler(use_color)
logger.addHandler(handler) logger.addHandler(handler)
logger.setLevel(logging.INFO) logger.setLevel(logging.INFO)

View file

@ -10,6 +10,7 @@ import pre_commit.constants as C
from pre_commit import clientlib from pre_commit import clientlib
from pre_commit import git from pre_commit import git
from pre_commit.color import add_color_option from pre_commit.color import add_color_option
from pre_commit.commands import hazmat
from pre_commit.commands.autoupdate import autoupdate from pre_commit.commands.autoupdate import autoupdate
from pre_commit.commands.clean import clean from pre_commit.commands.clean import clean
from pre_commit.commands.gc import gc from pre_commit.commands.gc import gc
@ -41,7 +42,7 @@ os.environ.pop('__PYVENV_LAUNCHER__', None)
os.environ.pop('PYTHONEXECUTABLE', None) os.environ.pop('PYTHONEXECUTABLE', None)
COMMANDS_NO_GIT = { COMMANDS_NO_GIT = {
-    'clean', 'gc', 'init-templatedir', 'sample-config',
+    'clean', 'gc', 'hazmat', 'init-templatedir', 'sample-config',
'validate-config', 'validate-manifest', 'validate-config', 'validate-manifest',
} }
@ -62,10 +63,10 @@ def _add_hook_type_option(parser: argparse.ArgumentParser) -> None:
def _add_run_options(parser: argparse.ArgumentParser) -> None: def _add_run_options(parser: argparse.ArgumentParser) -> None:
parser.add_argument('hook', nargs='?', help='A single hook-id to run') parser.add_argument('hook', nargs='?', help='A single hook-id to run')
-    parser.add_argument('--verbose', '-v', action='store_true', default=False)
+    parser.add_argument('--verbose', '-v', action='store_true')
mutex_group = parser.add_mutually_exclusive_group(required=False) mutex_group = parser.add_mutually_exclusive_group(required=False)
mutex_group.add_argument( mutex_group.add_argument(
-        '--all-files', '-a', action='store_true', default=False,
+        '--all-files', '-a', action='store_true',
help='Run on all the files in the repo.', help='Run on all the files in the repo.',
) )
mutex_group.add_argument( mutex_group.add_argument(
@ -76,6 +77,10 @@ def _add_run_options(parser: argparse.ArgumentParser) -> None:
'--show-diff-on-failure', action='store_true', '--show-diff-on-failure', action='store_true',
help='When hooks fail, run `git diff` directly afterward.', help='When hooks fail, run `git diff` directly afterward.',
) )
parser.add_argument(
'--fail-fast', action='store_true',
help='Stop after the first failing hook.',
)
parser.add_argument( parser.add_argument(
'--hook-stage', '--hook-stage',
choices=clientlib.STAGES, choices=clientlib.STAGES,
@ -241,6 +246,11 @@ def main(argv: Sequence[str] | None = None) -> int:
_add_cmd('gc', help='Clean unused cached repos.') _add_cmd('gc', help='Clean unused cached repos.')
hazmat_parser = _add_cmd(
'hazmat', help='Composable tools for rare use in hook `entry`.',
)
hazmat.add_parsers(hazmat_parser)
init_templatedir_parser = _add_cmd( init_templatedir_parser = _add_cmd(
'init-templatedir', 'init-templatedir',
help=( help=(
@ -275,7 +285,7 @@ def main(argv: Sequence[str] | None = None) -> int:
) )
_add_hook_type_option(install_parser) _add_hook_type_option(install_parser)
install_parser.add_argument( install_parser.add_argument(
-        '--allow-missing-config', action='store_true', default=False,
+        '--allow-missing-config', action='store_true',
help=( help=(
'Whether to allow a missing `pre-commit` configuration file ' 'Whether to allow a missing `pre-commit` configuration file '
'or exit with a failure code.' 'or exit with a failure code.'
@ -385,6 +395,8 @@ def main(argv: Sequence[str] | None = None) -> int:
return clean(store) return clean(store)
elif args.command == 'gc': elif args.command == 'gc':
return gc(store) return gc(store)
elif args.command == 'hazmat':
return hazmat.impl(args)
elif args.command == 'hook-impl': elif args.command == 'hook-impl':
return hook_impl( return hook_impl(
store, store,

View file

@ -3,7 +3,6 @@ from __future__ import annotations
import json import json
import logging import logging
import os import os
import shlex
from collections.abc import Sequence from collections.abc import Sequence
from typing import Any from typing import Any
@ -68,14 +67,6 @@ def _hook_install(hook: Hook) -> None:
logger.info('Once installed this environment will be reused.') logger.info('Once installed this environment will be reused.')
logger.info('This may take a few minutes...') logger.info('This may take a few minutes...')
if hook.language == 'python_venv':
logger.warning(
f'`repo: {hook.src}` uses deprecated `language: python_venv`. '
f'This is an alias for `language: python`. '
f'Often `pre-commit autoupdate --repo {shlex.quote(hook.src)}` '
f'will fix this.',
)
lang = languages[hook.language] lang = languages[hook.language]
assert lang.ENVIRONMENT_DIR is not None assert lang.ENVIRONMENT_DIR is not None

View file

@ -1,4 +1,4 @@
 name: pre_commit_empty_pubspec
 environment:
-  sdk: '>=2.10.0'
+  sdk: '>=2.12.0'
 executables: {}

View file

@ -1,4 +1,4 @@
 from setuptools import setup
-setup(name='pre-commit-placeholder-package', version='0.0.0')
+setup(name='pre-commit-placeholder-package', version='0.0.0', py_modules=[])

Binary file not shown.

View file

@ -33,7 +33,7 @@ def _git_apply(patch: str) -> None:
@contextlib.contextmanager @contextlib.contextmanager
def _intent_to_add_cleared() -> Generator[None, None, None]: def _intent_to_add_cleared() -> Generator[None]:
intent_to_add = git.intent_to_add_files() intent_to_add = git.intent_to_add_files()
if intent_to_add: if intent_to_add:
logger.warning('Unstaged intent-to-add files detected.') logger.warning('Unstaged intent-to-add files detected.')
@ -48,7 +48,7 @@ def _intent_to_add_cleared() -> Generator[None, None, None]:
@contextlib.contextmanager @contextlib.contextmanager
def _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]: def _unstaged_changes_cleared(patch_dir: str) -> Generator[None]:
tree = cmd_output('git', 'write-tree')[1].strip() tree = cmd_output('git', 'write-tree')[1].strip()
diff_cmd = ( diff_cmd = (
'git', 'diff-index', '--ignore-submodules', '--binary', 'git', 'diff-index', '--ignore-submodules', '--binary',
@ -105,7 +105,7 @@ def _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:
@contextlib.contextmanager @contextlib.contextmanager
def staged_files_only(patch_dir: str) -> Generator[None, None, None]: def staged_files_only(patch_dir: str) -> Generator[None]:
"""Clear any unstaged changes from the git working directory inside this """Clear any unstaged changes from the git working directory inside this
context. context.
""" """

View file

@ -5,18 +5,18 @@ import logging
import os.path import os.path
import sqlite3 import sqlite3
import tempfile import tempfile
from collections.abc import Callable
from collections.abc import Generator from collections.abc import Generator
from collections.abc import Sequence from collections.abc import Sequence
from typing import Callable
import pre_commit.constants as C import pre_commit.constants as C
from pre_commit import clientlib
from pre_commit import file_lock from pre_commit import file_lock
from pre_commit import git from pre_commit import git
from pre_commit.util import CalledProcessError from pre_commit.util import CalledProcessError
from pre_commit.util import clean_path_on_failure from pre_commit.util import clean_path_on_failure
from pre_commit.util import cmd_output_b from pre_commit.util import cmd_output_b
from pre_commit.util import resource_text from pre_commit.util import resource_text
from pre_commit.util import rmtree
logger = logging.getLogger('pre_commit') logger = logging.getLogger('pre_commit')
@ -95,13 +95,13 @@ class Store:
' PRIMARY KEY (repo, ref)' ' PRIMARY KEY (repo, ref)'
');', ');',
) )
-        self._create_config_table(db)
+        self._create_configs_table(db)
# Atomic file move # Atomic file move
os.replace(tmpfile, self.db_path) os.replace(tmpfile, self.db_path)
@contextlib.contextmanager @contextlib.contextmanager
def exclusive_lock(self) -> Generator[None, None, None]: def exclusive_lock(self) -> Generator[None]:
def blocked_cb() -> None: # pragma: no cover (tests are in-process) def blocked_cb() -> None: # pragma: no cover (tests are in-process)
logger.info('Locking pre-commit directory') logger.info('Locking pre-commit directory')
@ -112,7 +112,7 @@ class Store:
def connect( def connect(
self, self,
db_path: str | None = None, db_path: str | None = None,
) -> Generator[sqlite3.Connection, None, None]: ) -> Generator[sqlite3.Connection]:
db_path = db_path or self.db_path db_path = db_path or self.db_path
# sqlite doesn't close its fd with its contextmanager >.< # sqlite doesn't close its fd with its contextmanager >.<
# contextlib.closing fixes this. # contextlib.closing fixes this.
@ -136,6 +136,7 @@ class Store:
deps: Sequence[str], deps: Sequence[str],
make_strategy: Callable[[str], None], make_strategy: Callable[[str], None],
) -> str: ) -> str:
original_repo = repo
repo = self.db_repo_name(repo, deps) repo = self.db_repo_name(repo, deps)
def _get_result() -> str | None: def _get_result() -> str | None:
@ -168,6 +169,9 @@ class Store:
'INSERT INTO repos (repo, ref, path) VALUES (?, ?, ?)', 'INSERT INTO repos (repo, ref, path) VALUES (?, ?, ?)',
[repo, ref, directory], [repo, ref, directory],
) )
clientlib.warn_for_stages_on_repo_init(original_repo, directory)
return directory return directory
def _complete_clone(self, ref: str, git_cmd: Callable[..., None]) -> None: def _complete_clone(self, ref: str, git_cmd: Callable[..., None]) -> None:
@ -210,7 +214,7 @@ class Store:
'local', C.LOCAL_REPO_VERSION, deps, _make_local_repo, 'local', C.LOCAL_REPO_VERSION, deps, _make_local_repo,
) )
-    def _create_config_table(self, db: sqlite3.Connection) -> None:
+    def _create_configs_table(self, db: sqlite3.Connection) -> None:
db.executescript( db.executescript(
'CREATE TABLE IF NOT EXISTS configs (' 'CREATE TABLE IF NOT EXISTS configs ('
' path TEXT NOT NULL,' ' path TEXT NOT NULL,'
@ -227,28 +231,5 @@ class Store:
return return
with self.connect() as db: with self.connect() as db:
# TODO: eventually remove this and only create in _create # TODO: eventually remove this and only create in _create
self._create_config_table(db) self._create_configs_table(db)
db.execute('INSERT OR IGNORE INTO configs VALUES (?)', (path,)) db.execute('INSERT OR IGNORE INTO configs VALUES (?)', (path,))
def select_all_configs(self) -> list[str]:
with self.connect() as db:
self._create_config_table(db)
rows = db.execute('SELECT path FROM configs').fetchall()
return [path for path, in rows]
def delete_configs(self, configs: list[str]) -> None:
with self.connect() as db:
rows = [(path,) for path in configs]
db.executemany('DELETE FROM configs WHERE path = ?', rows)
def select_all_repos(self) -> list[tuple[str, str, str]]:
with self.connect() as db:
return db.execute('SELECT repo, ref, path from repos').fetchall()
def delete_repo(self, db_repo_name: str, ref: str, path: str) -> None:
with self.connect() as db:
db.execute(
'DELETE FROM repos WHERE repo = ? and ref = ?',
(db_repo_name, ref),
)
rmtree(path)

View file

@ -8,10 +8,10 @@ import shutil
import stat import stat
import subprocess import subprocess
import sys import sys
from collections.abc import Callable
from collections.abc import Generator from collections.abc import Generator
from types import TracebackType from types import TracebackType
from typing import Any from typing import Any
from typing import Callable
from pre_commit import parse_shebang from pre_commit import parse_shebang
@ -25,7 +25,7 @@ def force_bytes(exc: Any) -> bytes:
@contextlib.contextmanager @contextlib.contextmanager
def clean_path_on_failure(path: str) -> Generator[None, None, None]: def clean_path_on_failure(path: str) -> Generator[None]:
"""Cleans up the directory on an exceptional failure.""" """Cleans up the directory on an exceptional failure."""
try: try:
yield yield
@ -205,7 +205,7 @@ else: # pragma: no cover
def _handle_readonly( def _handle_readonly(
func: Callable[[str], object], func: Callable[[str], object],
path: str, path: str,
exc: Exception, exc: BaseException,
) -> None: ) -> None:
if ( if (
func in (os.rmdir, os.remove, os.unlink) and func in (os.rmdir, os.remove, os.unlink) and
@ -223,7 +223,7 @@ if sys.version_info < (3, 12): # pragma: <3.12 cover
def _handle_readonly_old( def _handle_readonly_old(
func: Callable[[str], object], func: Callable[[str], object],
path: str, path: str,
excinfo: tuple[type[Exception], Exception, TracebackType], excinfo: tuple[type[BaseException], BaseException, TracebackType],
) -> None: ) -> None:
return _handle_readonly(func, path, excinfo[1]) return _handle_readonly(func, path, excinfo[1])
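Widening these handlers to `BaseException` lines them up with what `shutil.rmtree` passes: on 3.12+ the exception instance goes to `onexc`, while older versions hand an `(exc_type, exc, tb)` triple to the deprecated `onerror`. A simplified stand-in for the module's `_handle_readonly`, wired up both ways (not the project's exact error handling):

import os
import shutil
import stat
import sys
import tempfile


def _chmod_and_retry(func, path, exc):
    # simplified: make the path writable and retry the failed os call
    if isinstance(exc, PermissionError):
        os.chmod(path, stat.S_IWRITE)
        func(path)
    else:
        raise exc


d = tempfile.mkdtemp()
if sys.version_info >= (3, 12):
    shutil.rmtree(d, onexc=_chmod_and_retry)
else:
    shutil.rmtree(d, onerror=lambda f, p, ei: _chmod_and_retry(f, p, ei[1]))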

View file

@ -7,12 +7,12 @@ import multiprocessing
import os import os
import subprocess import subprocess
import sys import sys
from collections.abc import Callable
from collections.abc import Generator from collections.abc import Generator
from collections.abc import Iterable from collections.abc import Iterable
from collections.abc import MutableMapping from collections.abc import MutableMapping
from collections.abc import Sequence from collections.abc import Sequence
from typing import Any from typing import Any
from typing import Callable
from typing import TypeVar from typing import TypeVar
from pre_commit import parse_shebang from pre_commit import parse_shebang
@ -120,7 +120,6 @@ def partition(
@contextlib.contextmanager @contextlib.contextmanager
def _thread_mapper(maxsize: int) -> Generator[ def _thread_mapper(maxsize: int) -> Generator[
Callable[[Callable[[TArg], TRet], Iterable[TArg]], Iterable[TRet]], Callable[[Callable[[TArg], TRet], Iterable[TArg]], Iterable[TRet]],
None, None,
]: ]:
if maxsize == 1: if maxsize == 1:
yield map yield map

View file

@ -6,6 +6,7 @@ from typing import Any
import yaml import yaml
Loader = getattr(yaml, 'CSafeLoader', yaml.SafeLoader) Loader = getattr(yaml, 'CSafeLoader', yaml.SafeLoader)
yaml_compose = functools.partial(yaml.compose, Loader=Loader)
yaml_load = functools.partial(yaml.load, Loader=Loader) yaml_load = functools.partial(yaml.load, Loader=Loader)
Dumper = getattr(yaml, 'CSafeDumper', yaml.SafeDumper) Dumper = getattr(yaml, 'CSafeDumper', yaml.SafeDumper)

View file

@ -0,0 +1,52 @@
from __future__ import annotations
from collections.abc import Generator
from collections.abc import Iterable
from typing import NamedTuple
from typing import Protocol
from yaml.nodes import MappingNode
from yaml.nodes import Node
from yaml.nodes import ScalarNode
from yaml.nodes import SequenceNode
class _Matcher(Protocol):
def match(self, n: Node) -> Generator[Node]: ...
class MappingKey(NamedTuple):
k: str
def match(self, n: Node) -> Generator[Node]:
if isinstance(n, MappingNode):
for k, _ in n.value:
if k.value == self.k:
yield k
class MappingValue(NamedTuple):
k: str
def match(self, n: Node) -> Generator[Node]:
if isinstance(n, MappingNode):
for k, v in n.value:
if k.value == self.k:
yield v
class SequenceItem(NamedTuple):
def match(self, n: Node) -> Generator[Node]:
if isinstance(n, SequenceNode):
yield from n.value
def _match(gen: Iterable[Node], m: _Matcher) -> Iterable[Node]:
return (n for src in gen for n in m.match(src))
def match(n: Node, matcher: tuple[_Matcher, ...]) -> Generator[ScalarNode]:
gen: Iterable[Node] = (n,)
for m in matcher:
gen = _match(gen, m)
return (n for n in gen if isinstance(n, ScalarNode))
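Each matcher in the new `yaml_rewrite` module yields matching child nodes and `match` chains them across a composed node tree, keeping only scalars at the end so callers get `start_mark`/`end_mark` offsets to splice on. A small usage sketch (the YAML snippet is made up):

import yaml

from pre_commit.yaml_rewrite import MappingKey
from pre_commit.yaml_rewrite import MappingValue
from pre_commit.yaml_rewrite import match
from pre_commit.yaml_rewrite import SequenceItem

doc = yaml.compose('repos:\n-   sha: v1.0.0\n    hooks: []\n')
matcher = (MappingValue('repos'), SequenceItem(), MappingKey('sha'))
for node in match(doc, matcher):
    # the offsets are what _migrate_composed splices on, e.g. sha 11 14 here
    print(node.value, node.start_mark.index, node.end_mark.index)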

View file

@@ -1,6 +1,6 @@
 [metadata]
 name = pre_commit
-version = 3.7.1
+version = 4.5.1
 description = A framework for managing and maintaining multi-language pre-commit hooks.
 long_description = file: README.md
 long_description_content_type = text/markdown
@@ -10,7 +10,6 @@ author_email = asottile@umich.edu
 license = MIT
 license_files = LICENSE
 classifiers =
-    License :: OSI Approved :: MIT License
     Programming Language :: Python :: 3
     Programming Language :: Python :: 3 :: Only
     Programming Language :: Python :: Implementation :: CPython
@@ -24,7 +23,7 @@ install_requires =
     nodeenv>=0.11.1
     pyyaml>=5.1
     virtualenv>=20.10.0
-python_requires = >=3.9
+python_requires = >=3.10
 [options.packages.find]
 exclude =
@@ -53,6 +52,7 @@ check_untyped_defs = true
 disallow_any_generics = true
 disallow_incomplete_defs = true
 disallow_untyped_defs = true
+enable_error_code = deprecated
 warn_redundant_casts = true
 warn_unused_ignores = true

View file

@ -1,7 +1,7 @@
 #!/usr/bin/env bash
 set -euo pipefail
-VERSION=2.13.4
+VERSION=2.19.6
 if [ "$OSTYPE" = msys ]; then
     URL="https://storage.googleapis.com/dart-archive/channels/stable/release/${VERSION}/sdk/dartsdk-windows-x64-release.zip"

View file

@ -16,8 +16,8 @@ from collections.abc import Sequence
 REPOS = (
-    ('rbenv', 'https://github.com/rbenv/rbenv', '38e1fbb'),
-    ('ruby-build', 'https://github.com/rbenv/ruby-build', '855b963'),
+    ('rbenv', 'https://github.com/rbenv/rbenv', '10e96bfc'),
+    ('ruby-build', 'https://github.com/rbenv/ruby-build', '447468b1'),
( (
'ruby-download', 'ruby-download',
'https://github.com/garnieretienne/rvm-download', 'https://github.com/garnieretienne/rvm-download',
@ -57,8 +57,7 @@ def make_archive(name: str, repo: str, ref: str, destdir: str) -> str:
arcs.sort() arcs.sort()
     with gzip.GzipFile(output_path, 'wb', mtime=0) as gzipf:
-        # https://github.com/python/typeshed/issues/5491
-        with tarfile.open(fileobj=gzipf, mode='w') as tf:  # type: ignore
+        with tarfile.open(fileobj=gzipf, mode='w') as tf:
for arcname, abspath in arcs: for arcname, abspath in arcs:
tf.add( tf.add(
abspath, abspath,

View file

@ -1,6 +0,0 @@
- id: python3-hook
name: Python 3 Hook
entry: python3-hook
language: python
language_version: python3
files: \.py$

View file

@ -1,8 +0,0 @@
import sys
def main():
print(sys.version_info[0])
print(repr(sys.argv[1:]))
print('Hello World')
return 0

View file

@ -1,8 +0,0 @@
from setuptools import setup
setup(
name='python3_hook',
version='0.0.0',
py_modules=['py3_hook'],
entry_points={'console_scripts': ['python3-hook = py3_hook:main']},
)

View file

@ -1,5 +0,0 @@
- id: system-hook-with-spaces
name: System hook with spaces
entry: bash -c 'echo "Hello World"'
language: system
files: \.sh$

View file

@ -40,6 +40,7 @@ def run_opts(
color=False, color=False,
verbose=False, verbose=False,
hook=None, hook=None,
fail_fast=False,
remote_branch='', remote_branch='',
local_branch='', local_branch='',
from_ref='', from_ref='',
@ -65,6 +66,7 @@ def run_opts(
color=color, color=color,
verbose=verbose, verbose=verbose,
hook=hook, hook=hook,
fail_fast=fail_fast,
remote_branch=remote_branch, remote_branch=remote_branch,
local_branch=local_branch, local_branch=local_branch,
from_ref=from_ref, from_ref=from_ref,

View file

@ -107,9 +107,6 @@ def main() -> int:
     shebang = '/usr/bin/env python3'
     zipapp.create_archive(tmpdir, filename, interpreter=shebang)
-    with open(f'{filename}.sha256sum', 'w') as f:
-        subprocess.check_call(('sha256sum', filename), stdout=f)
     return 0

View file

@ -1,7 +0,0 @@
from __future__ import annotations
from pre_commit.all_languages import languages
def test_python_venv_is_an_alias_to_python():
assert languages['python_venv'] is languages['python']

View file

@ -12,6 +12,8 @@ from pre_commit.clientlib import CONFIG_HOOK_DICT
from pre_commit.clientlib import CONFIG_REPO_DICT from pre_commit.clientlib import CONFIG_REPO_DICT
from pre_commit.clientlib import CONFIG_SCHEMA from pre_commit.clientlib import CONFIG_SCHEMA
from pre_commit.clientlib import DEFAULT_LANGUAGE_VERSION from pre_commit.clientlib import DEFAULT_LANGUAGE_VERSION
from pre_commit.clientlib import InvalidManifestError
from pre_commit.clientlib import load_manifest
from pre_commit.clientlib import MANIFEST_HOOK_DICT from pre_commit.clientlib import MANIFEST_HOOK_DICT
from pre_commit.clientlib import MANIFEST_SCHEMA from pre_commit.clientlib import MANIFEST_SCHEMA
from pre_commit.clientlib import META_HOOK_DICT from pre_commit.clientlib import META_HOOK_DICT
@ -256,6 +258,24 @@ def test_validate_optional_sensible_regex_at_local_hook(caplog):
] ]
def test_validate_optional_sensible_regex_at_meta_hook(caplog):
config_obj = {
'repo': 'meta',
'hooks': [{'id': 'identity', 'files': 'dir/*.py'}],
}
cfgv.validate(config_obj, CONFIG_REPO_DICT)
assert caplog.record_tuples == [
(
'pre_commit',
logging.WARNING,
"The 'files' field in hook 'identity' is a regex, not a glob "
"-- matching '/*' probably isn't what you want here",
),
]
@pytest.mark.parametrize( @pytest.mark.parametrize(
('regex', 'warning'), ('regex', 'warning'),
( (
@ -291,6 +311,97 @@ def test_validate_optional_sensible_regex_at_top_level(caplog, regex, warning):
assert caplog.record_tuples == [('pre_commit', logging.WARNING, warning)] assert caplog.record_tuples == [('pre_commit', logging.WARNING, warning)]
def test_invalid_stages_error():
cfg = {'repos': [sample_local_config()]}
cfg['repos'][0]['hooks'][0]['stages'] = ['invalid']
with pytest.raises(cfgv.ValidationError) as excinfo:
cfgv.validate(cfg, CONFIG_SCHEMA)
assert str(excinfo.value) == (
'\n'
'==> At Config()\n'
'==> At key: repos\n'
"==> At Repository(repo='local')\n"
'==> At key: hooks\n'
"==> At Hook(id='do_not_commit')\n"
# this line was missing due to the custom validator
'==> At key: stages\n'
'==> At index 0\n'
"=====> Expected one of commit-msg, manual, post-checkout, post-commit, post-merge, post-rewrite, pre-commit, pre-merge-commit, pre-push, pre-rebase, prepare-commit-msg but got: 'invalid'" # noqa: E501
)
def test_warning_for_deprecated_stages(caplog):
config_obj = sample_local_config()
config_obj['hooks'][0]['stages'] = ['commit', 'push']
cfgv.validate(config_obj, CONFIG_REPO_DICT)
assert caplog.record_tuples == [
(
'pre_commit',
logging.WARNING,
'hook id `do_not_commit` uses deprecated stage names '
'(commit, push) which will be removed in a future version. '
'run: `pre-commit migrate-config` to automatically fix this.',
),
]
def test_no_warning_for_non_deprecated_stages(caplog):
config_obj = sample_local_config()
config_obj['hooks'][0]['stages'] = ['pre-commit', 'pre-push']
cfgv.validate(config_obj, CONFIG_REPO_DICT)
assert caplog.record_tuples == []
def test_warning_for_deprecated_default_stages(caplog):
cfg = {'default_stages': ['commit', 'push'], 'repos': []}
cfgv.validate(cfg, CONFIG_SCHEMA)
assert caplog.record_tuples == [
(
'pre_commit',
logging.WARNING,
'top-level `default_stages` uses deprecated stage names '
'(commit, push) which will be removed in a future version. '
'run: `pre-commit migrate-config` to automatically fix this.',
),
]
def test_no_warning_for_non_deprecated_default_stages(caplog):
cfg = {'default_stages': ['pre-commit', 'pre-push'], 'repos': []}
cfgv.validate(cfg, CONFIG_SCHEMA)
assert caplog.record_tuples == []
def test_unsupported_language_migration():
cfg = {'repos': [sample_local_config(), sample_local_config()]}
cfg['repos'][0]['hooks'][0]['language'] = 'system'
cfg['repos'][1]['hooks'][0]['language'] = 'script'
cfgv.validate(cfg, CONFIG_SCHEMA)
ret = cfgv.apply_defaults(cfg, CONFIG_SCHEMA)
assert ret['repos'][0]['hooks'][0]['language'] == 'unsupported'
assert ret['repos'][1]['hooks'][0]['language'] == 'unsupported_script'
def test_unsupported_language_migration_language_required():
cfg = {'repos': [sample_local_config()]}
del cfg['repos'][0]['hooks'][0]['language']
with pytest.raises(cfgv.ValidationError):
cfgv.validate(cfg, CONFIG_SCHEMA)
@pytest.mark.parametrize( @pytest.mark.parametrize(
'manifest_obj', 'manifest_obj',
( (
@ -479,3 +590,18 @@ def test_config_hook_stages_defaulting():
'id': 'fake-hook', 'id': 'fake-hook',
'stages': ['commit-msg', 'pre-push', 'pre-commit', 'pre-merge-commit'], 'stages': ['commit-msg', 'pre-push', 'pre-commit', 'pre-merge-commit'],
} }
def test_manifest_v5_forward_compat(tmp_path):
manifest = tmp_path.joinpath('.pre-commit-hooks.yaml')
manifest.write_text('hooks: {}')
with pytest.raises(InvalidManifestError) as excinfo:
load_manifest(manifest)
assert str(excinfo.value) == (
f'\n'
f'==> File {manifest}\n'
f'=====> \n'
f'=====> pre-commit version 5 is required but version {C.VERSION} '
f'is installed. Perhaps run `pip install --upgrade pre-commit`.'
)
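
test_manifest_v5_forward_compat pins down the error raised when a manifest demands a newer pre-commit than the one installed. A hedged sketch of that kind of comparison — the function name and message layout here are illustrative; only the idea comes from the test:

from __future__ import annotations


def check_min_version(required: str, installed: str) -> str | None:
    def parse(version: str) -> tuple[int, ...]:
        return tuple(int(part) for part in version.split('.'))

    if parse(installed) < parse(required):
        return (
            f'pre-commit version {required} is required '
            f'but version {installed} is installed. '
            f'Perhaps run `pip install --upgrade pre-commit`.'
        )
    return None


assert check_min_version('5', '3.7.1') is not None
assert check_min_version('3.0.0', '3.7.1') is None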

View file

@ -19,11 +19,13 @@ from testing.util import git_commit
def _repo_count(store): def _repo_count(store):
return len(store.select_all_repos()) with store.connect() as db:
return db.execute('SELECT COUNT(1) FROM repos').fetchone()[0]
def _config_count(store): def _config_count(store):
return len(store.select_all_configs()) with store.connect() as db:
return db.execute('SELECT COUNT(1) FROM configs').fetchone()[0]
def _remove_config_assert_cleared(store, cap_out): def _remove_config_assert_cleared(store, cap_out):
@ -153,7 +155,8 @@ def test_invalid_manifest_gcd(tempdir_factory, store, in_git_dir, cap_out):
install_hooks(C.CONFIG_FILE, store) install_hooks(C.CONFIG_FILE, store)
# we'll "break" the manifest to simulate an old version clone # we'll "break" the manifest to simulate an old version clone
(_, _, path), = store.select_all_repos() with store.connect() as db:
path, = db.execute('SELECT path FROM repos').fetchone()
os.remove(os.path.join(path, C.MANIFEST_FILE)) os.remove(os.path.join(path, C.MANIFEST_FILE))
assert _config_count(store) == 1 assert _config_count(store) == 1
@ -162,3 +165,11 @@ def test_invalid_manifest_gcd(tempdir_factory, store, in_git_dir, cap_out):
assert _config_count(store) == 1 assert _config_count(store) == 1
assert _repo_count(store) == 0 assert _repo_count(store) == 0
assert cap_out.get().splitlines()[-1] == '1 repo(s) removed.' assert cap_out.get().splitlines()[-1] == '1 repo(s) removed.'
def test_gc_pre_1_14_roll_forward(store, cap_out):
with store.connect() as db: # simulate pre-1.14.0
db.executescript('DROP TABLE configs')
assert not gc(store)
assert cap_out.get() == '0 repo(s) removed.\n'
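
The helpers above now query sqlite directly instead of going through the removed Store.select_all_* methods. A self-contained sketch of the same row-count query (plain sqlite3, not pre-commit's Store):

import sqlite3

db = sqlite3.connect(':memory:')
db.execute('CREATE TABLE repos (repo TEXT, ref TEXT, path TEXT)')
db.execute("INSERT INTO repos VALUES ('local', 'v1', '/tmp/example')")

# same COUNT(1) pattern the updated _repo_count helper uses
repo_count, = db.execute('SELECT COUNT(1) FROM repos').fetchone()
assert repo_count == 1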

View file

@ -0,0 +1,99 @@
from __future__ import annotations
import sys
import pytest
from pre_commit.commands.hazmat import _cmd_filenames
from pre_commit.commands.hazmat import main
from testing.util import cwd
def test_cmd_filenames_no_dash_dash():
with pytest.raises(SystemExit) as excinfo:
_cmd_filenames(('no', 'dashdash', 'here'))
msg, = excinfo.value.args
assert msg == 'hazmat entry must end with `--`'
def test_cmd_filenames_no_filenames():
cmd, filenames = _cmd_filenames(('hello', 'world', '--'))
assert cmd == ('hello', 'world')
assert filenames == ()
def test_cmd_filenames_some_filenames():
cmd, filenames = _cmd_filenames(('hello', 'world', '--', 'f1', 'f2'))
assert cmd == ('hello', 'world')
assert filenames == ('f1', 'f2')
def test_cmd_filenames_multiple_dashdash():
cmd, filenames = _cmd_filenames(('hello', '--', 'arg', '--', 'f1', 'f2'))
assert cmd == ('hello', '--', 'arg')
assert filenames == ('f1', 'f2')
def test_cd_unexpected_filename():
with pytest.raises(SystemExit) as excinfo:
main(('cd', 'subdir', 'cmd', '--', 'subdir/1', 'not-subdir/2'))
msg, = excinfo.value.args
assert msg == "unexpected file without prefix='subdir/': not-subdir/2"
def _norm(out):
return out.replace('\r\n', '\n')
def test_cd(tmp_path, capfd):
subdir = tmp_path.joinpath('subdir')
subdir.mkdir()
subdir.joinpath('a').write_text('a')
subdir.joinpath('b').write_text('b')
with cwd(tmp_path):
ret = main((
'cd', 'subdir',
sys.executable, '-c',
'import os; print(os.getcwd());'
'import sys; [print(open(f).read()) for f in sys.argv[1:]]',
'--',
'subdir/a', 'subdir/b',
))
assert ret == 0
out, err = capfd.readouterr()
assert _norm(out) == f'{subdir}\na\nb\n'
assert err == ''
def test_ignore_exit_code(capfd):
ret = main((
'ignore-exit-code', sys.executable, '-c', 'raise SystemExit("bye")',
))
assert ret == 0
out, err = capfd.readouterr()
assert out == ''
assert _norm(err) == 'bye\n'
def test_n1(capfd):
ret = main((
'n1', sys.executable, '-c', 'import sys; print(sys.argv[1:])',
'--',
'foo', 'bar', 'baz',
))
assert ret == 0
out, err = capfd.readouterr()
assert _norm(out) == "['foo']\n['bar']\n['baz']\n"
assert err == ''
def test_n1_some_error_code():
ret = main((
'n1', sys.executable, '-c',
'import sys; raise SystemExit(sys.argv[1] == "error")',
'--',
'ok', 'error', 'ok',
))
assert ret == 1
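
Read together, these assertions describe the `--` contract: everything before the last `--` is the command, everything after it is filenames, and a missing `--` is an error. A small reconstruction written only from those assertions (not the real pre_commit.commands.hazmat code):

from __future__ import annotations


def cmd_filenames(
        argv: tuple[str, ...],
) -> tuple[tuple[str, ...], tuple[str, ...]]:
    if '--' not in argv:
        raise SystemExit('hazmat entry must end with `--`')
    # split at the *last* `--` so the command itself may contain `--`
    idx = len(argv) - 1 - argv[::-1].index('--')
    return argv[:idx], argv[idx + 1:]


assert cmd_filenames(('hello', 'world', '--')) == (('hello', 'world'), ())
assert cmd_filenames(('hello', '--', 'arg', '--', 'f1', 'f2')) == (
    ('hello', '--', 'arg'), ('f1', 'f2'),
)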

View file

@ -1,10 +1,26 @@
from __future__ import annotations from __future__ import annotations
from unittest import mock
import pytest import pytest
import yaml
import pre_commit.constants as C import pre_commit.constants as C
from pre_commit.clientlib import InvalidConfigError from pre_commit.clientlib import InvalidConfigError
from pre_commit.commands.migrate_config import migrate_config from pre_commit.commands.migrate_config import migrate_config
from pre_commit.yaml import yaml_compose
@pytest.fixture(autouse=True, params=['c', 'pure'])
def switch_pyyaml_impl(request):
if request.param == 'c':
yield
else:
with mock.patch.dict(
yaml_compose.keywords,
{'Loader': yaml.SafeLoader},
):
yield
def test_migrate_config_normal_format(tmpdir, capsys): def test_migrate_config_normal_format(tmpdir, capsys):
@ -134,6 +150,27 @@ def test_migrate_config_sha_to_rev(tmpdir):
) )
def test_migrate_config_sha_to_rev_json(tmp_path):
contents = """\
{"repos": [{
"repo": "https://github.com/pre-commit/pre-commit-hooks",
"sha": "v1.2.0",
"hooks": []
}]}
"""
expected = """\
{"repos": [{
"repo": "https://github.com/pre-commit/pre-commit-hooks",
"rev": "v1.2.0",
"hooks": []
}]}
"""
cfg = tmp_path.joinpath('cfg.yaml')
cfg.write_text(contents)
assert not migrate_config(str(cfg))
assert cfg.read_text() == expected
def test_migrate_config_language_python_venv(tmp_path): def test_migrate_config_language_python_venv(tmp_path):
src = '''\ src = '''\
repos: repos:
@ -167,6 +204,73 @@ repos:
assert cfg.read_text() == expected assert cfg.read_text() == expected
def test_migrate_config_quoted_python_venv(tmp_path):
src = '''\
repos:
- repo: local
hooks:
- id: example
name: example
entry: example
language: "python_venv"
'''
expected = '''\
repos:
- repo: local
hooks:
- id: example
name: example
entry: example
language: "python"
'''
cfg = tmp_path.joinpath('cfg.yaml')
cfg.write_text(src)
assert migrate_config(str(cfg)) == 0
assert cfg.read_text() == expected
def test_migrate_config_default_stages(tmp_path):
src = '''\
default_stages: [commit, push, merge-commit, commit-msg]
repos: []
'''
expected = '''\
default_stages: [pre-commit, pre-push, pre-merge-commit, commit-msg]
repos: []
'''
cfg = tmp_path.joinpath('cfg.yaml')
cfg.write_text(src)
assert migrate_config(str(cfg)) == 0
assert cfg.read_text() == expected
def test_migrate_config_hook_stages(tmp_path):
src = '''\
repos:
- repo: local
hooks:
- id: example
name: example
entry: example
language: system
stages: ["commit", "push", "merge-commit", "commit-msg"]
'''
expected = '''\
repos:
- repo: local
hooks:
- id: example
name: example
entry: example
language: system
stages: ["pre-commit", "pre-push", "pre-merge-commit", "commit-msg"]
'''
cfg = tmp_path.joinpath('cfg.yaml')
cfg.write_text(src)
assert migrate_config(str(cfg)) == 0
assert cfg.read_text() == expected
def test_migrate_config_invalid_yaml(tmpdir): def test_migrate_config_invalid_yaml(tmpdir):
contents = '[' contents = '['
cfg = tmpdir.join(C.CONFIG_FILE) cfg = tmpdir.join(C.CONFIG_FILE)
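
The default_stages and per-hook stages tests above all exercise the same renaming. As a compact illustration of that mapping (the dict is inferred from the expected output, not copied from pre-commit):

from __future__ import annotations

_DEPRECATED_STAGES = {
    'commit': 'pre-commit',
    'push': 'pre-push',
    'merge-commit': 'pre-merge-commit',
}


def migrate_stages(stages: list[str]) -> list[str]:
    # unknown / already-migrated names pass through untouched
    return [_DEPRECATED_STAGES.get(stage, stage) for stage in stages]


assert migrate_stages(['commit', 'push', 'merge-commit', 'commit-msg']) == [
    'pre-commit', 'pre-push', 'pre-merge-commit', 'commit-msg',
]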

View file

@ -1104,6 +1104,19 @@ def test_fail_fast_not_prev_failures(cap_out, store, repo_with_failing_hook):
assert printed.count(b'run me!') == 1 assert printed.count(b'run me!') == 1
def test_fail_fast_run_arg(cap_out, store, repo_with_failing_hook):
with modify_config() as config:
# More than one hook to demonstrate early exit
config['repos'][0]['hooks'] *= 2
stage_a_file()
ret, printed = _do_run(
cap_out, store, repo_with_failing_hook, run_opts(fail_fast=True),
)
# it should have only run one hook due to the CLI flag
assert printed.count(b'Failing hook') == 1
def test_classifier_removes_dne(): def test_classifier_removes_dne():
classifier = Classifier(('this_file_does_not_exist',)) classifier = Classifier(('this_file_does_not_exist',))
assert classifier.filenames == [] assert classifier.filenames == []

View file

@ -2,7 +2,6 @@ from __future__ import annotations
import functools import functools
import io import io
import logging
import os.path import os.path
from unittest import mock from unittest import mock
@ -203,42 +202,25 @@ def store(tempdir_factory):
yield Store(os.path.join(tempdir_factory.get(), '.pre-commit')) yield Store(os.path.join(tempdir_factory.get(), '.pre-commit'))
@pytest.fixture
def log_info_mock():
with mock.patch.object(logging.getLogger('pre_commit'), 'info') as mck:
yield mck
class FakeStream:
def __init__(self):
self.data = io.BytesIO()
def write(self, s):
self.data.write(s)
def flush(self):
pass
class Fixture: class Fixture:
def __init__(self, stream): def __init__(self, stream: io.BytesIO) -> None:
self._stream = stream self._stream = stream
def get_bytes(self): def get_bytes(self) -> bytes:
"""Get the output as-if no encoding occurred""" """Get the output as-if no encoding occurred"""
data = self._stream.data.getvalue() data = self._stream.getvalue()
self._stream.data.seek(0) self._stream.seek(0)
self._stream.data.truncate() self._stream.truncate()
return data.replace(b'\r\n', b'\n') return data.replace(b'\r\n', b'\n')
def get(self): def get(self) -> str:
"""Get the output assuming it was written as UTF-8 bytes""" """Get the output assuming it was written as UTF-8 bytes"""
return self.get_bytes().decode() return self.get_bytes().decode()
@pytest.fixture @pytest.fixture
def cap_out(): def cap_out():
stream = FakeStream() stream = io.BytesIO()
write = functools.partial(output.write, stream=stream) write = functools.partial(output.write, stream=stream)
write_line_b = functools.partial(output.write_line_b, stream=stream) write_line_b = functools.partial(output.write_line_b, stream=stream)
with mock.patch.multiple(output, write=write, write_line_b=write_line_b): with mock.patch.multiple(output, write=write, write_line_b=write_line_b):
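
The FakeStream wrapper goes away because io.BytesIO already provides the buffer the fixture needs; the capture pattern itself is just partial application over a stream, roughly (a standalone sketch, not the conftest code):

import functools
import io


def write_line_b(s: bytes, stream) -> None:
    stream.write(s + b'\n')


stream = io.BytesIO()
capture = functools.partial(write_line_b, stream=stream)
capture(b'hello')
assert stream.getvalue() == b'hello\n'
# BytesIO also supports the seek/truncate reset that Fixture.get_bytes performs
stream.seek(0)
stream.truncate()
assert stream.getvalue() == b''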

View file

@ -141,6 +141,15 @@ def test_get_conflicted_files_unstaged_files(in_merge_conflict):
assert ret == {'conflict_file'} assert ret == {'conflict_file'}
def test_get_conflicted_files_with_file_named_head(in_merge_conflict):
resolve_conflict()
open('HEAD', 'w').close()
cmd_output('git', 'add', 'HEAD')
ret = set(git.get_conflicted_files())
assert ret == {'conflict_file', 'HEAD'}
MERGE_MSG = b"Merge branch 'foo' into bar\n\nConflicts:\n\tconflict_file\n" MERGE_MSG = b"Merge branch 'foo' into bar\n\nConflicts:\n\tconflict_file\n"
OTHER_MERGE_MSG = MERGE_MSG + b'\tother_conflict_file\n' OTHER_MERGE_MSG = MERGE_MSG + b'\tother_conflict_file\n'

View file

@ -164,3 +164,15 @@ def test_basic_run_hook(tmp_path):
assert ret == 0 assert ret == 0
out = out.replace(b'\r\n', b'\n') out = out.replace(b'\r\n', b'\n')
assert out == b'hi hello file file file\n' assert out == b'hi hello file file file\n'
def test_hook_cmd():
assert lang_base.hook_cmd('echo hi', ()) == ('echo', 'hi')
def test_hook_cmd_hazmat():
ret = lang_base.hook_cmd('pre-commit hazmat cd a echo -- b', ())
assert ret == (
sys.executable, '-m', 'pre_commit.commands.hazmat',
'cd', 'a', 'echo', '--', 'b',
)
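
Taken together, the two assertions suggest the shape of hook_cmd: shlex-split the entry, then rewrite a `pre-commit hazmat ...` prefix so it re-invokes the hazmat module under the current interpreter. A reconstruction from the tests alone — the real lang_base.hook_cmd may differ:

import shlex
import sys


def hook_cmd(entry: str, args: tuple[str, ...]) -> tuple[str, ...]:
    cmd = (*shlex.split(entry), *args)
    if cmd[:2] == ('pre-commit', 'hazmat'):
        # re-exec the hazmat helper with the running interpreter
        cmd = (sys.executable, '-m', 'pre_commit.commands.hazmat', *cmd[2:])
    return cmd


assert hook_cmd('echo hi', ()) == ('echo', 'hi')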

View file

@ -10,7 +10,7 @@ from testing.language_helpers import run_language
def test_dart(tmp_path): def test_dart(tmp_path):
pubspec_yaml = '''\ pubspec_yaml = '''\
environment: environment:
sdk: '>=2.10.0 <3.0.0' sdk: '>=2.12.0 <4.0.0'
name: hello_world_dart name: hello_world_dart

View file

@ -1,10 +1,18 @@
from __future__ import annotations from __future__ import annotations
import pytest
from pre_commit.languages import docker_image from pre_commit.languages import docker_image
from pre_commit.util import cmd_output_b
from testing.language_helpers import run_language from testing.language_helpers import run_language
from testing.util import xfailif_windows from testing.util import xfailif_windows
@pytest.fixture(autouse=True, scope='module')
def _ensure_image_available():
cmd_output_b('docker', 'run', '--rm', 'ubuntu:22.04', 'echo')
@xfailif_windows # pragma: win32 no cover @xfailif_windows # pragma: win32 no cover
def test_docker_image_hook_via_entrypoint(tmp_path): def test_docker_image_hook_via_entrypoint(tmp_path):
ret = run_language( ret = run_language(

View file

@ -14,40 +14,173 @@ from pre_commit.util import CalledProcessError
from testing.language_helpers import run_language from testing.language_helpers import run_language
from testing.util import xfailif_windows from testing.util import xfailif_windows
DOCKER_CGROUP_EXAMPLE = b'''\ DOCKER_CGROUPS_V1_MOUNTINFO_EXAMPLE = b'''\
12:hugetlb:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 759 717 0:52 / / rw,relatime master:300 - overlay overlay rw,lowerdir=/var/lib/docker/overlay2/l/PCPE5P5IVGM7CFCPJR353N3ONK:/var/lib/docker/overlay2/l/EQFSDHFAJ333VEMEJD4ZTRIZCB,upperdir=/var/lib/docker/overlay2/0d9f6bf186030d796505b87d6daa92297355e47641e283d3c09d83a7f221e462/diff,workdir=/var/lib/docker/overlay2/0d9f6bf186030d796505b87d6daa92297355e47641e283d3c09d83a7f221e462/work
11:blkio:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 760 759 0:58 / /proc rw,nosuid,nodev,noexec,relatime - proc proc rw
10:freezer:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 761 759 0:59 / /dev rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
9:cpu,cpuacct:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 762 761 0:60 / /dev/pts rw,nosuid,noexec,relatime - devpts devpts rw,gid=5,mode=620,ptmxmode=666
8:pids:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 763 759 0:61 / /sys ro,nosuid,nodev,noexec,relatime - sysfs sysfs ro
7:rdma:/ 764 763 0:62 / /sys/fs/cgroup rw,nosuid,nodev,noexec,relatime - tmpfs tmpfs rw,mode=755,inode64
6:net_cls,net_prio:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 765 764 0:29 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/systemd ro,nosuid,nodev,noexec,relatime master:11 - cgroup cgroup rw,xattr,name=systemd
5:cpuset:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 766 764 0:32 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/rdma ro,nosuid,nodev,noexec,relatime master:15 - cgroup cgroup rw,rdma
4:devices:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 767 764 0:33 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/cpu,cpuacct ro,nosuid,nodev,noexec,relatime master:16 - cgroup cgroup rw,cpu,cpuacct
3:memory:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 768 764 0:34 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/cpuset ro,nosuid,nodev,noexec,relatime master:17 - cgroup cgroup rw,cpuset
2:perf_event:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 769 764 0:35 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/pids ro,nosuid,nodev,noexec,relatime master:18 - cgroup cgroup rw,pids
1:name=systemd:/docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 770 764 0:36 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/memory ro,nosuid,nodev,noexec,relatime master:19 - cgroup cgroup rw,memory
0::/system.slice/containerd.service 771 764 0:37 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/perf_event ro,nosuid,nodev,noexec,relatime master:20 - cgroup cgroup rw,perf_event
772 764 0:38 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/net_cls,net_prio ro,nosuid,nodev,noexec,relatime master:21 - cgroup cgroup rw,net_cls,net_prio
773 764 0:39 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/blkio ro,nosuid,nodev,noexec,relatime master:22 - cgroup cgroup rw,blkio
774 764 0:40 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/misc ro,nosuid,nodev,noexec,relatime master:23 - cgroup cgroup rw,misc
775 764 0:41 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/hugetlb ro,nosuid,nodev,noexec,relatime master:24 - cgroup cgroup rw,hugetlb
776 764 0:42 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/devices ro,nosuid,nodev,noexec,relatime master:25 - cgroup cgroup rw,devices
777 764 0:43 /docker/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7 /sys/fs/cgroup/freezer ro,nosuid,nodev,noexec,relatime master:26 - cgroup cgroup rw,freezer
778 761 0:57 / /dev/mqueue rw,nosuid,nodev,noexec,relatime - mqueue mqueue rw
779 761 0:63 / /dev/shm rw,nosuid,nodev,noexec,relatime - tmpfs shm rw,size=65536k,inode64
780 759 8:5 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/resolv.conf /etc/resolv.conf rw,relatime - ext4 /dev/sda5 rw,errors=remount-ro
781 759 8:5 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/hostname /etc/hostname rw,relatime - ext4 /dev/sda5 rw,errors=remount-ro
782 759 8:5 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/hosts /etc/hosts rw,relatime - ext4 /dev/sda5 rw,errors=remount-ro
718 761 0:60 /0 /dev/console rw,nosuid,noexec,relatime - devpts devpts rw,gid=5,mode=620,ptmxmode=666
719 760 0:58 /bus /proc/bus ro,nosuid,nodev,noexec,relatime - proc proc rw
720 760 0:58 /fs /proc/fs ro,nosuid,nodev,noexec,relatime - proc proc rw
721 760 0:58 /irq /proc/irq ro,nosuid,nodev,noexec,relatime - proc proc rw
722 760 0:58 /sys /proc/sys ro,nosuid,nodev,noexec,relatime - proc proc rw
723 760 0:58 /sysrq-trigger /proc/sysrq-trigger ro,nosuid,nodev,noexec,relatime - proc proc rw
724 760 0:64 / /proc/asound ro,relatime - tmpfs tmpfs ro,inode64
725 760 0:65 / /proc/acpi ro,relatime - tmpfs tmpfs ro,inode64
726 760 0:59 /null /proc/kcore rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
727 760 0:59 /null /proc/keys rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
728 760 0:59 /null /proc/timer_list rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
729 760 0:66 / /proc/scsi ro,relatime - tmpfs tmpfs ro,inode64
730 763 0:67 / /sys/firmware ro,relatime - tmpfs tmpfs ro,inode64
731 763 0:68 / /sys/devices/virtual/powercap ro,relatime - tmpfs tmpfs ro,inode64
''' # noqa: E501
DOCKER_CGROUPS_V2_MOUNTINFO_EXAMPLE = b'''\
721 386 0:45 / / rw,relatime master:218 - overlay overlay rw,lowerdir=/var/lib/docker/overlay2/l/QHZ7OM7P4AQD3XLG274ZPWAJCV:/var/lib/docker/overlay2/l/5RFG6SZWVGOG2NKEYXJDQCQYX5,upperdir=/var/lib/docker/overlay2/e4ad859fc5d4791932b9b976052f01fb0063e01de3cef916e40ae2121f6a166e/diff,workdir=/var/lib/docker/overlay2/e4ad859fc5d4791932b9b976052f01fb0063e01de3cef916e40ae2121f6a166e/work,nouserxattr
722 721 0:48 / /proc rw,nosuid,nodev,noexec,relatime - proc proc rw
723 721 0:50 / /dev rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
724 723 0:51 / /dev/pts rw,nosuid,noexec,relatime - devpts devpts rw,gid=5,mode=620,ptmxmode=666
725 721 0:52 / /sys ro,nosuid,nodev,noexec,relatime - sysfs sysfs ro
726 725 0:26 / /sys/fs/cgroup ro,nosuid,nodev,noexec,relatime - cgroup2 cgroup rw,nsdelegate,memory_recursiveprot
727 723 0:47 / /dev/mqueue rw,nosuid,nodev,noexec,relatime - mqueue mqueue rw
728 723 0:53 / /dev/shm rw,nosuid,nodev,noexec,relatime - tmpfs shm rw,size=65536k,inode64
729 721 8:3 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/resolv.conf /etc/resolv.conf rw,relatime - ext4 /dev/sda3 rw,errors=remount-ro
730 721 8:3 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/hostname /etc/hostname rw,relatime - ext4 /dev/sda3 rw,errors=remount-ro
731 721 8:3 /var/lib/docker/containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/hosts /etc/hosts rw,relatime - ext4 /dev/sda3 rw,errors=remount-ro
387 723 0:51 /0 /dev/console rw,nosuid,noexec,relatime - devpts devpts rw,gid=5,mode=620,ptmxmode=666
388 722 0:48 /bus /proc/bus ro,nosuid,nodev,noexec,relatime - proc proc rw
389 722 0:48 /fs /proc/fs ro,nosuid,nodev,noexec,relatime - proc proc rw
525 722 0:48 /irq /proc/irq ro,nosuid,nodev,noexec,relatime - proc proc rw
526 722 0:48 /sys /proc/sys ro,nosuid,nodev,noexec,relatime - proc proc rw
571 722 0:48 /sysrq-trigger /proc/sysrq-trigger ro,nosuid,nodev,noexec,relatime - proc proc rw
572 722 0:57 / /proc/asound ro,relatime - tmpfs tmpfs ro,inode64
575 722 0:58 / /proc/acpi ro,relatime - tmpfs tmpfs ro,inode64
576 722 0:50 /null /proc/kcore rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
577 722 0:50 /null /proc/keys rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
578 722 0:50 /null /proc/timer_list rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,inode64
579 722 0:59 / /proc/scsi ro,relatime - tmpfs tmpfs ro,inode64
580 725 0:60 / /sys/firmware ro,relatime - tmpfs tmpfs ro,inode64
''' # noqa: E501
PODMAN_CGROUPS_V1_MOUNTINFO_EXAMPLE = b'''\
1200 915 0:57 / / rw,relatime - overlay overlay rw,lowerdir=/home/asottile/.local/share/containers/storage/overlay/l/ZWAU3VY3ZHABQJRBUAFPBX7R5D,upperdir=/home/asottile/.local/share/containers/storage/overlay/72504ef163fda63838930450553b7306412ccad139a007626732b3dc43af5200/diff,workdir=/home/asottile/.local/share/containers/storage/overlay/72504ef163fda63838930450553b7306412ccad139a007626732b3dc43af5200/work,volatile,userxattr
1204 1200 0:62 / /proc rw,nosuid,nodev,noexec,relatime - proc proc rw
1205 1200 0:63 / /dev rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,uid=1000,gid=1000,inode64
1206 1200 0:64 / /sys ro,nosuid,nodev,noexec,relatime - sysfs sysfs rw
1207 1205 0:65 / /dev/pts rw,nosuid,noexec,relatime - devpts devpts rw,gid=100004,mode=620,ptmxmode=666
1208 1205 0:61 / /dev/mqueue rw,nosuid,nodev,noexec,relatime - mqueue mqueue rw
1209 1200 0:53 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/.containerenv /run/.containerenv rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=814036k,mode=700,uid=1000,gid=1000,inode64
1210 1200 0:53 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/resolv.conf /etc/resolv.conf rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=814036k,mode=700,uid=1000,gid=1000,inode64
1211 1200 0:53 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/hosts /etc/hosts rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=814036k,mode=700,uid=1000,gid=1000,inode64
1212 1205 0:56 / /dev/shm rw,relatime - tmpfs shm rw,size=64000k,uid=1000,gid=1000,inode64
1213 1200 0:53 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/hostname /etc/hostname rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=814036k,mode=700,uid=1000,gid=1000,inode64
1214 1206 0:66 / /sys/fs/cgroup rw,nosuid,nodev,noexec,relatime - tmpfs cgroup rw,size=1024k,uid=1000,gid=1000,inode64
1215 1214 0:43 / /sys/fs/cgroup/freezer ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,freezer
1216 1214 0:42 /user.slice /sys/fs/cgroup/devices ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,devices
1217 1214 0:41 / /sys/fs/cgroup/hugetlb ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,hugetlb
1218 1214 0:40 / /sys/fs/cgroup/misc ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,misc
1219 1214 0:39 / /sys/fs/cgroup/blkio ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,blkio
1220 1214 0:38 / /sys/fs/cgroup/net_cls,net_prio ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,net_cls,net_prio
1221 1214 0:37 / /sys/fs/cgroup/perf_event ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,perf_event
1222 1214 0:36 /user.slice/user-1000.slice/user@1000.service /sys/fs/cgroup/memory ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,memory
1223 1214 0:35 /user.slice/user-1000.slice/user@1000.service /sys/fs/cgroup/pids ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,pids
1224 1214 0:34 / /sys/fs/cgroup/cpuset ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,cpuset
1225 1214 0:33 / /sys/fs/cgroup/cpu,cpuacct ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,cpu,cpuacct
1226 1214 0:32 / /sys/fs/cgroup/rdma ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,rdma
1227 1214 0:29 /user.slice/user-1000.slice/user@1000.service/apps.slice/apps-org.gnome.Terminal.slice/vte-spawn-0c50448e-b395-4d76-8b92-379f16e5066f.scope /sys/fs/cgroup/systemd ro,nosuid,nodev,noexec,relatime - cgroup cgroup rw,xattr,name=systemd
1228 1205 0:5 /null /dev/null rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1229 1205 0:5 /zero /dev/zero rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1230 1205 0:5 /full /dev/full rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1231 1205 0:5 /tty /dev/tty rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1232 1205 0:5 /random /dev/random rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1233 1205 0:5 /urandom /dev/urandom rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1234 1204 0:67 / /proc/acpi ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
1235 1204 0:5 /null /proc/kcore rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1236 1204 0:5 /null /proc/keys rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1237 1204 0:5 /null /proc/timer_list rw,nosuid,noexec,relatime - devtmpfs udev rw,size=4031656k,nr_inodes=1007914,mode=755,inode64
1238 1204 0:68 / /proc/scsi ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
1239 1206 0:69 / /sys/firmware ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
1240 1206 0:70 / /sys/dev/block ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
1241 1204 0:62 /asound /proc/asound ro,relatime - proc proc rw
1242 1204 0:62 /bus /proc/bus ro,relatime - proc proc rw
1243 1204 0:62 /fs /proc/fs ro,relatime - proc proc rw
1244 1204 0:62 /irq /proc/irq ro,relatime - proc proc rw
1245 1204 0:62 /sys /proc/sys ro,relatime - proc proc rw
1256 1204 0:62 /sysrq-trigger /proc/sysrq-trigger ro,relatime - proc proc rw
916 1205 0:65 /0 /dev/console rw,relatime - devpts devpts rw,gid=100004,mode=620,ptmxmode=666
''' # noqa: E501
PODMAN_CGROUPS_V2_MOUNTINFO_EXAMPLE = b'''\
685 690 0:63 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/resolv.conf /etc/resolv.conf rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=1637624k,nr_inodes=409406,mode=700,uid=1000,gid=1000,inode64
686 690 0:63 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/hosts /etc/hosts rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=1637624k,nr_inodes=409406,mode=700,uid=1000,gid=1000,inode64
687 692 0:50 / /dev/shm rw,nosuid,nodev,noexec,relatime - tmpfs shm rw,size=64000k,uid=1000,gid=1000,inode64
688 690 0:63 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/.containerenv /run/.containerenv rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=1637624k,nr_inodes=409406,mode=700,uid=1000,gid=1000,inode64
689 690 0:63 /containers/overlay-containers/c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7/userdata/hostname /etc/hostname rw,nosuid,nodev,relatime - tmpfs tmpfs rw,size=1637624k,nr_inodes=409406,mode=700,uid=1000,gid=1000,inode64
690 546 0:55 / / rw,relatime - overlay overlay rw,lowerdir=/home/asottile/.local/share/containers/storage/overlay/l/NPOHYOD3PI3YW6TQSGBOVOUSK6,upperdir=/home/asottile/.local/share/containers/storage/overlay/565c206fb79f876ffd5f069b8bd7a97fb5e47d5d07396b0c395a4ed6725d4a8e/diff,workdir=/home/asottile/.local/share/containers/storage/overlay/565c206fb79f876ffd5f069b8bd7a97fb5e47d5d07396b0c395a4ed6725d4a8e/work,redirect_dir=nofollow,uuid=on,volatile,userxattr
691 690 0:59 / /proc rw,nosuid,nodev,noexec,relatime - proc proc rw
692 690 0:61 / /dev rw,nosuid - tmpfs tmpfs rw,size=65536k,mode=755,uid=1000,gid=1000,inode64
693 690 0:62 / /sys ro,nosuid,nodev,noexec,relatime - sysfs sysfs rw
694 692 0:66 / /dev/pts rw,nosuid,noexec,relatime - devpts devpts rw,gid=100004,mode=620,ptmxmode=666
695 692 0:58 / /dev/mqueue rw,nosuid,nodev,noexec,relatime - mqueue mqueue rw
696 693 0:28 / /sys/fs/cgroup ro,nosuid,nodev,noexec,relatime - cgroup2 cgroup2 rw,nsdelegate,memory_recursiveprot
698 692 0:6 /null /dev/null rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
699 692 0:6 /zero /dev/zero rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
700 692 0:6 /full /dev/full rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
701 692 0:6 /tty /dev/tty rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
702 692 0:6 /random /dev/random rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
703 692 0:6 /urandom /dev/urandom rw,nosuid,noexec,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
704 691 0:67 / /proc/acpi ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
705 691 0:6 /null /proc/kcore ro,nosuid,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
706 691 0:6 /null /proc/keys ro,nosuid,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
707 691 0:6 /null /proc/latency_stats ro,nosuid,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
708 691 0:6 /null /proc/timer_list ro,nosuid,relatime - devtmpfs udev rw,size=8147812k,nr_inodes=2036953,mode=755,inode64
709 691 0:68 / /proc/scsi ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
710 693 0:69 / /sys/firmware ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
711 693 0:70 / /sys/dev/block ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
712 693 0:71 / /sys/devices/virtual/powercap ro,relatime - tmpfs tmpfs rw,size=0k,uid=1000,gid=1000,inode64
713 691 0:59 /asound /proc/asound ro,nosuid,nodev,noexec,relatime - proc proc rw
714 691 0:59 /bus /proc/bus ro,nosuid,nodev,noexec,relatime - proc proc rw
715 691 0:59 /fs /proc/fs ro,nosuid,nodev,noexec,relatime - proc proc rw
716 691 0:59 /irq /proc/irq ro,nosuid,nodev,noexec,relatime - proc proc rw
717 691 0:59 /sys /proc/sys ro,nosuid,nodev,noexec,relatime - proc proc rw
718 691 0:59 /sysrq-trigger /proc/sysrq-trigger ro,nosuid,nodev,noexec,relatime - proc proc rw
547 692 0:66 /0 /dev/console rw,relatime - devpts devpts rw,gid=100004,mode=620,ptmxmode=666
''' # noqa: E501 ''' # noqa: E501
# The ID should match the above cgroup example. # The ID should match the above cgroup example.
CONTAINER_ID = 'c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7' # noqa: E501 CONTAINER_ID = 'c33988ec7651ebc867cb24755eaf637a6734088bc7eef59d5799293a9e5450f7' # noqa: E501
NON_DOCKER_CGROUP_EXAMPLE = b'''\ NON_DOCKER_MOUNTINFO_EXAMPLE = b'''\
12:perf_event:/ 21 27 0:19 / /sys rw,nosuid,nodev,noexec,relatime shared:7 - sysfs sysfs rw
11:hugetlb:/ 22 27 0:20 / /proc rw,nosuid,nodev,noexec,relatime shared:14 - proc proc rw
10:devices:/ 23 27 0:5 / /dev rw,nosuid,relatime shared:2 - devtmpfs udev rw,size=10219484k,nr_inodes=2554871,mode=755,inode64
9:blkio:/ 24 23 0:21 / /dev/pts rw,nosuid,noexec,relatime shared:3 - devpts devpts rw,gid=5,mode=620,ptmxmode=000
8:rdma:/ 25 27 0:22 / /run rw,nosuid,nodev,noexec,relatime shared:5 - tmpfs tmpfs rw,size=2047768k,mode=755,inode64
7:cpuset:/ 27 1 8:2 / / rw,relatime shared:1 - ext4 /dev/sda2 rw,errors=remount-ro
6:cpu,cpuacct:/ 28 21 0:6 / /sys/kernel/security rw,nosuid,nodev,noexec,relatime shared:8 - securityfs securityfs rw
5:freezer:/ 29 23 0:24 / /dev/shm rw,nosuid,nodev shared:4 - tmpfs tmpfs rw,inode64
4:memory:/ 30 25 0:25 / /run/lock rw,nosuid,nodev,noexec,relatime shared:6 - tmpfs tmpfs rw,size=5120k,inode64
3:pids:/ ''' # noqa: E501
2:net_cls,net_prio:/
1:name=systemd:/init.scope
0::/init.scope
'''
def test_docker_fallback_user(): def test_docker_fallback_user():
@ -62,9 +195,46 @@ def test_docker_fallback_user():
assert docker.get_docker_user() == () assert docker.get_docker_user() == ()
def test_in_docker_no_file(): @pytest.fixture(autouse=True)
def _avoid_cache():
with mock.patch.object(
docker,
'_is_rootless',
docker._is_rootless.__wrapped__,
):
yield
@pytest.mark.parametrize(
'info_ret',
(
(0, b'{"SecurityOptions": ["name=rootless","name=cgroupns"]}', b''),
(0, b'{"host": {"security": {"rootless": true}}}', b''),
),
)
def test_docker_user_rootless(info_ret):
with mock.patch.object(docker, 'cmd_output_b', return_value=info_ret):
assert docker.get_docker_user() == ()
@pytest.mark.parametrize(
'info_ret',
(
(0, b'{"SecurityOptions": ["name=cgroupns"]}', b''),
(0, b'{"host": {"security": {"rootless": false}}}', b''),
(0, b'{"response_from_some_other_container_engine": true}', b''),
(0, b'{"SecurityOptions": null}', b''),
(1, b'', b''),
),
)
def test_docker_user_non_rootless(info_ret):
with mock.patch.object(docker, 'cmd_output_b', return_value=info_ret):
assert docker.get_docker_user() != ()
def test_container_id_no_file():
with mock.patch.object(builtins, 'open', side_effect=FileNotFoundError): with mock.patch.object(builtins, 'open', side_effect=FileNotFoundError):
assert docker._is_in_docker() is False assert docker._get_container_id() is None
def _mock_open(data): def _mock_open(data):
@ -76,34 +246,29 @@ def _mock_open(data):
) )
def test_in_docker_docker_in_file(): def test_container_id_not_in_file():
with _mock_open(DOCKER_CGROUP_EXAMPLE): with _mock_open(NON_DOCKER_MOUNTINFO_EXAMPLE):
assert docker._is_in_docker() is True assert docker._get_container_id() is None
def test_in_docker_docker_not_in_file():
with _mock_open(NON_DOCKER_CGROUP_EXAMPLE):
assert docker._is_in_docker() is False
def test_get_container_id(): def test_get_container_id():
with _mock_open(DOCKER_CGROUP_EXAMPLE): with _mock_open(DOCKER_CGROUPS_V1_MOUNTINFO_EXAMPLE):
assert docker._get_container_id() == CONTAINER_ID
with _mock_open(DOCKER_CGROUPS_V2_MOUNTINFO_EXAMPLE):
assert docker._get_container_id() == CONTAINER_ID
with _mock_open(PODMAN_CGROUPS_V1_MOUNTINFO_EXAMPLE):
assert docker._get_container_id() == CONTAINER_ID
with _mock_open(PODMAN_CGROUPS_V2_MOUNTINFO_EXAMPLE):
assert docker._get_container_id() == CONTAINER_ID assert docker._get_container_id() == CONTAINER_ID
def test_get_container_id_failure():
with _mock_open(b''), pytest.raises(RuntimeError):
docker._get_container_id()
def test_get_docker_path_not_in_docker_returns_same(): def test_get_docker_path_not_in_docker_returns_same():
with mock.patch.object(docker, '_is_in_docker', return_value=False): with _mock_open(b''):
assert docker._get_docker_path('abc') == 'abc' assert docker._get_docker_path('abc') == 'abc'
@pytest.fixture @pytest.fixture
def in_docker(): def in_docker():
with mock.patch.object(docker, '_is_in_docker', return_value=True):
with mock.patch.object( with mock.patch.object(
docker, '_get_container_id', return_value=CONTAINER_ID, docker, '_get_container_id', return_value=CONTAINER_ID,
): ):
@ -195,3 +360,14 @@ CMD ["echo", "This is overwritten by the entry"']
ret = run_language(tmp_path, docker, 'echo hello hello world') ret = run_language(tmp_path, docker, 'echo hello hello world')
assert ret == (0, b'hello hello world\n') assert ret == (0, b'hello hello world\n')
@xfailif_windows # pragma: win32 no cover
def test_docker_hook_mount_permissions(tmp_path):
dockerfile = '''\
FROM ubuntu:22.04
'''
tmp_path.joinpath('Dockerfile').write_text(dockerfile)
retcode, _ = run_language(tmp_path, docker, 'touch', ('README.md',))
assert retcode == 0
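
The fixtures above swap /proc/1/cgroup parsing for /proc/self/mountinfo parsing. As a hedged sketch of what identifying the container from mountinfo can look like — the field positions follow the mountinfo format, but the regex and the choice of the /etc/hostname mount are assumptions made for this illustration, not pre-commit's actual logic:

from __future__ import annotations

import re

_HEX_ID = re.compile(r'/([0-9a-f]{64})/')


def container_id_from_mountinfo(mountinfo: bytes) -> str | None:
    for line in mountinfo.decode().splitlines():
        fields = line.split()
        # mountinfo: <id> <parent id> <maj:min> <root> <mount point> ...
        if len(fields) >= 5 and fields[4] == '/etc/hostname':
            match = _HEX_ID.search(fields[3])
            if match:
                return match.group(1)
    return None


assert container_id_from_mountinfo(b'') is None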

View file

@ -27,7 +27,7 @@ def _csproj(tool_name):
<Project Sdk="Microsoft.NET.Sdk"> <Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup> <PropertyGroup>
<OutputType>Exe</OutputType> <OutputType>Exe</OutputType>
<TargetFramework>net6</TargetFramework> <TargetFramework>net8</TargetFramework>
<PackAsTool>true</PackAsTool> <PackAsTool>true</PackAsTool>
<ToolCommandName>{tool_name}</ToolCommandName> <ToolCommandName>{tool_name}</ToolCommandName>
<PackageOutputPath>./nupkg</PackageOutputPath> <PackageOutputPath>./nupkg</PackageOutputPath>

View file

@ -11,11 +11,13 @@ from pre_commit.commands.install_uninstall import install
from pre_commit.envcontext import envcontext from pre_commit.envcontext import envcontext
from pre_commit.languages import golang from pre_commit.languages import golang
from pre_commit.store import _make_local_repo from pre_commit.store import _make_local_repo
from pre_commit.util import CalledProcessError
from pre_commit.util import cmd_output from pre_commit.util import cmd_output
from testing.fixtures import add_config_to_repo from testing.fixtures import add_config_to_repo
from testing.fixtures import make_config_from_repo from testing.fixtures import make_config_from_repo
from testing.language_helpers import run_language from testing.language_helpers import run_language
from testing.util import cmd_output_mocked_pre_commit_home from testing.util import cmd_output_mocked_pre_commit_home
from testing.util import cwd
from testing.util import git_commit from testing.util import git_commit
@ -165,3 +167,70 @@ def test_during_commit_all(tmp_path, tempdir_factory, store, in_git_dir):
fn=cmd_output_mocked_pre_commit_home, fn=cmd_output_mocked_pre_commit_home,
tempdir_factory=tempdir_factory, tempdir_factory=tempdir_factory,
) )
def test_automatic_toolchain_switching(tmp_path):
go_mod = '''\
module toolchain-version-test
go 1.23.1
'''
main_go = '''\
package main
func main() {}
'''
tmp_path.joinpath('go.mod').write_text(go_mod)
mod_dir = tmp_path.joinpath('toolchain-version-test')
mod_dir.mkdir()
main_file = mod_dir.joinpath('main.go')
main_file.write_text(main_go)
with pytest.raises(CalledProcessError) as excinfo:
run_language(
path=tmp_path,
language=golang,
version='1.22.0',
exe='golang-version-test',
)
assert 'go.mod requires go >= 1.23.1' in excinfo.value.stderr.decode()
def test_automatic_toolchain_switching_go_fmt(tmp_path, monkeypatch):
go_mod_hook = '''\
module toolchain-version-test
go 1.22.0
'''
go_mod = '''\
module toolchain-version-test
go 1.23.1
'''
main_go = '''\
package main
func main() {}
'''
hook_dir = tmp_path.joinpath('hook')
hook_dir.mkdir()
hook_dir.joinpath('go.mod').write_text(go_mod_hook)
test_dir = tmp_path.joinpath('test')
test_dir.mkdir()
test_dir.joinpath('go.mod').write_text(go_mod)
main_file = test_dir.joinpath('main.go')
main_file.write_text(main_go)
with cwd(test_dir):
ret, out = run_language(
path=hook_dir,
language=golang,
version='1.22.0',
exe='go fmt',
file_args=(str(main_file),),
)
assert ret == 1
assert 'go.mod requires go >= 1.23.1' in out.decode()

View file

@ -0,0 +1,111 @@
from __future__ import annotations
import os
from unittest import mock
from pre_commit.languages import julia
from testing.language_helpers import run_language
from testing.util import cwd
def _make_hook(tmp_path, julia_code):
src_dir = tmp_path.joinpath('src')
src_dir.mkdir()
src_dir.joinpath('main.jl').write_text(julia_code)
tmp_path.joinpath('Project.toml').write_text(
'[deps]\n'
'Example = "7876af07-990d-54b4-ab0e-23690620f79a"\n',
)
def test_julia_hook(tmp_path):
code = """
using Example
function main()
println("Hello, world!")
end
main()
"""
_make_hook(tmp_path, code)
expected = (0, b'Hello, world!\n')
assert run_language(tmp_path, julia, 'src/main.jl') == expected
def test_julia_hook_with_startup(tmp_path):
depot_path = tmp_path.joinpath('depot')
depot_path.joinpath('config').mkdir(parents=True)
startup = depot_path.joinpath('config', 'startup.jl')
startup.write_text('error("Startup file used!")\n')
depot_path_var = f'{depot_path}{os.pathsep}'
with mock.patch.dict(os.environ, {'JULIA_DEPOT_PATH': depot_path_var}):
test_julia_hook(tmp_path)
def test_julia_hook_manifest(tmp_path):
code = """
using Example
println(pkgversion(Example))
"""
_make_hook(tmp_path, code)
tmp_path.joinpath('Manifest.toml').write_text(
'manifest_format = "2.0"\n\n'
'[[deps.Example]]\n'
'git-tree-sha1 = "11820aa9c229fd3833d4bd69e5e75ef4e7273bf1"\n'
'uuid = "7876af07-990d-54b4-ab0e-23690620f79a"\n'
'version = "0.5.4"\n',
)
expected = (0, b'0.5.4\n')
assert run_language(tmp_path, julia, 'src/main.jl') == expected
def test_julia_hook_args(tmp_path):
code = """
function main(argv)
foreach(println, argv)
end
main(ARGS)
"""
_make_hook(tmp_path, code)
expected = (0, b'--arg1\n--arg2\n')
assert run_language(
tmp_path, julia, 'src/main.jl --arg1 --arg2',
) == expected
def test_julia_hook_additional_deps(tmp_path):
code = """
using TOML
function main()
project_file = Base.active_project()
dict = TOML.parsefile(project_file)
for (k, v) in dict["deps"]
println(k, " = ", v)
end
end
main()
"""
_make_hook(tmp_path, code)
deps = ('TOML=fa267f1f-6049-4f14-aa54-33bafae1ed76',)
ret, out = run_language(tmp_path, julia, 'src/main.jl', deps=deps)
assert ret == 0
assert b'Example = 7876af07-990d-54b4-ab0e-23690620f79a' in out
assert b'TOML = fa267f1f-6049-4f14-aa54-33bafae1ed76' in out
def test_julia_repo_local(tmp_path):
env_dir = tmp_path.joinpath('envdir')
env_dir.mkdir()
local_dir = tmp_path.joinpath('local')
local_dir.mkdir()
local_dir.joinpath('local.jl').write_text(
'using TOML; foreach(println, ARGS)',
)
with cwd(local_dir):
deps = ('TOML=fa267f1f-6049-4f14-aa54-33bafae1ed76',)
expected = (0, b'--local-arg1\n--local-arg2\n')
assert run_language(
env_dir, julia, 'local.jl --local-arg1 --local-arg2',
deps=deps, is_local=True,
) == expected

View file

@ -10,8 +10,11 @@ import pre_commit.constants as C
from pre_commit.envcontext import envcontext from pre_commit.envcontext import envcontext
from pre_commit.languages import python from pre_commit.languages import python
from pre_commit.prefix import Prefix from pre_commit.prefix import Prefix
from pre_commit.store import _make_local_repo
from pre_commit.util import cmd_output_b
from pre_commit.util import make_executable from pre_commit.util import make_executable
from pre_commit.util import win_exe from pre_commit.util import win_exe
from testing.auto_namedtuple import auto_namedtuple
from testing.language_helpers import run_language from testing.language_helpers import run_language
@ -34,6 +37,72 @@ def test_read_pyvenv_cfg_non_utf8(tmpdir):
assert python._read_pyvenv_cfg(pyvenv_cfg) == expected assert python._read_pyvenv_cfg(pyvenv_cfg) == expected
def _get_default_version(
*,
impl: str,
exe: str,
found: set[str],
version: tuple[int, int],
) -> str:
sys_exe = f'/fake/path/{exe}'
sys_impl = auto_namedtuple(name=impl)
sys_ver = auto_namedtuple(major=version[0], minor=version[1])
def find_exe(s):
if s in found:
return f'/fake/path/found/{exe}'
else:
return None
with (
mock.patch.object(sys, 'implementation', sys_impl),
mock.patch.object(sys, 'executable', sys_exe),
mock.patch.object(sys, 'version_info', sys_ver),
mock.patch.object(python, 'find_executable', find_exe),
):
return python.get_default_version.__wrapped__()
def test_default_version_sys_executable_found():
ret = _get_default_version(
impl='cpython',
exe='python3.12',
found={'python3.12'},
version=(3, 12),
)
assert ret == 'python3.12'
def test_default_version_picks_specific_when_found():
ret = _get_default_version(
impl='cpython',
exe='python3',
found={'python3', 'python3.12'},
version=(3, 12),
)
assert ret == 'python3.12'
def test_default_version_picks_pypy_versioned_exe():
ret = _get_default_version(
impl='pypy',
exe='python',
found={'pypy3.12', 'python3'},
version=(3, 12),
)
assert ret == 'pypy3.12'
def test_default_version_picks_pypy_unversioned_exe():
ret = _get_default_version(
impl='pypy',
exe='python',
found={'pypy3', 'python3'},
version=(3, 12),
)
assert ret == 'pypy3'
def test_norm_version_expanduser(): def test_norm_version_expanduser():
home = os.path.expanduser('~') home = os.path.expanduser('~')
if sys.platform == 'win32': # pragma: win32 cover if sys.platform == 'win32': # pragma: win32 cover
@ -284,3 +353,15 @@ def test_python_hook_weird_setup_cfg(tmp_path):
ret = run_language(tmp_path, python, 'socks', [os.devnull]) ret = run_language(tmp_path, python, 'socks', [os.devnull])
assert ret == (0, f'[{os.devnull!r}]\nhello hello\n'.encode()) assert ret == (0, f'[{os.devnull!r}]\nhello hello\n'.encode())
def test_local_repo_with_other_artifacts(tmp_path):
cmd_output_b('git', 'init', tmp_path)
_make_local_repo(str(tmp_path))
# pretend a rust install also ran here
tmp_path.joinpath('target').mkdir()
ret, out = run_language(tmp_path, python, 'python --version')
assert ret == 0
assert out.startswith(b'Python ')
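
The four test_default_version_* cases describe a preference order rather than an algorithm. Here is a reconstruction that satisfies exactly those cases — illustrative only; pre-commit's real get_default_version also consults sys.executable and other details not visible in this diff:

from __future__ import annotations


def pick_default_version(
        *,
        impl: str,
        exe: str,
        found: set[str],
        version: tuple[int, int],
) -> str:
    # 'cpython' executables are named python*, other impls use their own name
    prefix = 'python' if impl == 'cpython' else impl
    for candidate in (
            f'{prefix}{version[0]}.{version[1]}',  # e.g. python3.12 / pypy3.12
            f'{prefix}{version[0]}',               # e.g. python3 / pypy3
            exe,                                   # basename of sys.executable
    ):
        if candidate in found:
            return candidate
    return 'default'


assert pick_default_version(
    impl='cpython', exe='python3', found={'python3', 'python3.12'},
    version=(3, 12),
) == 'python3.12'
assert pick_default_version(
    impl='pypy', exe='python', found={'pypy3', 'python3'}, version=(3, 12),
) == 'pypy3'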

View file

@ -1,14 +1,17 @@
from __future__ import annotations from __future__ import annotations
import os.path import os.path
import shutil from unittest import mock
import pytest import pytest
import pre_commit.constants as C
from pre_commit import envcontext from pre_commit import envcontext
from pre_commit import lang_base
from pre_commit.languages import r from pre_commit.languages import r
from pre_commit.prefix import Prefix from pre_commit.prefix import Prefix
from pre_commit.store import _make_local_repo from pre_commit.store import _make_local_repo
from pre_commit.util import resource_text
from pre_commit.util import win_exe from pre_commit.util import win_exe
from testing.language_helpers import run_language from testing.language_helpers import run_language
@ -127,7 +130,8 @@ def test_path_rscript_exec_no_r_home_set():
assert r._rscript_exec() == 'Rscript' assert r._rscript_exec() == 'Rscript'
def test_r_hook(tmp_path): @pytest.fixture
def renv_lock_file(tmp_path):
renv_lock = '''\ renv_lock = '''\
{ {
"R": { "R": {
@ -157,6 +161,12 @@ def test_r_hook(tmp_path):
} }
} }
''' '''
tmp_path.joinpath('renv.lock').write_text(renv_lock)
yield
@pytest.fixture
def description_file(tmp_path):
description = '''\ description = '''\
Package: gli.clu Package: gli.clu
Title: What the Package Does (One Line, Title Case) Title: What the Package Does (One Line, Title Case)
@ -178,27 +188,39 @@ RoxygenNote: 7.1.1
Imports: Imports:
rprojroot rprojroot
''' '''
hello_world_r = '''\ tmp_path.joinpath('DESCRIPTION').write_text(description)
yield
@pytest.fixture
def hello_world_file(tmp_path):
hello_world = '''\
stopifnot( stopifnot(
packageVersion('rprojroot') == '1.0', packageVersion('rprojroot') == '1.0',
packageVersion('gli.clu') == '0.0.0.9000' packageVersion('gli.clu') == '0.0.0.9000'
) )
cat("Hello, World, from R!\n") cat("Hello, World, from R!\n")
''' '''
tmp_path.joinpath('hello-world.R').write_text(hello_world)
yield
tmp_path.joinpath('renv.lock').write_text(renv_lock)
tmp_path.joinpath('DESCRIPTION').write_text(description) @pytest.fixture
tmp_path.joinpath('hello-world.R').write_text(hello_world_r) def renv_folder(tmp_path):
renv_dir = tmp_path.joinpath('renv') renv_dir = tmp_path.joinpath('renv')
renv_dir.mkdir() renv_dir.mkdir()
shutil.copy( activate_r = resource_text('empty_template_activate.R')
os.path.join( renv_dir.joinpath('activate.R').write_text(activate_r)
os.path.dirname(__file__), yield
'../../pre_commit/resources/empty_template_activate.R',
),
renv_dir.joinpath('activate.R'),
)
def test_r_hook(
tmp_path,
renv_lock_file,
description_file,
hello_world_file,
renv_folder,
):
expected = (0, b'Hello, World, from R!\n') expected = (0, b'Hello, World, from R!\n')
assert run_language(tmp_path, r, 'Rscript hello-world.R') == expected assert run_language(tmp_path, r, 'Rscript hello-world.R') == expected
@ -221,3 +243,55 @@ Rscript -e '
args=('hi', 'hello'), args=('hi', 'hello'),
) )
assert ret == (0, b'hi, hello, from R!\n') assert ret == (0, b'hi, hello, from R!\n')
@pytest.fixture
def prefix(tmpdir):
yield Prefix(str(tmpdir))
@pytest.fixture
def installed_environment(
renv_lock_file,
hello_world_file,
renv_folder,
prefix,
):
env_dir = lang_base.environment_dir(
prefix, r.ENVIRONMENT_DIR, r.get_default_version(),
)
r.install_environment(prefix, C.DEFAULT, ())
yield prefix, env_dir
def test_health_check_healthy(installed_environment):
# should be healthy right after creation
prefix, _ = installed_environment
assert r.health_check(prefix, C.DEFAULT) is None
def test_health_check_after_downgrade(installed_environment):
prefix, _ = installed_environment
# pretend the saved installed version is old
with mock.patch.object(r, '_read_installed_version', return_value='1.0.0'):
output = r.health_check(prefix, C.DEFAULT)
assert output is not None
assert output.startswith('Hooks were installed for R version')
@pytest.mark.parametrize('version', ('NULL', 'NA', "''"))
def test_health_check_without_version(prefix, installed_environment, version):
prefix, env_dir = installed_environment
# simulate old pre-commit install by unsetting the installed version
r._execute_r_in_renv(
f'renv::settings$r.version({version})',
prefix=prefix, version=C.DEFAULT, cwd=env_dir,
)
# no R version specified fails as unhealthy
msg = 'Hooks were installed with an unknown R version'
check_output = r.health_check(prefix, C.DEFAULT)
assert check_output is not None and check_output.startswith(msg)

View file

@ -1,9 +0,0 @@
from __future__ import annotations
from pre_commit.languages import system
from testing.language_helpers import run_language
def test_system_language(tmp_path):
expected = (0, b'hello hello world\n')
assert run_language(tmp_path, system, 'echo hello hello world') == expected

View file

@ -1,14 +1,14 @@
from __future__ import annotations from __future__ import annotations
from pre_commit.languages import script from pre_commit.languages import unsupported_script
from pre_commit.util import make_executable from pre_commit.util import make_executable
from testing.language_helpers import run_language from testing.language_helpers import run_language
def test_script_language(tmp_path): def test_unsupported_script_language(tmp_path):
exe = tmp_path.joinpath('main') exe = tmp_path.joinpath('main')
exe.write_text('#!/usr/bin/env bash\necho hello hello world\n') exe.write_text('#!/usr/bin/env bash\necho hello hello world\n')
make_executable(exe) make_executable(exe)
expected = (0, b'hello hello world\n') expected = (0, b'hello hello world\n')
assert run_language(tmp_path, script, 'main') == expected assert run_language(tmp_path, unsupported_script, 'main') == expected

View file

@ -0,0 +1,10 @@
from __future__ import annotations
from pre_commit.languages import unsupported
from testing.language_helpers import run_language
def test_unsupported_language(tmp_path):
expected = (0, b'hello hello world\n')
ret = run_language(tmp_path, unsupported, 'echo hello hello world')
assert ret == expected

View file

@ -1,6 +1,7 @@
from __future__ import annotations from __future__ import annotations
import argparse import argparse
import contextlib
import os.path import os.path
from unittest import mock from unittest import mock
@ -8,6 +9,7 @@ import pytest
import pre_commit.constants as C import pre_commit.constants as C
from pre_commit import main from pre_commit import main
from pre_commit.commands import hazmat
from pre_commit.errors import FatalError from pre_commit.errors import FatalError
from pre_commit.util import cmd_output from pre_commit.util import cmd_output
from testing.auto_namedtuple import auto_namedtuple from testing.auto_namedtuple import auto_namedtuple
@ -97,11 +99,9 @@ CMDS = tuple(fn.replace('_', '-') for fn in FNS)
@pytest.fixture @pytest.fixture
def mock_commands(): def mock_commands():
mcks = {fn: mock.patch.object(main, fn).start() for fn in FNS} with contextlib.ExitStack() as ctx:
ret = auto_namedtuple(**mcks) mcks = {f: ctx.enter_context(mock.patch.object(main, f)) for f in FNS}
yield ret yield auto_namedtuple(**mcks)
for mck in ret:
mck.stop()
@pytest.fixture @pytest.fixture
@ -158,6 +158,17 @@ def test_all_cmds(command, mock_commands, mock_store_dir):
assert_only_one_mock_called(mock_commands) assert_only_one_mock_called(mock_commands)
def test_hazmat(mock_store_dir):
with mock.patch.object(hazmat, 'impl') as mck:
main.main(('hazmat', 'cd', 'subdir', '--', 'cmd', '--', 'f1', 'f2'))
assert mck.call_count == 1
(arg,), dct = mck.call_args
assert dct == {}
assert arg.tool == 'cd'
assert arg.subdir == 'subdir'
assert arg.cmd == ['cmd', '--', 'f1', 'f2']
def test_try_repo(mock_store_dir): def test_try_repo(mock_store_dir):
with mock.patch.object(main, 'try_repo') as patch: with mock.patch.object(main, 'try_repo') as patch:
main.main(('try-repo', '.')) main.main(('try-repo', '.'))

View file

@ -17,7 +17,7 @@ from pre_commit.clientlib import CONFIG_SCHEMA
from pre_commit.clientlib import load_manifest from pre_commit.clientlib import load_manifest
from pre_commit.hook import Hook from pre_commit.hook import Hook
from pre_commit.languages import python from pre_commit.languages import python
from pre_commit.languages import system from pre_commit.languages import unsupported
from pre_commit.prefix import Prefix from pre_commit.prefix import Prefix
from pre_commit.repository import _hook_installed from pre_commit.repository import _hook_installed
from pre_commit.repository import all_hooks from pre_commit.repository import all_hooks
@@ -80,31 +80,6 @@ def _test_hook_repo(
     assert out == expected
 
 
-def test_python_venv_deprecation(store, caplog):
-    config = {
-        'repo': 'local',
-        'hooks': [{
-            'id': 'example',
-            'name': 'example',
-            'language': 'python_venv',
-            'entry': 'echo hi',
-        }],
-    }
-    _get_hook(config, store, 'example')
-    assert caplog.messages[-1] == (
-        '`repo: local` uses deprecated `language: python_venv`. '
-        'This is an alias for `language: python`. '
-        'Often `pre-commit autoupdate --repo local` will fix this.'
-    )
-
-
-def test_system_hook_with_spaces(tempdir_factory, store):
-    _test_hook_repo(
-        tempdir_factory, store, 'system_hook_with_spaces_repo',
-        'system-hook-with-spaces', [os.devnull], b'Hello World\n',
-    )
-
-
 def test_missing_executable(tempdir_factory, store):
     _test_hook_repo(
         tempdir_factory, store, 'not_found_exe',
@@ -240,16 +215,16 @@ def test_unknown_keys(store, caplog):
     assert msg == 'Unexpected key(s) present on local => too-much: foo, hello'
 
 
-def test_reinstall(tempdir_factory, store, log_info_mock):
+def test_reinstall(tempdir_factory, store, caplog):
     path = make_repo(tempdir_factory, 'python_hooks_repo')
     config = make_config_from_repo(path)
     _get_hook(config, store, 'foo')
     # We print some logging during clone (1) + install (3)
-    assert log_info_mock.call_count == 4
-    log_info_mock.reset_mock()
+    assert len(caplog.record_tuples) == 4
+    caplog.clear()
     # Reinstall on another run should not trigger another install
     _get_hook(config, store, 'foo')
-    assert log_info_mock.call_count == 0
+    assert len(caplog.record_tuples) == 0
 
 
 def test_control_c_control_c_on_install(tempdir_factory, store):
@@ -449,7 +424,7 @@ def test_manifest_hooks(tempdir_factory, store):
         exclude_types=[],
         files='',
         id='bash_hook',
-        language='script',
+        language='unsupported_script',
         language_version='default',
         log_file='',
         minimum_pre_commit_version='0',
@@ -482,7 +457,7 @@ def test_non_installable_hook_error_for_language_version(store, caplog):
         'hooks': [{
             'id': 'system-hook',
             'name': 'system-hook',
-            'language': 'system',
+            'language': 'unsupported',
             'entry': 'python3 -c "import sys; print(sys.version)"',
             'language_version': 'python3.10',
         }],
@@ -494,7 +469,7 @@ def test_non_installable_hook_error_for_language_version(store, caplog):
     msg, = caplog.messages
     assert msg == (
         'The hook `system-hook` specifies `language_version` but is using '
-        'language `system` which does not install an environment. '
+        'language `unsupported` which does not install an environment. '
         'Perhaps you meant to use a specific language?'
     )
@@ -505,7 +480,7 @@ def test_non_installable_hook_error_for_additional_dependencies(store, caplog):
         'hooks': [{
            'id': 'system-hook',
            'name': 'system-hook',
-           'language': 'system',
+           'language': 'unsupported',
            'entry': 'python3 -c "import sys; print(sys.version)"',
            'additional_dependencies': ['astpretty'],
        }],
@@ -517,17 +492,28 @@ def test_non_installable_hook_error_for_additional_dependencies(store, caplog):
     msg, = caplog.messages
     assert msg == (
         'The hook `system-hook` specifies `additional_dependencies` but is '
-        'using language `system` which does not install an environment. '
+        'using language `unsupported` which does not install an environment. '
         'Perhaps you meant to use a specific language?'
     )
 
 
 def test_args_with_spaces_and_quotes(tmp_path):
     ret = run_language(
-        tmp_path, system,
+        tmp_path, unsupported,
         f"{shlex.quote(sys.executable)} -c 'import sys; print(sys.argv[1:])'",
         ('i have spaces', 'and"\'quotes', '$and !this'),
     )
     expected = b"['i have spaces', 'and\"\\'quotes', '$and !this']\n"
     assert ret == (0, expected)
+
+
+def test_hazmat(tmp_path):
+    ret = run_language(
+        tmp_path, unsupported,
+        f'pre-commit hazmat ignore-exit-code {shlex.quote(sys.executable)} '
+        f"-c 'import sys; raise SystemExit(sys.argv[1:])'",
+        ('f1', 'f2'),
+    )
+    expected = b"['f1', 'f2']\n"
+    assert ret == (0, expected)
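One detail worth calling out in test_args_with_spaces_and_quotes above: the interpreter path is passed through shlex.quote so that a later shell-style split keeps it as a single argument. A standalone illustration (the path containing a space is hypothetical):

import shlex

executable = '/opt/python dir/bin/python3'  # hypothetical path with a space
entry = f"{shlex.quote(executable)} -c 'import sys; print(sys.argv[1:])'"
print(shlex.split(entry))
# ['/opt/python dir/bin/python3', '-c', 'import sys; print(sys.argv[1:])']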

View file

@@ -1,12 +1,15 @@
 from __future__ import annotations
 
+import logging
 import os.path
+import shlex
 import sqlite3
 import stat
 from unittest import mock
 
 import pytest
 
+import pre_commit.constants as C
 from pre_commit import git
 from pre_commit.store import _get_default_directory
 from pre_commit.store import _LOCAL_RESOURCES
@@ -19,6 +22,17 @@ from testing.util import git_commit
 from testing.util import xfailif_windows
 
 
+def _select_all_configs(store: Store) -> list[str]:
+    with store.connect() as db:
+        rows = db.execute('SELECT * FROM configs').fetchall()
+    return [path for path, in rows]
+
+
+def _select_all_repos(store: Store) -> list[tuple[str, str, str]]:
+    with store.connect() as db:
+        return db.execute('SELECT repo, ref, path FROM repos').fetchall()
+
+
 def test_our_session_fixture_works():
     """There's a session fixture which makes `Store` invariantly raise to
     prevent writing to the home directory.
@@ -65,7 +79,7 @@ def test_store_init(store):
         assert text_line in readme_contents
 
 
-def test_clone(store, tempdir_factory, log_info_mock):
+def test_clone(store, tempdir_factory, caplog):
     path = git_dir(tempdir_factory)
     with cwd(path):
         git_commit()
@@ -74,7 +88,7 @@ def test_clone(store, tempdir_factory, log_info_mock):
     ret = store.clone(path, rev)
     # Should have printed some stuff
-    assert log_info_mock.call_args_list[0][0][0].startswith(
+    assert caplog.record_tuples[0][-1].startswith(
         'Initializing environment for ',
     )
@@ -88,7 +102,73 @@
     assert git.head_rev(ret) == rev
     # Assert there's an entry in the sqlite db for this
-    assert store.select_all_repos() == [(path, rev, ret)]
+    assert _select_all_repos(store) == [(path, rev, ret)]
+
+
+def test_warning_for_deprecated_stages_on_init(store, tempdir_factory, caplog):
+    manifest = '''\
+- id: hook1
+  name: hook1
+  language: system
+  entry: echo hook1
+  stages: [commit, push]
+- id: hook2
+  name: hook2
+  language: system
+  entry: echo hook2
+  stages: [push, merge-commit]
+'''
+    path = git_dir(tempdir_factory)
+    with open(os.path.join(path, C.MANIFEST_FILE), 'w') as f:
+        f.write(manifest)
+    cmd_output('git', 'add', '.', cwd=path)
+    git_commit(cwd=path)
+    rev = git.head_rev(path)
+
+    store.clone(path, rev)
+
+    assert caplog.record_tuples[1] == (
+        'pre_commit',
+        logging.WARNING,
+        f'repo `{path}` uses deprecated stage names '
+        f'(commit, push, merge-commit) which will be removed in a future '
+        f'version. '
+        f'Hint: often `pre-commit autoupdate --repo {shlex.quote(path)}` '
+        f'will fix this. '
+        f'if it does not -- consider reporting an issue to that repo.',
+    )
+
+    # should not re-warn
+    caplog.clear()
+    store.clone(path, rev)
+    assert caplog.record_tuples == []
+
+
+def test_no_warning_for_non_deprecated_stages_on_init(
+        store, tempdir_factory, caplog,
+):
+    manifest = '''\
+- id: hook1
+  name: hook1
+  language: system
+  entry: echo hook1
+  stages: [pre-commit, pre-push]
+- id: hook2
+  name: hook2
+  language: system
+  entry: echo hook2
+  stages: [pre-push, pre-merge-commit]
+'''
+    path = git_dir(tempdir_factory)
+    with open(os.path.join(path, C.MANIFEST_FILE), 'w') as f:
+        f.write(manifest)
+    cmd_output('git', 'add', '.', cwd=path)
+    git_commit(cwd=path)
+    rev = git.head_rev(path)
+
+    store.clone(path, rev)
+
+    assert logging.WARNING not in {tup[1] for tup in caplog.record_tuples}
 
 
 def test_clone_cleans_up_on_checkout_failure(store):
@@ -118,7 +198,7 @@ def test_clone_when_repo_already_exists(store):
 def test_clone_shallow_failure_fallback_to_complete(
         store, tempdir_factory,
-        log_info_mock,
+        caplog,
 ):
     path = git_dir(tempdir_factory)
     with cwd(path):
@@ -134,7 +214,7 @@ def test_clone_shallow_failure_fallback_to_complete(
     ret = store.clone(path, rev)
     # Should have printed some stuff
-    assert log_info_mock.call_args_list[0][0][0].startswith(
+    assert caplog.record_tuples[0][-1].startswith(
         'Initializing environment for ',
     )
@@ -148,7 +228,7 @@ def test_clone_shallow_failure_fallback_to_complete(
     assert git.head_rev(ret) == rev
     # Assert there's an entry in the sqlite db for this
-    assert store.select_all_repos() == [(path, rev, ret)]
+    assert _select_all_repos(store) == [(path, rev, ret)]
 
 
 def test_clone_tag_not_on_mainline(store, tempdir_factory):
@@ -196,7 +276,7 @@ def test_mark_config_as_used(store, tmpdir):
     with tmpdir.as_cwd():
         f = tmpdir.join('f').ensure()
         store.mark_config_used('f')
-        assert store.select_all_configs() == [f.strpath]
+        assert _select_all_configs(store) == [f.strpath]
 
 
 def test_mark_config_as_used_idempotent(store, tmpdir):
@@ -206,21 +286,12 @@ def test_mark_config_as_used_idempotent(store, tmpdir):
 def test_mark_config_as_used_does_not_exist(store):
     store.mark_config_used('f')
-    assert store.select_all_configs() == []
-
-
-def _simulate_pre_1_14_0(store):
-    with store.connect() as db:
-        db.executescript('DROP TABLE configs')
-
-
-def test_select_all_configs_roll_forward(store):
-    _simulate_pre_1_14_0(store)
-    assert store.select_all_configs() == []
+    assert _select_all_configs(store) == []
 
 
 def test_mark_config_as_used_roll_forward(store, tmpdir):
-    _simulate_pre_1_14_0(store)
+    with store.connect() as db:  # simulate pre-1.14.0
+        db.executescript('DROP TABLE configs')
     test_mark_config_as_used(store, tmpdir)
@@ -245,7 +316,7 @@ def test_mark_config_as_used_readonly(tmpdir):
     assert store.readonly
     # should be skipped due to readonly
     store.mark_config_used(str(cfg))
-    assert store.select_all_configs() == []
+    assert _select_all_configs(store) == []
 
 
 def test_clone_with_recursive_submodules(store, tmp_path):
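Several tests in this file (and in the repository tests above) switch from a log_info_mock fixture to pytest's built-in caplog fixture. For reference, a standalone sketch of the caplog API these assertions rely on (logger name and message are illustrative):

import logging

logger = logging.getLogger('demo')


def test_caplog_basics(caplog):
    with caplog.at_level(logging.INFO, logger='demo'):
        logger.warning('Initializing environment for example.')
    # record_tuples is a list of (logger_name, level, message) tuples
    assert caplog.record_tuples == [
        ('demo', logging.WARNING, 'Initializing environment for example.'),
    ]
    assert caplog.record_tuples[0][-1].startswith('Initializing environment')
    caplog.clear()  # discard captured records
    assert caplog.record_tuples == []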

View file

@@ -0,0 +1,47 @@
+from __future__ import annotations
+
+import pytest
+
+from pre_commit.yaml import yaml_compose
+from pre_commit.yaml_rewrite import MappingKey
+from pre_commit.yaml_rewrite import MappingValue
+from pre_commit.yaml_rewrite import match
+from pre_commit.yaml_rewrite import SequenceItem
+
+
+def test_match_produces_scalar_values_only():
+    src = '''\
+- name: foo
+- name: [not, foo]  # not a scalar: should be skipped!
+- name: bar
+'''
+    matcher = (SequenceItem(), MappingValue('name'))
+    ret = [n.value for n in match(yaml_compose(src), matcher)]
+    assert ret == ['foo', 'bar']
+
+
+@pytest.mark.parametrize('cls', (MappingKey, MappingValue))
+def test_mapping_not_a_map(cls):
+    m = cls('s')
+    assert list(m.match(yaml_compose('[foo]'))) == []
+
+
+def test_sequence_item_not_a_sequence():
+    assert list(SequenceItem().match(yaml_compose('s: val'))) == []
+
+
+def test_mapping_key():
+    m = MappingKey('s')
+    ret = [n.value for n in m.match(yaml_compose('s: val\nt: val2'))]
+    assert ret == ['s']
+
+
+def test_mapping_value():
+    m = MappingValue('s')
+    ret = [n.value for n in m.match(yaml_compose('s: val\nt: val2'))]
+    assert ret == ['val']
+
+
+def test_sequence_item():
+    ret = [n.value for n in SequenceItem().match(yaml_compose('[a, b, c]'))]
+    assert ret == ['a', 'b', 'c']
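The new yaml_rewrite tests above build matchers over composed YAML nodes. Assuming yaml_compose is a thin wrapper around PyYAML's yaml.compose (an assumption, the wrapper is not shown in this diff), the idea can be sketched with plain PyYAML nodes; scalar_values_for_key below is an illustrative helper, not the real implementation:

import yaml


def scalar_values_for_key(src: str, key: str) -> list[str]:
    root = yaml.compose(src)  # a SequenceNode of MappingNodes for this input
    out = []
    for item in root.value:  # roughly what SequenceItem() iterates over
        if not isinstance(item, yaml.MappingNode):
            continue
        for k, v in item.value:  # roughly what MappingValue(key) filters
            if k.value == key and isinstance(v, yaml.ScalarNode):
                out.append(v.value)
    return out


src = '''\
- name: foo
- name: [not, foo]
- name: bar
'''
print(scalar_values_for_key(src, 'name'))  # ['foo', 'bar']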