cool-easter-32542
06/07/2025, 3:42 PM
… `{pants.hash}`, but it's not clear how it can help. My goal is to avoid building a given image if it didn't change, so ideally I'd like to:
• retrieve a hash for an image that I have in the registry
• retrieve a hash that corresponds to the revision of the code base where my Dockerfile and file dependencies are stored
• if the hashes match, do nothing. If they don't match, rebuild and push
I thought that the hash could be calculated statically, i.e. just by giving Pants the target, so that Pants can look at the files and base images contributing to a given image and calculate the hash. However, so far it looks like the hash is accessible to the user only after a successful build of the image, so all I can do is tag the image with the hash. A feature to retrieve the hash prior to building the image is missing.
Describe the solution you'd like
Ideally Pants would be able to calculate the hash based solely on the contents of the repo, and without running Docker. There may be some edge cases, like using base images' digests in the hash - to be discussed how we should treat these.
Describe alternatives you've considered
Not clear if there's any viable alternative other than Pants-native build avoidance.
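In the meantime, a rough approximation is possible outside of Pants. The sketch below is only a workaround idea, not a Pants feature: it hashes the files reported by `pants filedeps --transitive` (which notably does not capture base-image digests, one of the edge cases mentioned above) and rebuilds/pushes only when that hash changes. The state file standing in for a registry lookup, and the target/goal wiring, are assumptions.
# build_if_changed.py - workaround sketch, not a Pants feature.
import hashlib
import pathlib
import subprocess

def input_hash(target: str) -> str:
    # Treat the transitive file dependencies as a proxy for the image's inputs.
    files = subprocess.run(
        ["pants", "filedeps", "--transitive", target],
        check=True, capture_output=True, text=True,
    ).stdout.split()
    digest = hashlib.sha256()
    for path in sorted(files):
        digest.update(path.encode())
        digest.update(pathlib.Path(path).read_bytes())
    return digest.hexdigest()

def build_and_push_if_changed(target: str, state_file: str = ".last_image_hash") -> None:
    current = input_hash(target)
    state = pathlib.Path(state_file)
    if state.exists() and state.read_text().strip() == current:
        return  # inputs unchanged: skip the build and push entirely
    subprocess.run(["pants", "package", target], check=True)
    subprocess.run(["pants", "publish", target], check=True)
    state.write_text(current)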
Additional context
Slack threads:
• https://pantsbuild.slack.com/archives/C046T6T9U/p1745401350333109?thread_ts=1745401350.333109&cid=C046T6T9U
• https://pantsbuild.slack.com/archives/C046T6T9U/p1744210991104779?thread_ts=1744210991.104779&cid=C046T6T9U
• https://pantsbuild.slack.com/archives/C046T6T9U/p1748940781635959?thread_ts=1748940781.635959&cid=C046T6T9U
pantsbuild/pants
cool-easter-32542
06/09/2025, 8:47 PM
`pants paths` on our large monorepo consumes an excessive amount of memory and runs for 10+ minutes without completing. This happens even after filtering to `python_source` targets. At its peak, the process consumed ~650 GB of RAM on an r7i.metal-48xl instance (1536 GiB total).
Repo stats:
• ~3,400 source targets
• ~16,000 destination targets
• After adding `--filter-target-type=python_source`:
• ~2,000 source targets
• ~6,600 destination targets
Despite that, performance remains extremely slow. There appears to be either a performance bottleneck in the graphing logic or a memory inefficiency.
According to @benjy, the current Python-based graph algorithm is likely the culprit, and could benefit from a rewrite in Rust with more efficient data structures.
Pants version
2.26.0
OS
Amazon Linux 2023
Additional info
Happy to provide more logs or run experiments. This was blocking me from visualizing connections between resolves, which would've helped validate our target structure. Any flags or workarounds to scope this down further would also be appreciated.
pantsbuild/pants
cool-easter-32542
06/09/2025, 8:50 PM
native_engine.IntrinsicError: Error opening file /private/var/folders/.../pants-sandbox-.../python/share/terminfo/n/ncr260vt300wpp for writing: Os { code: 62, kind: FilesystemLoop, message: "Too many levels of symbolic links" }
We traced this to a recursive symlink (ncr260vt300wpp) in the Pants sandbox, which appears to come from the embedded Python used by Pants (Python Build Standalone / PBS).
This issue only affects one developer on macOS out of ~50 users, but is consistently reproducible on that machine. The root cause appears to be a bug in the PBS release that Pants is currently pinned to:
astral-sh/python-build-standalone#231
Upgrading the PBS version manually via [python-bootstrap.internal_python_build_standalone_info] resolves the issue.
Pants version
2.26.0
OS
macOS (Apple Silicon)
Additional info
• Affected rule: Install Python for Pants usage
• Resolved by overriding PBS version with the 20250517 release
• Issue was very difficult to debug without realizing Pants wasn’t using system Python but PBS
• Recommend upgrading default PBS version in Pants repo
Let me know if logs, sandbox contents, or the reproduction environment would be helpful.
pantsbuild/pants
cool-easter-32542
06/09/2025, 10:08 PM
Would there be interest in a `bindeps` backend that can analyze binaries to list which shared libs they are linked with?
## The `bindeps` backend
The `bindeps` backend would be functionally similar to dpkg-shlibdeps in debian, and `rpmdeps`+`elfdeps` in EL distros. But it would not require the native packaging tooling to run it. I imagine it would be most useful when using the `nfpm` backend to create native system packages (.deb, .rpm, ...) because nfpm does not analyze included binaries like the native deb/rpm tooling does. But it could also be useful to audit/lint generated binaries before distributing them.
### What I have so far:
So far, I have code that can:
• take a `pex_binary`,
• extract its wheels (using pex-tools of course), and
• use `elfdeps` :package: (the python lib, not the native rpmbuild binary) to get the list of SONAMEs used by any binaries in the package.
I also have code that can take the list of SONAMEs and map them to deb package names using the debian package search API (for debian or ubuntu) instead of querying via native packaging tool(s) like `dpkg -S`. This is important because deb `depends` expects package names, not SONAMEs. This package lookup is not needed for rpm, where `requires` takes a list of SONAMEs in addition to package names.
For the rules I've written so far, the only python-specific bits handle extracting wheels from a `pex_binary` and using zipfile to make the wheel contents available for `elfdeps` to inspect. The actual ELF dependency inspection logic and the deb package search are not python-specific and could easily be used to inspect binaries from other pants package targets.
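For illustration, here is a rough sketch of the SONAME-collection step. It deliberately substitutes pyelftools for `elfdeps` (so as not to guess at `elfdeps`' exact API), and the archive layout and `*.so*` glob are assumptions:
# soname_scan.py - illustrative sketch only; uses pyelftools instead of elfdeps.
import zipfile
from pathlib import Path
from tempfile import TemporaryDirectory

from elftools.elf.dynamic import DynamicSection
from elftools.elf.elffile import ELFFile

def needed_sonames(elf_path: Path) -> set[str]:
    # Collect DT_NEEDED entries (the SONAMEs the binary is linked against).
    with elf_path.open("rb") as f:
        elf = ELFFile(f)
        needed = set()
        for section in elf.iter_sections():
            if isinstance(section, DynamicSection):
                for tag in section.iter_tags():
                    if tag.entry.d_tag == "DT_NEEDED":
                        needed.add(tag.needed)
        return needed

def scan_archive(archive: Path) -> set[str]:
    # Unzip a pex/wheel-like archive and scan any shared objects it contains.
    sonames: set[str] = set()
    with TemporaryDirectory() as tmp, zipfile.ZipFile(archive) as zf:
        zf.extractall(tmp)
        for so in Path(tmp).rglob("*.so*"):
            try:
                sonames |= needed_sonames(so)
            except Exception:
                pass  # not every *.so* match is a readable ELF file
    return sonames

# e.g.: print(sorted(scan_archive(Path("dist/my_app.pex"))))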
### Potential scope of the `bindeps` backend:
I'm calling this a `bindeps` backend because it does not need to be specific to ELF binaries/libs. In figuring out how best to analyze linux wheels, I also found these projects which would fit nicely under a `bindeps` backend (though I haven't written the code to make them part of a pants backend) - perhaps as `check` or `fix` tools that take packaged binaries as input:
• These 3 can analyze a wheel's included python extensions (binaries / shared libs) and "repair" the wheel by copying the shared lib(s) into a copy of the wheel so that no system deps are required.
• `delocate` :package:: this is MacOS-specific, so it works with dynamic libraries. It can also combine an arm64 wheel with an x86_64 wheel to make a universal2 wheel that supports both architectures.
• `auditwheel` :package:: this is linux-specific, so it works with ELF binaries / .so files. It can also audit a wheel's manylinux tag, if any, to ensure that the wheel follows the system dep requirements to use that manylinux tag.
• `delvewheel` :package:: this is Windows-specific, so it works with DLLs.
• `repairwheel` :package:: This lib combines the 3 libs above (so it is cross-platform), focusing on using pure python libs instead of system binaries to modify any libs in the wheel. It also promises to be more host-agnostic so that it can hermetically create the same changes on any host platform.
• `abi3audit` :package:: Another wheel audit tool. This one audits a wheel's abi3 tag to see if python extensions in the wheel only use python symbols stabilized in the target python version or earlier.
## Request for feedback
1. Does this sound generally useful? If so, which part(s) sound useful?
2. Does the `bindeps` name make sense for a backend of this scope?
3. Is there any interest in the creation of `pants.backend.experimental.bindeps`?
pantsbuild/pants
cool-easter-32542
06/10/2025, 10:18 PM
cool-easter-32542
06/11/2025, 3:08 PM
… run by `pants test`, but also called before any published asset gets pushed (like a docker image).
Describe alternatives you've considered
I will try test_shell_command with some sort of adhoc_tool. code_quality_tool might also work but I doubt it runs during the test phase.
I think it's worth tracking regardless of any alternative figured out.
Additional context
pantsbuild/pants
cool-easter-32542
06/11/2025, 9:02 PM
KeyError: 'IN' is raised by lark. Running the same code with just Terraform outside of Pants works fine.
## Terraform sample
data "snowflake_schemas" "in" {
in {
database = "<database name>"
}
}
output "in_output" {
value = data.snowflake_schemas.in.schemas
}
Pants version
2.26.0
OS
MacOS 14.7.5 (23H527)
Additional info
Full traceback
154927.06 [ERROR] 1 Exception encountered:
Engine traceback:
in experimental-deploy goal
ProcessExecutionFailure: Process 'Parse Terraform module sources: snowflake/jayoawd/nomihealth/schemas' failed with exit code 1.
stdout:
stderr:
Traceback (most recent call last):
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser_state.py", line 77, in feed_token
action, arg = states[state][token.type]
~~~~~~~~~~~~~^^^^^^^^^^^^
KeyError: 'IN'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/private/var/folders/0x/k5krsh9j367dbd7dq7zr2xd80000gp/T/pants-sandbox-EgIzw5/.cache/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/pex", line 358, in <module>
boot(
File "/private/var/folders/0x/k5krsh9j367dbd7dq7zr2xd80000gp/T/pants-sandbox-EgIzw5/.cache/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/pex", line 341, in boot
runpy.run_module(module_name, run_name="__main__", alter_sys=True)
File "<frozen runpy>", line 226, in run_module
File "<frozen runpy>", line 98, in _run_module_code
File "<frozen runpy>", line 88, in _run_code
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/__pants_tf_parser.py", line 71, in <module>
main(sys.argv[1:])
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/__pants_tf_parser.py", line 64, in main
paths |= extract_module_source_paths(PurePath(filename).parent, content)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/__pants_tf_parser.py", line 36, in extract_module_source_paths
parsed_content = hcl2.loads(content)
^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/hcl2/api.py", line 27, in loads
tree = hcl2.parse(text + "\n")
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/lark.py", line 655, in parse
return self.parser.parse(text, start=start, on_error=on_error)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parser_frontends.py", line 104, in parse
return self.parser.parse(stream, chosen_start, **kw)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 42, in parse
return self.parser.parse(lexer, start)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 88, in parse
return self.parse_from_state(parser_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 111, in parse_from_state
raise e
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser.py", line 102, in parse_from_state
state.feed_token(token)
File "/Users/sebastian.galindo/.cache/pants/named_caches/pex_root/venvs/1/f42d6ed4f6b8194d0f3be235033f57131483b8eb/6a484136bc2bfef9fe28f2d034180604f8a7f369/lib/python3.11/site-packages/lark/parsers/lalr_parser_state.py", line 80, in feed_token
raise UnexpectedToken(token, expected, state=self, interactive_parser=None)
lark.exceptions.UnexpectedToken: Unexpected token Token('IN', 'in') at line 2, column 3.
Expected one of:
* __ANON_3
* RBRACE
Use --keep-sandboxes=on_failure to preserve the process chroot for inspection.
pantsbuild/pants
cool-easter-32542
06/12/2025, 1:54 PM
… (`--always-true` and `--always-false`) to test that my source code typechecks correctly across feature flag configurations. Currently I do this by repeatedly running `pants check` with different values for `--mypy-args`.
Describe the solution you'd like
I would like to be able to do this using `parametrize` and a single invocation of `pants check`, and maintain mypy "incremental mode" performance levels. I'm unsure where exactly to use `parametrize`: whether it would be on all python source files (using `__defaults__`), or maybe only on an entrypoint, or on the config file that loads the feature flags (where it would affect everything by dependency inference).
Describe alternatives you've considered
As mentioned above, repeated invocations of `pants check`. The downside of this is that it breaks mypy's internal caching and causes the check run to take much longer.
Before using pants, I solved this with repeated calls to mypy, using distinct caches (see https://mypy.readthedocs.io/en/stable/command_line.html#incremental-mode).
This allowed each configuration to use a separate cache and maintain performance, but I'm not sure how this would map to `pants check`, since my understanding is that pants manipulates/controls the mypy cache arguments to coordinate them with pants' own cache.
I would be interested in working on this myself, but would need some guidance since this is deep in the weeds of the mypy subsystem. Either way, it's best to start with a high-level discussion.
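For concreteness, the pre-Pants workaround described above looks roughly like this; the flag name and source path are illustrative assumptions:
# check_flags.py - sketch of the pre-Pants approach: one mypy run per feature-flag
# configuration, each with its own cache dir so incremental mode stays effective.
import subprocess

CONFIGS = {
    "flags_on": ["--always-true=FEATURE_X"],
    "flags_off": ["--always-false=FEATURE_X"],
}

for name, flags in CONFIGS.items():
    subprocess.run(
        ["mypy", "--cache-dir", f".mypy_cache/{name}", *flags, "src/"],
        check=True,
    )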
pantsbuild/pants
cool-easter-32542
06/12/2025, 6:08 PM
… `CPython>=3.11`, and I have both 3.11 and 3.12 available (via ASDF in my case); Pants seems to choose 3.11, even though I would rather it choose 3.12 (constraints still being valid). Is there a way for it to output the concrete interpreters that are both available and valid for a target? And are there any heuristics I can use to guess which one it is using by default (or possibly configure a "favored" interpreter)?
Secondarily, is there a way to force using a particular version when using various goals? e.g.
pants test --python=3.12 ::
pantsbuild/pants
cool-easter-32542
06/16/2025, 4:23 PM
cool-easter-32542
06/16/2025, 7:11 PM
After updating the `pants.toml` file to use version 2.25 or 2.26 (from 2.24), one user gets the following message every time they try to run a `pants` command:
Failed to find compatible interpreter on path /home/REDACTED_NAME/.cache/nce/REDACTED_HASH1/cpython-3.11.9+20240415-x86_64-unknown-linux-gnu-install_only.tar.gz/python/bin/python3.11.
Examined the following interpreters:
1.) /home/REDACTED_NAME/.cache/nce/REDACTED_HASH1/cpython-3.11.9+20240415-x86_64-unknown-linux-gnu-install_only.tar.gz/python/bin/python3.11 CPython==3.11.9
No interpreter compatible with the requested constraints was found:
Version matches CPython<3.10,>=3.8
Error: Failed to establish atomic directory /home/REDACTED_NAME/.cache/nce/REDACTED_HASH2/locks/install-REDACTED_HASH3. Population of work directory failed: Boot binding command failed: exit status: 1
Isolates your Pants from the elements.
Please select from the following boot commands:
<default> (when SCIE_BOOT is not set in the environment) Detects the current Pants installation and launches it.
bootstrap-tools Introspection tools for the Pants bootstrap process.
update Update scie-pants.
You can select a boot command by setting the SCIE_BOOT environment variable.
Pants version
2.25 or 2.26
OS
Linux
Additional info
The user environment has Python 3.9.22 (the default if you run `python3.9`), 3.11.3 (the default if you run `python3.11`), and 3.11.12, installed via `asdf`.
Some probably-irrelevant `pants.toml` settings (included just in case):
[python]
interpreter_constraints = ["CPython==3.11.3"]
...
[python-bootstrap]
search_path = [
# Typical developer workflows:
"<ASDF>",
# GitHub workflows and miscellaneous:
"<PATH>",
]
...
[pex-cli]
version = "v2.38.1"
known_versions = [
...,
"v2.38.1|macos_arm64|4839cb13232c9918279198a4651193cd58a03920ab0c5a5772fa96e00de21904|4813104",
"v2.38.1|macos_x86_64|4839cb13232c9918279198a4651193cd58a03920ab0c5a5772fa96e00de21904|4813104",
"v2.38.1|linux_arm64|4839cb13232c9918279198a4651193cd58a03920ab0c5a5772fa96e00de21904|4813104",
"v2.38.1|linux_x86_64|4839cb13232c9918279198a4651193cd58a03920ab0c5a5772fa96e00de21904|4813104",
]
Some information about the machine:
$ lscpu
Architecture: x86_64
CPU op-mode(s): 32-bit, 64-bit
Address sizes: 46 bits physical, 48 bits virtual
Byte Order: Little Endian
...
$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 22.04.5 LTS
Release: 22.04
Codename: jammy
We tried removing `.cache` completely and rerunning `get-pants.sh`, but no dice.
pantsbuild/pants
cool-easter-32542
06/18/2025, 4:19 AM
Running the `package` goal fails with:
IntrinsicError: Error setting permissions on /home/vscode/.cache/pants/lmdb_store/immutable/files/39/39589a7ed7a41afde02019677c33017641bb0ad1187d6efd9c45e06bb6b4eef6: Permission denied (os error 13)
Environment is a VSCode devcontainer (image: "mcr.microsoft.com/devcontainers/python:1-3.11-bullseye")
pants version: 2.17.0
pantsbuild/pants
cool-easter-32542
06/23/2025, 6:34 PM
… `yarn install`
, `tsc`, `webpack`, and `jest`. `concurrency`
was introduced in pants 2.27, but it's currently only available to the plugin API, and is not exposed to BUILD targets.
Describe the solution you'd like
It's unclear to me if any heuristic could be used to infer a required concurrency level from within the nodejs backend automatically. Absent that, we'd either need to:
• Expose the `concurrency` field to the targets (package_json, node_build_script, javascript_tests, typescript_tests), or
• Allow a plugin to provide the concurrency level prior to process execution.
Describe alternatives you've considered
Our current workaround for this is to override/copy the `setup_node_tool_process` rule, which isn't ideal.
Additional context
Our current machines in CI are 4 cores; setting "concurrency": ProcessConcurrency.exactly(2) has been effective so far at both not overscheduling the machines and allowing them to run to completion without running out of RAM.
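To make the first option concrete, here is a hypothetical BUILD sketch of the proposed field; `concurrency` is not accepted on these targets/objects today, which is exactly what the request is asking for:
# BUILD - hypothetical sketch only; the `concurrency` field shown here does not exist yet.
javascript_tests(
    name="tests",
    concurrency=2,  # cap each jest process at 2 cores on our 4-core CI runners
)

package_json(
    name="package_json",
    scripts=[
        node_build_script(
            entry_point="build",
            output_directories=["dist"],
            concurrency=2,  # same idea for tsc/webpack runs
        ),
    ],
)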
pantsbuild/pants
cool-easter-32542
06/25/2025, 4:43 AM
.
├── pants.toml
├── src
│ └── python
│ ├── lib1
│ │ ├── __init__.py
│ │ ├── BUILD
│ │ ├── lib1.py
│ │ └── pyproject.toml
│ ├── package1
│ │ ├── __init__.py
│ │ ├── BUILD
│ │ ├── package1.py
│ │ └── pyproject.toml
│ └── package2
│ ├── __init__.py
│ ├── BUILD
│ ├── package2.py
│ └── pyproject.toml
└── test
└── python
└── tests
├── __init__.py
└── package1
├── __init__.py
├── BUILD
├── pyproject.toml
└── test_package1.py
Basically both package1 and package2 depend on lib1. I want to publish package1 and package2 as libs, but keep lib1 as an internal lib which won't be published. The package1 in the test folder is for testing package1. I can run `pants generate-lockfiles` and `pants test test/python/tests/package1:tests` without any problem. But when I do `pants package src/python/package1:dist`, I get the following error:
% pants package src/python/package1:dist
21:35:50.43 [ERROR] 1 Exception encountered:
Engine traceback:
in `package` goal
NoOwnerError: No python_distribution target found to own src/python/lib1/__init__.py. Note that the owner must be in or above the owned target's directory, and must depend on it (directly or indirectly). See https://www.pantsbuild.org/2.25/docs/python/overview/building-distributions for how python_sources targets are mapped to distributions. See https://www.pantsbuild.org/2.25/docs/python/overview/building-distributions.
Could you pls let me know what the problem is and how to fix it? Also, for lib1, I am not quite sure whether package1 or package2 should declare the lib1 dependency in the BUILD file as in my current code, or whether I can put it in the pyproject.toml like below (tested, does not work; not sure if it is a path issue)?
dependencies = [
"tornado==6.4.2",
"lib1 @ file:///${PROJECT_ROOT}/../lib1"
]
Pls help. thank you!
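Not an authoritative answer, but going by the error message (the owning python_distribution must be in or above lib1's directory and depend on it), one direction to experiment with is giving lib1 its own python_distribution, even if you never publish it, so every python_source has an owner. A minimal sketch, with assumed target names, artifact name, and version:
# src/python/lib1/BUILD - hypothetical sketch, not a verified fix.
python_sources(name="lib1")

python_distribution(
    name="lib1-dist",
    dependencies=[":lib1"],
    provides=python_artifact(name="lib1", version="0.0.1"),
    generate_setup=True,
)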
pantsbuild/pants
cool-easter-32542
06/25/2025, 6:05 AM
You can use `resolve` and `lockfile` to generate the exported virtual env for any dependencies you like. But I want to streamline it for each package in a monorepo. What I want to achieve is to be able to switch to any package's dedicated virtual env (meaning no other dependencies from other packages) in the monorepo. The problem with `resolve` is that if you make package1's dependencies use resolve="package1", then package2, which depends on package1, must also use resolve="package1" in order to include all the dependencies in its virtual env. But this means package1's virtual env will also include package2's dependencies, which is not what we want.
So one way to make a fully isolated virtual env, assuming we have lib1 plus package1 and package2 which both depend on lib1, is the following:
• for lib1's dependencies, we set resolve=parametrize("lib1", "package1", "package2")
• for package1, we set resolve="package1",
• for package2, we set resolve="package2".
In this case, we can export these three virtual envs for these three packages. But since the monorepo has lots of packages which depend on each other, this method becomes unscalable: if a lib is needed by N packages, it needs resolve=parametrize("lib1", "package1", "package2", ..., "packageN").
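Expressed as BUILD files, the parametrize approach just described looks roughly like this (paths are assumptions; target and resolve names are the ones from the example):
# src/python/lib1/BUILD - the shared lib is parametrized across every resolve that
# needs it, which is exactly the part that does not scale to N packages.
python_sources(
    name="lib1",
    resolve=parametrize("lib1", "package1", "package2"),
)

# src/python/package1/BUILD
python_sources(
    name="package1",
    resolve="package1",
)

# src/python/package2/BUILD
python_sources(
    name="package2",
    resolve="package2",
)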
I am thinking: is it possible to reuse a `resolve`? For example:
• for lib1, we set resolve="lib1"
• for package1, we set resolve="package1", resolve_reuse=["lib1", ...],
• for package2, we set resolve="package2", resolve_reuse=["lib1", ...].
Do we have something similar to this, or is there some other way to do it? Pls advise. Thank you!
pantsbuild/pants
cool-easter-32542
06/30/2025, 1:16 AM
cool-easter-32542
06/30/2025, 1:17 AM
cool-easter-32542
06/30/2025, 7:57 PM
… Run ./pants auth-acquire to set up authentication.')
134853.66 [WARN] Auth failed - BuildSense plugin is disabled.
134909.21 [INFO] Completed: Building generate_all_lockfiles_helper.pex with 10 requirements: PyYAML<7.0,>=6.0, ansicolors==1.1.8, packaging==21.0, pex==2.1.53, setuptools<58.0,>=56.0.0, toml==0.10.2, types-PyYAML==5.4.3, types-s... (65 characters truncated)
134909.46 [INFO] Completed: Building local_dists.pex
134910.97 [INFO] Starting: Resolving plugins: hdrhistogram, toolchain.pants.plugin==0.15.0
134922.82 [INFO] Completed: Resolving plugins: hdrhistogram, toolchain.pants.plugin==0.15.0
134923.31 [WARN] Error loading access token: AuthError('Failed to load auth token (no default file or environment variable). Run ./pants auth-acquire to set up authentication.')
134923.31 [WARN] Auth failed - BuildSense plugin is disabled.
134954.28 [INFO] Completed: Building poetry.pex with 1 requirement: poetry==1.1.8
135041.21 [INFO] Completed: Generate lockfile for flake8
135330.78 [INFO] Completed: Generate lockfile for pytest
135331.35 [INFO] Wrote lockfile for the resolve pytest to 3rdparty/python/lockfiles/pytest.txt
135331.35 [INFO] Wrote lockfile for the resolve flake8 to 3rdparty/python/lockfiles/flake8.txt
135351.93 [INFO] Completed: Building dockerfile_parser.pex from dockerfile-parser_default_lockfile.txt
135429.16 [ERROR] Exception caught: (pants.engine.internals.scheduler.ExecutionError)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 235, in _run_inner
return self._perform_run(goals)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 174, in _perform_run
return self._perform_run_body(goals, poll=False)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 196, in _perform_run_body
poll_delay=(0.1 if poll else None),
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/init/engine_initializer.py", line 135, in run_goal_rules
goal_product, params, poll=poll, poll_delay=poll_delay
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 548, in run_goal_rule
self._raise_on_error([t for _, t in throws])
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 512, in _raise_on_error
wrapped_exceptions=tuple(t.exc for t in throws),
Exception message: 1 Exception encountered:
Engine traceback:
in select
in pants.backend.experimental.python.user_lockfiles.generate_user_lockfile_goal
in pants.engine.internals.graph.transitive_targets
in pants.engine.internals.graph.transitive_dependency_mapping
in pants.engine.internals.graph.resolve_targets (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.engine.internals.graph.resolve_unexpanded_targets (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.engine.internals.graph.resolve_dependencies (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.backend.java.dependency_inference.rules.infer_java_dependencies_via_imports (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.backend.java.dependency_inference.package_mapper.merge_first_party_module_mappings
in pants.backend.java.dependency_inference.package_mapper.map_first_party_java_targets_to_symbols
in pants.backend.java.dependency_inference.java_parser.resolve_fallible_result_to_analysis
in pants.backend.java.dependency_inference.java_parser.analyze_java_source_dependencies
in pants.backend.java.dependency_inference.java_parser_launcher.build_processors
in pants.jvm.resolve.coursier_fetch.materialize_classpath
in pants.jvm.resolve.coursier_fetch.coursier_fetch_lockfile
in pants.jvm.resolve.coursier_fetch.coursier_fetch_one_coord
in pants.engine.process.fallible_to_exec_result_or_raise
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/process.py", line 278, in fallible_to_exec_result_or_raise
local_cleanup=global_options.options.process_execution_local_cleanup,
pants.engine.process.ProcessExecutionFailure: Process 'Resolving with coursier: com.google.errorprone:error_prone_annotations:2.5.1' failed with exit code 126.
stdout:
stderr:
+ coursier_exe=./cs-x86_64-pc-linux
+ shift
+ json_output_file=coursier_report.json
+ shift
+ ./cs-x86_64-pc-linux fetch --json-output-file=coursier_report.json --intransitive com.google.errorprone:error_prone_annotations:2.5.1
coursier_wrapper_script.sh: line 8: ./cs-x86_64-pc-linux: Text file busy
Use --no-process-execution-local-cleanup to preserve process chroots for inspection.
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 235, in _run_inner
return self._perform_run(goals)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 174, in _perform_run
return self._perform_run_body(goals, poll=False)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 196, in _perform_run_body
poll_delay=(0.1 if poll else None),
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/init/engine_initializer.py", line 135, in run_goal_rules
goal_product, params, poll=poll, poll_delay=poll_delay
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 548, in run_goal_rule
self._raise_on_error([t for _, t in throws])
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 512, in _raise_on_error
wrapped_exceptions=tuple(t.exc for t in throws),
pants.engine.internals.scheduler.ExecutionError: 1 Exception encountered:
Engine traceback:
in select
in pants.backend.experimental.python.user_lockfiles.generate_user_lockfile_goal
in pants.engine.internals.graph.transitive_targets
in pants.engine.internals.graph.transitive_dependency_mapping
in pants.engine.internals.graph.reso…
pantsbuild/pants
cool-easter-32542
06/30/2025, 7:58 PM
ERROR:root:01:03:14.88 [INFO] Initialization options changed: reinitializing scheduler...
01:03:16.11 [INFO] Scheduler initialized.
01:03:16.79 [ERROR] 1 Exception encountered:
Engine traceback:
in `run` goal
in Prepare environment for running PEXes
in Finding a `python` binary
in Scheduling: Searching for `python3` on PATH=/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin
Exception: Failed to execute: Process {
argv: [
"./find_binary.sh",
"python3",
],
env: {
"PATH": "/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin",
},
working_directory: None,
input_digests: InputDigests {
complete: DirectoryDigest {
digest: Digest {
hash: Fingerprint<1824155fc3b856540105ddc768220126b3d9e72531f69c45e3976178373328f3>,
size_bytes: 91,
},
tree: "Some(..)",
},
nailgun: DirectoryDigest {
digest: Digest {
hash: Fingerprint<e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855>,
size_bytes: 0,
},
tree: "Some(..)",
},
input_files: DirectoryDigest {
digest: Digest {
hash: Fingerprint<1824155fc3b856540105ddc768220126b3d9e72531f69c45e3976178373328f3>,
size_bytes: 91,
},
tree: "Some(..)",
},
immutable_inputs: {},
use_nailgun: {},
},
output_files: {},
output_directories: {},
timeout: None,
execution_slot_variable: None,
concurrency_available: 0,
description: "Searching for `python3` on PATH=/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin",
level: Debug,
append_only_caches: {},
jdk_home: None,
platform: Linux_x86_64,
cache_scope: PerRestartSuccessful,
execution_strategy: Local,
remote_cache_speculation_delay: 0ns,
}
Error launching process after 1 retry for ETXTBSY. Final error was: Os { code: 22, kind: InvalidInput, message: "Invalid argument" }
Pants version
2.15.1
OS
Linux
Additional info
The log output from the failed build is provided above.
pantsbuild/pants
cool-easter-32542
06/30/2025, 8:01 PM
… the process_execution::nailgun module is not used / is dead code in practice.
pantsbuild/pants
cool-easter-32542
07/01/2025, 12:21 AM
… `pg_dump`, to test Postgres' view of the final database schema.
This has overlap with the `runtime_package_dependencies` field, especially if wanting to execute an artifact built from the repo. However, that's not a complete solution:
• it doesn't work with `system_binary`
• it will only provide the package output as a file; it won't necessarily be executable (e.g. if the package needs to run on a specific Python version, which may not be available within the test sandbox)
Describe the solution you'd like
Similar to `test_shell_command` (and `adhoc_tool`), add a `runnable_dependencies` field to `python_test`.
For instance:
# BUILD
system_binary(name="pg_dump, binary_name="pg_dump", ...)
python_test(name="test", source="test_foo.py", runnable_dependencies=[":pg_dump"])
# test_foo.py
import subprocess

def test_the_thing():
    subprocess.run(["pg_dump", ...])
Currently, we need to either pass the whole PATH into the test (reducing hermeticity), or do a workaround like export PG_DUMP_PATH=$(which pg_dump) in .pants.bootstrap + python_test(..., extra_env_vars=["PG_DUMP_PATH"]) + subprocess.run([os.environ["PG_DUMP_PATH"], ...]).
Describe alternatives you've considered
N/A
Additional context
Potentially, clarifying the roles and behaviours of the various dependencies fields on tests would be helpful (e.g. why can't a test target's `dependencies` just allow packaged targets directly, and thus behave more like `execution_dependencies`?).
pantsbuild/pants
cool-easter-32542
07/01/2025, 1:31 AM
… When locking down `PATH`, the semgrep backend starts failing when invoking semgrep, which seems to require some system binaries: `uname`, and (on macOS) `security`.
For instance, this configuration locks down all the paths:
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = [] # Workaround: set to ["/usr/bin"]
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
Output like:
16:31:24.99 [ERROR] Completed: Lint with Semgrep - semgrep failed (exit code 2).
Partition: .semgrep.yml
Fatal error: exception Failure("run ['uname' '-s']: No such file or directory")
Raised at Stdlib.failwith in file "stdlib.ml", line 29, characters 17-33
Called from CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 31, characters 17-27
Re-raised at CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 36, characters 4-11
Called from Conduit_lwt_unix.default_ctx in file "src/conduit-lwt-unix/conduit_lwt_unix.ml", line 158, characters 26-79
Called from CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 31, characters 17-27
Re-raised at CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 36, characters 4-11
Called from Cohttp_lwt_unix__Net.default_ctx in file "cohttp-lwt-unix/src/net.ml", line 33, characters 10-49
✕ semgrep failed.
A workaround is to ensure that those two binaries specifically are available on the PATH.
Full reproducer, including demonstration of the workaround:
cd $(mktemp -d)
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = []
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
EOF
cat > BUILD <<EOF
file(name="foo", source="foo.txt")
EOF
echo x > foo.txt
cat > .semgrep.yml <<EOF
rules:
- id: x
patterns:
- pattern: x
message: found an x
languages: [generic]
severity: ERROR
EOF
# BUG: Fatal error: exception Failure("run ['uname' '-s']: No such file or directory")
pants lint ::
# WORKAROUND
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = ["/usr/bin"] # CHANGED
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
EOF
pants lint ::
Pants version
2.27.0
OS
macOS
Additional info
N/A
pantsbuild/pants
cool-easter-32542
07/01/2025, 4:47 AM
… Running `pants fmt source/go/lib:` with pants on a rather large mono repo, it will take forever to download and analyze all go modules. What am I doing wrong here? Why does it not cache the modules and their analyses?
pants.toml
[GLOBAL]
pants_version = "2.28.0.dev4"
backend_packages = [
"pants.backend.experimental.go"
]
local_store_dir = "~/.cache/pants"
pants_ignore.add = [
"*/pants-tmpdir/**",
]
[anonymous-telemetry]
enabled = false
[source]
root_patterns = ["source/go"]
[golang]
minimum_expected_version = "1.24"
Pants version
2.28.0.dev4
OS
Debian 11
pantsbuild/pants
cool-easter-32542
07/01/2025, 10:22 AM
ARG DOCKER_IO_MIRROR=docker.io
FROM $DOCKER_IO_MIRROR/dperson/samba@sha256:e1d2a7366690749a7be06f72bdbf6a5a7d15726fc84e4e4f41e967214516edfd
The same build works fine using the python parser.
Pants version
2.27.0
OS
Linux
Additional info
I have added some tests to expose the issue here: bdabelow@8311df2
Minimal repro is here: https://github.com/bdabelow/pants-repro/tree/main/rust-dockerfile-parser
Log:
> pants package ::
12:16:12.97 [INFO] Initializing scheduler...
12:16:14.87 [INFO] Scheduler initialized.
12:16:14.97 [INFO] Completed: Scheduling: Test binary /usr/bin/docker.
12:16:14.99 [INFO] Completed: Scheduling: Test binary /bin/docker.
12:16:15.01 [ERROR] 1 Exception encountered:
Engine traceback:
in root
..
in pants.core.goals.package.package_asset
`package` goal
Traceback (most recent call last):
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/core/goals/package.py", line 195, in package_asset
packages = await MultiGet(
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 356, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 163, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/core/goals/package.py", line 146, in environment_aware_package
package = await Get(
^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 113, in __await__
result = yield self
^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/goals/package_image.py", line 394, in build_docker_image
context, wrapped_target = await MultiGet(
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 386, in MultiGet
return await _MultiGet((__arg0, __arg1))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 163, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 384, in create_docker_build_context
return DockerBuildContext.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 137, in create
stage_names, tags_values = cls._get_stages_and_tags(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 186, in _get_stages_and_tags
raise DockerBuildContextError(
pants.backend.docker.util_rules.docker_build_context.DockerBuildContextError: Failed to parse Dockerfile baseimage tag for stage stage0 in // target, from image ref: docker.io.
pantsbuild/pants
cool-easter-32542
07/01/2025, 3:07 PM
… e.g. in `pants.toml` like:
[helm_deployment]
crd = ["path_to_crd_definition.py"]
and/or in the target `helm_deployment`:
helm_deployment(crd=["path_to_crd_definition.py"])
Additional context
I try to use pants with argo-workflows, but because this is not a standard kubernetes definition, pants is not able to infer the dependencies. I have played around a little bit with pants and was able to hack support for CRDs into my local pants. As a proof-of-concept I have added the following lines to my local pants source. To `k8s_parser.py` I added:
crd_sources = open("pantsbuild/crd_cron.py", 'rb').read()
if not crd_sources:
    raise ValueError("Unable to find source for crd_cron")
parser_file_content_source = FileContent(
    path="__crd_source.py", content=crd_sources, is_executable=False
)
to include the CRD definiton that is located in my monorepo.
In addition I had to add to `k8s_parser_main.py`:
try:
    import __crd_source
    register_crd_class(__crd_source.MyPlatform, "crd", is_namespaced=False)
except ImportError as e:
    print(f"WARN: No CRD defined: {e}", file=sys.stderr)
within the main function.
The crd_cron.py file looks like this:
from __future__ import annotations
from hikaru.model.rel_1_28.v1 import *
from hikaru import (HikaruBase, HikaruDocumentBase, set_default_release)
from hikaru.crd import HikaruCRDDocumentMixin
from typing import Optional, List
from dataclasses import dataclass

set_default_release("rel_1_28")

@dataclass
class ContainersSpec(Container):
    name: Optional[str]

@dataclass
class TemplatesSpec(HikaruBase):
    name: str
    container: Optional[ContainersSpec]

@dataclass
class WorkflowSpec(HikaruBase):
    templates: List[TemplatesSpec]

@dataclass
class MyPlatformSpec(HikaruBase):
    workflowSpec: WorkflowSpec

@dataclass
class MyPlatform(HikaruDocumentBase, HikaruCRDDocumentMixin):
    metadata: ObjectMeta
    apiVersion: str = "argoproj.io/v1alpha1"
    kind: str = "CronWorkflow"
    spec: Optional[MyPlatformSpec] = None
This is all very hacky but I hope the idea is clear...
pantsbuild/pants
cool-easter-32542
07/01/2025, 8:07 PM
… `node_build_script` also implements the `package` goal, so let's say the CI pipeline runs something like `pants package ::`: the defined `node_build_script` would be executed, e.g.
package_json(
    name="package_json",
    scripts=[
        node_build_script(
            entry_point="only-run",
            output_directories=["foo"],
        ),
    ],
)
Describe the solution you'd like
Maybe a `node_run_script` could be created where just the `run` goal is implemented?
Also, maybe there's no need to require `output_directories` or `output_files` on a `node_run_script`? The current `node_build_script` does require one of them:
pants/src/python/pants/backend/javascript/package/rules.py, lines 138 to 150 in 82b1bb8:
def __post_init__(self) -> None:
    if not (self.output_directories or self.output_files):
        raise ValueError(
            softwrap(
                f"""
                Neither the {NodeBuildScriptOutputDirectoriesField.alias} nor the
                {NodeBuildScriptOutputFilesField.alias} field was provided.

                One of the fields have to be set, or else the {NodeBuildScript.alias}
                output will not be captured for further use in the build.
                """
            )
        )
We have use cases where we wanna run a `yarn ...` command but the output is not necessary; just an exit code != 0 would be enough.
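A hypothetical sketch of what the proposed target could look like; `node_run_script` does not exist in Pants today, and its name and fields are exactly what is being proposed:
# BUILD - hypothetical sketch only.
package_json(
    name="package_json",
    scripts=[
        node_run_script(
            entry_point="only-run",
            # no output_directories/output_files needed: only the exit code matters
        ),
    ],
)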
pantsbuild/pants
cool-easter-32542
07/02/2025, 2:51 PM
… The sandboxer socket lives at <buildroot>/.pants.d/workdir/sandboxer/sandboxer.sock. When I try to enable the sandboxer in my repo and run any goals I get IntrinsicError: materialize_directory() request to sandboxer process failed: transport error
Digging into the details, I can see in `.pants.d/workdir/sandboxer/sandboxer.log`:
6559 2025-07-02T14:24:45.933Z [INFO] Starting up sandboxer with RUST_LOG=INFO and these options: Opt {
socket_path: "/tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python/.pants.d/workdir/sandboxer/sandboxer.sock",
store_options: StoreCliOpt {
local_store_path: Some(
"/home/ecsb/.cache/pants/lmdb_store",
),
cas_server: None,
remote_instance_name: None,
cas_root_ca_cert_file: None,
cas_client_certs_file: None,
cas_client_key_file: None,
cas_oauth_bearer_token_path: None,
upload_chunk_bytes: 3145728,
store_rpc_retries: 3,
store_rpc_concurrency: 128,
store_batch_api_size_limit: 4194304,
header: [],
},
}
Error: Error { kind: InvalidInput, message: "path must be shorter than SUN_LEN" }
I could not figure out how to get a backtrace for the error, but from the paucity of search results for the error string I believe it is coming from: https://github.com/rust-lang/rust/blame/master/library/std/src/os/unix/net/addr.rs#L43
SUN_LEN is not defined there, but from unix(7) on Linux: "The sun_family field always contains AF_UNIX. On Linux, sun_path is 108 bytes in size; see also BUGS, below." (And I think the limit on BSD and friends is 104: https://man.freebsd.org/cgi/man.cgi?unix(4))
Outline of a reproduction:
Using example-python with:
diff --git a/pants.toml b/pants.toml
index 3edccd3..a7d259f 100644
--- a/pants.toml
+++ b/pants.toml
@@ -2,7 +2,7 @@
# Licensed under the Apache License, Version 2.0 (see LICENSE).
[GLOBAL]
-pants_version = "2.26.0"
+pants_version = "2.27.0"
backend_packages.add = [
"pants.backend.build_files.fmt.black",
"pants.backend.python",
@@ -12,6 +12,7 @@ backend_packages.add = [
"pants.backend.python.lint.isort",
"pants.backend.python.typecheck.mypy",
]
+sandboxer=true
[anonymous-telemetry]
enabled = true
I can run `pants fmt` if the repo is in /tmp/sand/example-python but not if it is in /tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python:
$ pwd
/tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python
$ pants fmt ::
10:45:27.27 [INFO] Completed: Scheduling: Test binary /usr/bin/bash.
10:45:27.29 [ERROR] 1 Exception encountered:
Engine traceback:
in `fmt` goal
IntrinsicError: materialize_directory() request to sandboxer process failed: transport error
Pants version
2.27.0
OS
Linux.
Additional info
.pants.d/workdir/sandboxer/sandboxer.sock is already 41 characters, so there is not a lot of room left, particularly in CI environments that often have long checkout paths (actions/runner#1676).
Initial suggestion: move the sandboxer.sock to something like /var/run/pants/<fixed-len-hash-of-buildroot>/sandboxer.sock
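A small sketch of the suggested scheme, showing that a fixed-length digest keeps the socket path well under the sun_path limit regardless of checkout depth; the /var/run/pants layout and the digest length are assumptions:
# Sketch of the suggested socket layout; not current Pants behaviour.
import hashlib

def sandboxer_socket_path(build_root: str) -> str:
    digest = hashlib.sha256(build_root.encode()).hexdigest()[:16]
    return f"/var/run/pants/{digest}/sandboxer.sock"

deep_root = "/tmp/sand/" + "o" * 100 + "/example-python"
print(len(sandboxer_socket_path(deep_root)))  # always 46, comfortably below 104-108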
pantsbuild/pants
cool-easter-32542
07/02/2025, 3:23 PM
$ cat helloworld/initboom/hello.py
print('hello')
$ cat helloworld/initboom/__init__.py
import colors
$ cat helloworld/initboom/BUILD
python_sources(
sources=[*python_sources.sources.default, "!hello.py"],
)
python_sources(
name="hello",
sources=["hello.py"],
overrides={
"hello.py": {"dependencies": ["!./__init__.py"]},
},
)
Demo Commit: cburroughs/example-python@dde22b4
Then Pants will agree that hello.py does not depend on `__init__.py`:
$ pants peek helloworld/initboom/hello.py
[
{
"address": "helloworld/initboom/hello.py:hello",
"target_type": "python_source",
"dependencies": [],
"dependencies_raw": [
"!./__init__.py"
],
"description": null,
"goals": [
"run"
],
"interpreter_constraints": null,
"resolve": null,
"restartable": false,
"run_goal_use_sandbox": null,
"skip_black": false,
"skip_docformatter": false,
"skip_flake8": false,
"skip_isort": false,
"skip_mypy": false,
"source_raw": "hello.py",
"sources": [
"helloworld/initboom/hello.py"
],
"sources_fingerprint": "85cdfddd3b32f75322b7109c802e7b6e57f0d6d1e2bec997f871f1a48ff5fbbb",
"tags": null
}
]
But `pants run` will then go boom with:
$ pants run helloworld/initboom/hello.py
Traceback (most recent call last):
File "/tmp/pants-sandbox-qLd9dw/./.cache/pex_root/venvs/1/0af9aa852e07539345a74140f1edf47e78828232/108a3ddc84230ab282ea6312e06cb68f51008ce5/pex", line 358, in <module>
boot(
File "/tmp/pants-sandbox-qLd9dw/./.cache/pex_root/venvs/1/0af9aa852e07539345a74140f1edf47e78828232/108a3ddc84230ab282ea6312e06cb68f51008ce5/pex", line 341, in boot
runpy.run_module(module_name, run_name="__main__", alter_sys=True)
File "/usr/lib/python3.9/runpy.py", line 221, in run_module
mod_name, mod_spec, code = _get_module_details(mod_name)
File "/usr/lib/python3.9/runpy.py", line 111, in _get_module_details
__import__(pkg_name)
File "/tmp/pants-sandbox-qLd9dw/./helloworld/initboom/__init__.py", line 1, in <module>
import colors
ModuleNotFoundError: No module named 'colors'
This is due to prepare_python_sources unconditionally adding __init__.py files to the sandbox:
pants/src/python/pants/backend/python/util_rules/python_sources.py, line 108 in ffac373:
missing_init_files = await find_ancestor_files(
originally added in #10166
If the exclusion isn't going to work, Pants ought to at least throw a warning/error or something.
Pants version
2.26.0
Additional info
Why am I trying to do this crazy thingy? I'm using Pants and Python to dynamically generate CI pipelines in a monorepo. Some process looks for files matching ci_provider_name_pipeline.py
, and then imports them. This is nice because I get to generate pipelines in a real programming language instead of yaml glop. But leaves me in a weird state with regards to Pants, I want them to be "regular python modules" for the purposes of linting, type checking, and all those good things, and it is convenient for the pipeline for a project to be co-located, but I don't want the pipelines and the projects themselves to share dependencies. (No using tensorflow in your ci pipeline generator!)
With Pants' file based model, these sorts of exclusions make sense, but they are pretty wacky from a "how are python modules expected to work" perspective. So I'm waffling on if this case should be supported at all, but the silently ignoring exclusions behavior feels like a bug regardless.
(A slightly less wacky variant that I think would run into the same issue is python sources from multiple resolves in the same directory)
pantsbuild/pants
cool-easter-32542
07/02/2025, 6:26 PM
… (dirname:dirname) like so:
python_source(
    name='foo',  # no error if this is 'resolve'
    source='foo.py')
python_source(
    name='bar',
    source='bar.py',
    dependencies=['!./foo.py']  # no error if this line is removed
)
pantsbuild/example-python@ec833b2
Then `peek`, `--changed-dependents=transitive`, and visibility rules all throw ResolveErrors.
$ PANTS_SOURCE=/home/ecsb/src/o/2alt-pants pants --print-stacktrace lint --changed-since=origin/main --changed-dependents=transitive
Pantsd has been turned off via Env.
Exception caught: (pants.engine.internals.scheduler.ExecutionError)
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 133, in <module>
main()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 129, in main
PantsLoader.main()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 125, in main
cls.run_default_entrypoint()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 106, in run_default_entrypoint
exit_code = runner.run(start_time)
^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_runner.py", line 150, in run
runner = LocalPantsRunner.create(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/local_pants_runner.py", line 152, in create
specs = calculate_specs(
^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/init/specs_calculator.py", line 105, in calculate_specs
(changed_addresses,) = session.product_request(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 601, in product_request
return self.execute(request)
^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 542, in execute
self._raise_on_error([t for _, t in throws])
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 526, in _raise_on_error
raise ExecutionError(
Exception message: 1 Exception encountered:
Engine traceback:
in root
..
in pants.vcs.changed.find_changed_owners
..
Traceback (most recent call last):
File "/home/ecsb/src/o/2alt-pants/src/python/pants/vcs/changed.py", line 79, in find_changed_owners
dependents = await find_dependents(
^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/dependents.py", line 45, in map_addresses_to_dependents
dependencies_per_target = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 170, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 1700, in resolve_dependencies
await _fill_parameters(
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 1589, in _fill_parameters
parametrizations = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 170, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 443, in resolve_target_parametrizations
adaptor_and_type = await _determine_target_adaptor_and_type(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 182, in _determine_target_adaptor_and_type
target_adaptor = await find_target_adaptor(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/build_files.py", line 543, in find_target_adaptor
target_adaptor = _get_target_adaptor(address, address_family, request.description_of_origin)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/build_files.py", line 524, in _get_target_adaptor
raise ResolveError.did_you_mean(
pants.build_graph.address.ResolveError: The address helloworld/resolve:resolve from the `dependencies` field of the target helloworld/resolve:bar does not exist.
The target name ':resolve' is not defined in the directory helloworld/resolve. Did you mean one of these target names?
* :bar
* :foo
$ PANTS_SOURCE=/home/ecsb/src/o/2alt-pants pants --print-stacktrace peek helloworld/resolve:bar
Pantsd has been turned off via Env.
142601.64 [ERROR] 1 Exception encountered:
Engine traceback:
in root
..
in pants.backend.project_info.peek.peek
`peek` goal
Traceback (most recent call last):
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/peek.py", line 428, in peek
tds = await get_target_data(targets, **implicitly())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in await
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/peek.py", line 307, in get_target_data
dependencies_per_target = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants…
pantsbuild/pants
cool-easter-32542
07/04/2025, 9:33 AM
helloworld/translator/translator.py depends on rich, which is defined in helloworld/translator/BUILD and helloworld/greet/BUILD. With ambiguity_resolution = "by_source_root", I would expect this not to be ambiguous, but the one in helloworld/translator/BUILD to be used. However, when I run pants --no-local-cache peek helloworld/translator/translator.py, I get an unresolvable ambiguity:
09:29:59.13 [WARN] The target helloworld/translator/translator.py:lib imports `rich`, but Pants cannot safely infer a dependency because more than one target owns this module, so it is ambiguous which to use: ['helloworld/greet:greet', 'helloworld/translator:translator'].
Please explicitly include the dependency you want in the `dependencies` field of helloworld/translator/translator.py:lib, or ignore the ones you do not want by prefixing with `!` or `!!` so that one or no targets are left.
Alternatively, you can remove the ambiguity by deleting/changing some of the targets so that only 1 target owns this module. Refer to <https://www.pantsbuild.org/2.26/docs/using-pants/troubleshooting-common-issues#import-errors-and-missing-dependencies>.
09:29:59.13 [WARN] Pants cannot infer owners for the following imports in the target helloworld/translator/translator.py:lib:
* rich (line: 9)
If you do not expect an import to be inferable, add `# pants: no-infer-dep` to the import line. Otherwise, see <https://www.pantsbuild.org/2.26/docs/using-pants/troubleshooting-common-issues#import-errors-and-missing-dependencies> for common problems.
[
{
"address": "helloworld/translator/translator.py:lib",
"target_type": "python_source",
"dependencies": [],
"dependencies_raw": null,
"description": null,
"goals": [
"run"
],
"interpreter_constraints": null,
"resolve": null,
"restartable": false,
"run_goal_use_sandbox": null,
"skip_black": false,
"skip_docformatter": false,
"skip_flake8": false,
"skip_isort": false,
"skip_mypy": false,
"source_raw": "translator.py",
"sources": [
"helloworld/translator/translator.py"
],
"sources_fingerprint": "e71e261f5a1f4adbc68b626ad12c6e1de9376ac9945e204b1041388f6a52c705",
"tags": null
}
]
Pants version
2.26.0
OS
Linux
Additional info
I have a suspicion this bug is due to os.path.commonpath as used here. This function, because of the missing trailing slash, has:
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/greet:greet"]
) == "helloworld"
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/translator:translator"]
) == "helloworld"
If there were a trailing slash, then it would be able to see a difference:
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/translator/:translator"]
) == "helloworld/translator"
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/greet/:greet"]
) == "helloworld"
pantsbuild/pants