cool-easter-32542
06/23/2025, 6:34 PM
…`yarn install`, `tsc`, `webpack`, and `jest`. `concurrency` was introduced in Pants 2.27, but it's currently only available to the plugin API, and is not exposed to BUILD targets.
Describe the solution you'd like
It's unclear to me whether any heuristic could be used to infer a required concurrency level automatically from within the nodejs backend. Absent that, we'd either need to:
• Expose the `concurrency` field to the targets (`package_json`, `node_build_script`, `javascript_tests`, `typescript_tests`)
• Allow a plugin to provide the concurrency level prior to process execution
Describe alternatives you've considered
Our current workaround for this is to override/copy the `setup_node_tool_process` rule, which isn't ideal.
Additional context
Our current machines in CI have 4 cores; setting `"concurrency": ProcessConcurrency.exactly(2)` has been effective so far at not overscheduling the machines while still allowing runs to complete without running out of RAM.
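The effect of capping concurrency this way can be sketched outside Pants with a semaphore (illustrative only; this is not the plugin API):

```python
import asyncio

# Cap how many "tool processes" run at once, analogous to
# ProcessConcurrency.exactly(2) on a 4-core machine.
async def run_tool(sem: asyncio.Semaphore, name: str) -> str:
    async with sem:
        await asyncio.sleep(0.01)  # stand-in for jest/tsc/webpack work
        return name

async def main() -> list[str]:
    sem = asyncio.Semaphore(2)  # at most two jobs in flight at any time
    return await asyncio.gather(*(run_tool(sem, f"job{i}") for i in range(6)))

results = asyncio.run(main())
print(results)
```

All six jobs still run to completion; only the in-flight count is bounded, which is why it avoids both overscheduling and memory exhaustion.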
pantsbuild/pants
cool-easter-32542
06/25/2025, 4:43 AM
.
├── pants.toml
├── src
│ └── python
│ ├── lib1
│ │ ├── __init__.py
│ │ ├── BUILD
│ │ ├── lib1.py
│ │ └── pyproject.toml
│ ├── package1
│ │ ├── __init__.py
│ │ ├── BUILD
│ │ ├── package1.py
│ │ └── pyproject.toml
│ └── package2
│ ├── __init__.py
│ ├── BUILD
│ ├── package2.py
│ └── pyproject.toml
└── test
└── python
└── tests
├── __init__.py
└── package1
├── __init__.py
├── BUILD
├── pyproject.toml
└── test_package1.py
Basically, both package1 and package2 depend on lib1. I want to publish package1 and package2 as libraries, but keep lib1 as an internal library which won't be published. And the package1 directory in the test folder is to test package1. I can run `pants generate-lockfiles` and `pants test test/python/tests/package1:tests` without any problem. But when I do `pants package src/python/package1:dist`, I get the following error:
% pants package src/python/package1:dist
21:35:50.43 [ERROR] 1 Exception encountered:
Engine traceback:
in `package` goal
NoOwnerError: No python_distribution target found to own src/python/lib1/__init__.py. Note that the owner must be in or above the owned target's directory, and must depend on it (directly or indirectly). See <https://www.pantsbuild.org/2.25/docs/python/overview/building-distributions> for how python_sources targets are mapped to distributions. See <https://www.pantsbuild.org/2.25/docs/python/overview/building-distributions>.
Could you please let me know what the problem is and how to fix it? Also, I am not quite sure about lib1: in package1 or package2, should I put the lib1 dependency in the BUILD file as in my current code, or can I put it in the pyproject.toml like below (tested, does not work; not sure if it is a path issue)?
dependencies = [
"tornado==6.4.2",
"lib1 @ file:///${PROJECT_ROOT}/../lib1"
]
Please help, thank you!
pantsbuild/pants
cool-easter-32542
06/25/2025, 6:05 AM
…`resolve` and `lockfile` to generate the exported virtual env for any dependencies you like. But I want to streamline this for each package in a monorepo. What I want to achieve is being able to switch to any package's dedicated virtual env (meaning one with no dependencies from other packages) in the monorepo. The problem with `resolve` is that if you make package1's dependencies use `resolve="package1"`, then package2, which depends on package1, must also use `resolve="package1"` in order to include all the dependencies in its virtual env. But this means package1's virtual env will also include package2's dependencies, which is not what we want.
So one way to make a fully isolated virtual env, assuming we have lib1 plus package1 and package2 which both depend on lib1, is to do the following:
• for lib1's dependencies, set `resolve=parametrize("lib1", "package1", "package2")`
• for package1, set `resolve="package1"`
• for package2, set `resolve="package2"`
In this case, we can export these three virtual envs for these three packages. But since the monorepo has lots of packages that depend on each other, this method does not scale: if a lib is needed by N packages, it needs `resolve=parametrize("lib1", "package1", "package2", ..., "packageN")`.
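The scaling concern can be sketched with a toy dependency graph (hypothetical names): the set of resolves an internal lib must be parametrized over is exactly itself plus every package whose resolve transitively pulls it in.

```python
# Hypothetical monorepo graph: target -> its in-repo dependencies.
deps = {
    "package1": ["lib1"],
    "package2": ["lib1"],
    "lib1": [],
}

def resolves_needed(lib: str, deps: dict[str, list[str]]) -> list[str]:
    """Resolves `lib` must be parametrized over: itself plus every
    target that (transitively) depends on it."""
    dependents: set[str] = set()
    changed = True
    while changed:
        changed = False
        for pkg, ds in deps.items():
            if pkg == lib or pkg in dependents:
                continue
            # direct dependent, or dependent of an already-found dependent
            if lib in ds or dependents & set(ds):
                dependents.add(pkg)
                changed = True
    return sorted({lib} | dependents)

print(resolves_needed("lib1", deps))  # grows linearly with dependent packages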
I am wondering whether it is possible to reuse a resolve, for example:
• for lib1, set `resolve="lib1"`
• for package1, set `resolve="package1", resolve_reuse=["lib1", ...]`
• for package2, set `resolve="package2", resolve_reuse=["lib1", ...]`
Do we have something similar to this, or is there some other way to do it? Please advise, thank you!
pantsbuild/pants
cool-easter-32542
06/30/2025, 1:16 AM
cool-easter-32542
06/30/2025, 1:17 AM
cool-easter-32542
06/30/2025, 7:57 PM
…Run ./pants auth-acquire to set up authentication.')
134853.66 [WARN] Auth failed - BuildSense plugin is disabled.
134909.21 [INFO] Completed: Building generate_all_lockfiles_helper.pex with 10 requirements: PyYAML<7.0,>=6.0, ansicolors==1.1.8, packaging==21.0, pex==2.1.53, setuptools<58.0,>=56.0.0, toml==0.10.2, types-PyYAML==5.4.3, types-s... (65 characters truncated)
134909.46 [INFO] Completed: Building local_dists.pex
134910.97 [INFO] Starting: Resolving plugins: hdrhistogram, toolchain.pants.plugin==0.15.0
134922.82 [INFO] Completed: Resolving plugins: hdrhistogram, toolchain.pants.plugin==0.15.0
134923.31 [WARN] Error loading access token: AuthError('Failed to load auth token (no default file or environment variable). Run ./pants auth-acquire to set up authentication.')
134923.31 [WARN] Auth failed - BuildSense plugin is disabled.
134954.28 [INFO] Completed: Building poetry.pex with 1 requirement: poetry==1.1.8
135041.21 [INFO] Completed: Generate lockfile for flake8
135330.78 [INFO] Completed: Generate lockfile for pytest
135331.35 [INFO] Wrote lockfile for the resolve pytest to 3rdparty/python/lockfiles/pytest.txt
135331.35 [INFO] Wrote lockfile for the resolve flake8 to 3rdparty/python/lockfiles/flake8.txt
135351.93 [INFO] Completed: Building dockerfile_parser.pex from dockerfile-parser_default_lockfile.txt
135429.16 [ERROR] Exception caught: (pants.engine.internals.scheduler.ExecutionError)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 235, in _run_inner
return self._perform_run(goals)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 174, in _perform_run
return self._perform_run_body(goals, poll=False)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 196, in _perform_run_body
poll_delay=(0.1 if poll else None),
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/init/engine_initializer.py", line 135, in run_goal_rules
goal_product, params, poll=poll, poll_delay=poll_delay
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 548, in run_goal_rule
self._raise_on_error([t for _, t in throws])
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 512, in _raise_on_error
wrapped_exceptions=tuple(t.exc for t in throws),
Exception message: 1 Exception encountered:
Engine traceback:
in select
in pants.backend.experimental.python.user_lockfiles.generate_user_lockfile_goal
in pants.engine.internals.graph.transitive_targets
in pants.engine.internals.graph.transitive_dependency_mapping
in pants.engine.internals.graph.resolve_targets (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.engine.internals.graph.resolve_unexpanded_targets (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.engine.internals.graph.resolve_dependencies (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.backend.java.dependency_inference.rules.infer_java_dependencies_via_imports (src/python/pants/backend/java/dependency_inference/PantsJavaParserLauncher.java:javaparser)
in pants.backend.java.dependency_inference.package_mapper.merge_first_party_module_mappings
in pants.backend.java.dependency_inference.package_mapper.map_first_party_java_targets_to_symbols
in pants.backend.java.dependency_inference.java_parser.resolve_fallible_result_to_analysis
in pants.backend.java.dependency_inference.java_parser.analyze_java_source_dependencies
in pants.backend.java.dependency_inference.java_parser_launcher.build_processors
in pants.jvm.resolve.coursier_fetch.materialize_classpath
in pants.jvm.resolve.coursier_fetch.coursier_fetch_lockfile
in pants.jvm.resolve.coursier_fetch.coursier_fetch_one_coord
in pants.engine.process.fallible_to_exec_result_or_raise
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/process.py", line 278, in fallible_to_exec_result_or_raise
local_cleanup=global_options.options.process_execution_local_cleanup,
pants.engine.process.ProcessExecutionFailure: Process 'Resolving with coursier: com.google.errorprone:error_prone_annotations:2.5.1' failed with exit code 126.
stdout:
stderr:
+ coursier_exe=./cs-x86_64-pc-linux
+ shift
+ json_output_file=coursier_report.json
+ shift
+ ./cs-x86_64-pc-linux fetch --json-output-file=coursier_report.json --intransitive com.google.errorprone:error_prone_annotations:2.5.1
coursier_wrapper_script.sh: line 8: ./cs-x86_64-pc-linux: Text file busy
Use --no-process-execution-local-cleanup to preserve process chroots for inspection.
Traceback (most recent call last):
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 235, in _run_inner
return self._perform_run(goals)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 174, in _perform_run
return self._perform_run_body(goals, poll=False)
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/bin/local_pants_runner.py", line 196, in _perform_run_body
poll_delay=(0.1 if poll else None),
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/init/engine_initializer.py", line 135, in run_goal_rules
goal_product, params, poll=poll, poll_delay=poll_delay
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 548, in run_goal_rule
self._raise_on_error([t for _, t in throws])
File "/home/jsirois/dev/pantsbuild/jsirois-pants/src/python/pants/engine/internals/scheduler.py", line 512, in _raise_on_error
wrapped_exceptions=tuple(t.exc for t in throws),
pants.engine.internals.scheduler.ExecutionError: 1 Exception encountered:
Engine traceback:
in select
in pants.backend.experimental.python.user_lockfiles.generate_user_lockfile_goal
in pants.engine.internals.graph.transitive_targets
in pants.engine.internals.graph.transitive_dependency_mapping
in pants.engine.internals.graph.reso…
pantsbuild/pants
cool-easter-32542
06/30/2025, 7:58 PM
ERROR:root:01:03:14.88 [INFO] Initialization options changed: reinitializing scheduler...
01:03:16.11 [INFO] Scheduler initialized.
01:03:16.79 [ERROR] 1 Exception encountered:
Engine traceback:
in `run` goal
in Prepare environment for running PEXes
in Finding a `python` binary
in Scheduling: Searching for `python3` on PATH=/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin
Exception: Failed to execute: Process {
argv: [
"./find_binary.sh",
"python3",
],
env: {
"PATH": "/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin",
},
working_directory: None,
input_digests: InputDigests {
complete: DirectoryDigest {
digest: Digest {
hash: Fingerprint<1824155fc3b856540105ddc768220126b3d9e72531f69c45e3976178373328f3>,
size_bytes: 91,
},
tree: "Some(..)",
},
nailgun: DirectoryDigest {
digest: Digest {
hash: Fingerprint<e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855>,
size_bytes: 0,
},
tree: "Some(..)",
},
input_files: DirectoryDigest {
digest: Digest {
hash: Fingerprint<1824155fc3b856540105ddc768220126b3d9e72531f69c45e3976178373328f3>,
size_bytes: 91,
},
tree: "Some(..)",
},
immutable_inputs: {},
use_nailgun: {},
},
output_files: {},
output_directories: {},
timeout: None,
execution_slot_variable: None,
concurrency_available: 0,
description: "Searching for `python3` on PATH=/home/buildbot/.local/bin/:/home/buildbot/tools/nvm/current:/home/buildbot/tools/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go:/usr/local/go/bin:/home/buildbot/go:/home/buildbot/go/bin",
level: Debug,
append_only_caches: {},
jdk_home: None,
platform: Linux_x86_64,
cache_scope: PerRestartSuccessful,
execution_strategy: Local,
remote_cache_speculation_delay: 0ns,
}
Error launching process after 1 retry for ETXTBSY. Final error was: Os { code: 22, kind: InvalidInput, message: "Invalid argument" }
Pants version
2.15.1
OS
Linux
Additional info
The log output from the failed build is provided above.
pantsbuild/pants
cool-easter-32542
06/30/2025, 8:01 PM
…the `process_execution::nailgun` module is not used / dead code in practice.
pantsbuild/pants
cool-easter-32542
07/01/2025, 12:21 AM
…`pg_dump`, to test Postgres' view of the final database schema.
This overlaps with the `runtime_package_dependencies` field, especially when wanting to execute an artifact built from the repo. However, that's not a complete solution:
• it doesn't work with `system_binary`
• it will only provide the package output as a file; it won't necessarily be executable (e.g. if the package needs to run on a specific Python version, which may not be available within the test sandbox)
Describe the solution you'd like
Similar to `test_shell_command` (and `adhoc_tool`), add a `runnable_dependencies` field to `python_test`.
For instance:
# BUILD
system_binary(name="pg_dump", binary_name="pg_dump", ...)
python_test(name="test", source="test_foo.py", runnable_dependencies=[":pg_dump"])

# test_foo.py
import subprocess

def test_the_thing():
    subprocess.run(["pg_dump", ...])
Currently, we need to either pass the whole `PATH` into the test (reducing hermeticity), or use a workaround like `export PG_DUMP_PATH=$(which pg_dump)` in `.pants.bootstrap`, plus `python_test(..., extra_env_vars=["PG_DUMP_PATH"])` and `subprocess.run([os.environ["PG_DUMP_PATH"], ...])`.
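A sketch of that workaround on the test side (`PG_DUMP_PATH` is the assumed variable name exported by the bootstrap script; the function name is illustrative):

```python
import os

# Build the pg_dump command line using the env var exported by
# .pants.bootstrap instead of relying on PATH inside the sandbox.
def pg_dump_argv(*args: str) -> list[str]:
    exe = os.environ.get("PG_DUMP_PATH", "pg_dump")  # fall back to PATH lookup
    return [exe, *args]

print(pg_dump_argv("--schema-only", "mydb"))
# a real test would then do: subprocess.run(pg_dump_argv(...), check=True)
```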
Describe alternatives you've considered
N/A
Additional context
Potentially, clarifying the roles and behaviours of the various `dependencies` fields on tests would be helpful (e.g. why can't a test target's `dependencies` just allow packaged targets directly, and thus behave more like `execution_dependencies`?).
pantsbuild/pants
cool-easter-32542
07/01/2025, 1:31 AM
…`PATH`, the semgrep backend starts failing when invoking semgrep, which seems to require some system binaries: `uname`, and (on macOS) `security`.
For instance, this configuration locks down all the paths:
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = [] # Workaround: set to ["/usr/bin"]
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
Output like:
16:31:24.99 [ERROR] Completed: Lint with Semgrep - semgrep failed (exit code 2).
Partition: .semgrep.yml
Fatal error: exception Failure("run ['uname' '-s']: No such file or directory")
Raised at Stdlib.failwith in file "stdlib.ml", line 29, characters 17-33
Called from CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 31, characters 17-27
Re-raised at CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 36, characters 4-11
Called from Conduit_lwt_unix.default_ctx in file "src/conduit-lwt-unix/conduit_lwt_unix.ml", line 158, characters 26-79
Called from CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 31, characters 17-27
Re-raised at CamlinternalLazy.force_lazy_block in file "camlinternalLazy.ml", line 36, characters 4-11
Called from Cohttp_lwt_unix__Net.default_ctx in file "cohttp-lwt-unix/src/net.ml", line 33, characters 10-49
✕ semgrep failed.
A workaround is to ensure that those two binaries specifically are available on the PATH.
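The lookup behavior can be sketched independently of semgrep with `shutil.which` (the temp dir stands in for a directory on `executable_search_paths`; this is not the Pants sandbox code):

```python
import os
import shutil
import tempfile

# Binary lookup is entirely a function of the search PATH: with an empty
# search path nothing is found, no matter what exists on the machine.
d = tempfile.mkdtemp()
exe = os.path.join(d, "uname")
with open(exe, "w") as f:
    f.write("#!/bin/sh\necho Linux\n")
os.chmod(exe, 0o755)

found_empty = shutil.which("uname", path="")  # empty search path
found_dir = shutil.which("uname", path=d)     # dir is on the search path
print(found_empty, found_dir)
```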
Full reproducer, including demonstration of the workaround:
cd $(mktemp -d)
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = []
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
EOF
cat > BUILD <<EOF
file(name="foo", source="foo.txt")
EOF
echo x > foo.txt
cat > .semgrep.yml <<EOF
rules:
- id: x
patterns:
- pattern: x
message: found an x
languages: [generic]
severity: ERROR
EOF
# BUG: Fatal error: exception Failure("run ['uname' '-s']: No such file or directory")
pants lint ::
# WORKAROUND
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.27.0"
backend_packages = [
"pants.backend.experimental.tools.semgrep",
"pants.backend.python.providers.experimental.python_build_standalone",
]
[python]
interpreter_constraints = ["==3.11.*"]
[pex]
executable_search_paths = ["/usr/bin"] # CHANGED
[python-bootstrap]
search_path = []
[subprocess-environment]
env_vars = []
EOF
pants lint ::
Pants version
2.27.0
OS
macOS
Additional info
N/A
pantsbuild/pants
cool-easter-32542
07/01/2025, 4:47 AM
…`pants fmt source/go/lib:` with Pants on a rather large monorepo, it will take forever to download and analyze all Go modules. What am I doing wrong here? Why does it not cache the modules and their analyses?
pants.toml
[GLOBAL]
pants_version = "2.28.0.dev4"
backend_packages = [
"pants.backend.experimental.go"
]
local_store_dir = "~/.cache/pants"
pants_ignore.add = [
"*/pants-tmpdir/**",
]
[anonymous-telemetry]
enabled = false
[source]
root_patterns = ["source/go"]
[golang]
minimum_expected_version = "1.24"
Pants version
2.28.0.dev4
OS
Debian 11
pantsbuild/pants
cool-easter-32542
07/01/2025, 10:22 AM
ARG DOCKER_IO_MIRROR=docker.io
FROM $DOCKER_IO_MIRROR/dperson/samba@sha256:e1d2a7366690749a7be06f72bdbf6a5a7d15726fc84e4e4f41e967214516edfd
The same build works fine using the python parser.
Pants version
2.27.0
OS
Linux
Additional info
I have added some tests to expose the issue here: bdabelow@8311df2
Minimal repro is here: https://github.com/bdabelow/pants-repro/tree/main/rust-dockerfile-parser
Log:
> pants package ::
12:16:12.97 [INFO] Initializing scheduler...
12:16:14.87 [INFO] Scheduler initialized.
12:16:14.97 [INFO] Completed: Scheduling: Test binary /usr/bin/docker.
12:16:14.99 [INFO] Completed: Scheduling: Test binary /bin/docker.
12:16:15.01 [ERROR] 1 Exception encountered:
Engine traceback:
in root
..
in pants.core.goals.package.package_asset
`package` goal
Traceback (most recent call last):
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/core/goals/package.py", line 195, in package_asset
packages = await MultiGet(
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 356, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 163, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/core/goals/package.py", line 146, in environment_aware_package
package = await Get(
^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 113, in __await__
result = yield self
^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/goals/package_image.py", line 394, in build_docker_image
context, wrapped_target = await MultiGet(
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 386, in MultiGet
return await _MultiGet((__arg0, __arg1))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/engine/internals/selectors.py", line 163, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 384, in create_docker_build_context
return DockerBuildContext.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 137, in create
stage_names, tags_values = cls._get_stages_and_tags(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/axccu/.cache/nce/6d61748cea187199dc28418157b2121ccbbb44e4af2ee6fddf8c04bbed73f76d/bindings/venvs/2.27.0/lib/python3.11/site-packages/pants/backend/docker/util_rules/docker_build_context.py", line 186, in _get_stages_and_tags
raise DockerBuildContextError(
pants.backend.docker.util_rules.docker_build_context.DockerBuildContextError: Failed to parse Dockerfile baseimage tag for stage stage0 in // target, from image ref: docker.io.
pantsbuild/pants
cool-easter-32542
07/01/2025, 3:07 PM
…`pants.toml`, like:
[helm_deployment]
crd = ["path_to_crd_definition.py"]
and/or in the target `helm_deployment`:
helm_deployment(crd=["path_to_crd_definition.py"])
Additional context
I try to use Pants with argo-workflows, but because this is not a standard Kubernetes definition, Pants is not able to infer the dependencies. I have played around a little bit with Pants and was able to hack support for CRDs into my local Pants. As a proof of concept, I have added the following lines to my local Pants source.
I have added the following lines to `k8s_parser.py`:
crd_sources = open("pantsbuild/crd_cron.py", "rb").read()
if not crd_sources:
    raise ValueError("Unable to find source for crd_cron")
parser_file_content_source = FileContent(
    path="__crd_source.py", content=crd_sources, is_executable=False
)
to include the CRD definition that is located in my monorepo.
In addition I had to add to `k8s_parser_main.py`:
try:
    import __crd_source
    register_crd_class(__crd_source.MyPlatform, "crd", is_namespaced=False)
except ImportError as e:
    print(f"WARN: No CRD defined: {e}", file=sys.stderr)
within the main function.
The crd_cron.py file looks like this:
from __future__ import annotations

from dataclasses import dataclass
from typing import List, Optional

from hikaru import HikaruBase, HikaruDocumentBase, set_default_release
from hikaru.crd import HikaruCRDDocumentMixin
from hikaru.model.rel_1_28.v1 import *

set_default_release("rel_1_28")

@dataclass
class ContainersSpec(Container):
    name: Optional[str]

@dataclass
class TemplatesSpec(HikaruBase):
    name: str
    container: Optional[ContainersSpec]

@dataclass
class WorkflowSpec(HikaruBase):
    templates: List[TemplatesSpec]

@dataclass
class MyPlatformSpec(HikaruBase):
    workflowSpec: WorkflowSpec

@dataclass
class MyPlatform(HikaruDocumentBase, HikaruCRDDocumentMixin):
    metadata: ObjectMeta
    apiVersion: str = "argoproj.io/v1alpha1"
    kind: str = "CronWorkflow"
    spec: Optional[MyPlatformSpec] = None
This is all very hacky but I hope the idea is clear...
pantsbuild/pants
cool-easter-32542
07/01/2025, 8:07 PM
`node_build_script` also implements the `package` goal, so say the CI pipeline runs something like `pants package ::`; the defined `node_build_script` would be executed, e.g.
package_json(
    name="package_json",
    scripts=[
        node_build_script(
            entry_point="only-run",
            output_directories=["foo"],
        ),
    ],
)
Describe the solution you'd like
Maybe a `node_run_script` could be created where just the `run` goal is implemented?
Also, maybe there's no need to require `output_directories` or `output_files` on a `node_run_script`? The current `node_build_script` does require one of them:
pants/src/python/pants/backend/javascript/package/rules.py, lines 138 to 150 at 82b1bb8:
def __post_init__(self) -> None:
    if not (self.output_directories or self.output_files):
        raise ValueError(
            softwrap(
                f"""
                Neither the {NodeBuildScriptOutputDirectoriesField.alias} nor the
                {NodeBuildScriptOutputFilesField.alias} field was provided.

                One of the fields have to be set, or else the {NodeBuildScript.alias}
                output will not be captured for further use in the build.
                """
            )
        )
We have use cases where we want to run a `yarn ...` command but the output is not needed; just failing when the exit code != 0 would be enough.
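A hypothetical sketch of what this could look like in a BUILD file (`node_run_script` and its field semantics are the proposal here, not an existing Pants target):

```python
package_json(
    name="package_json",
    scripts=[
        # Hypothetical target: participates only in the `run` goal,
        # captures no outputs; success is simply exit code 0.
        node_run_script(
            entry_point="only-run",
        ),
    ],
)
```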
pantsbuild/pants
cool-easter-32542
07/02/2025, 2:51 PM
…`<buildroot>/.pants.d/workdir/sandboxer/sandboxer.sock`. When I try to enable the sandboxer in my repo and run any goal, I get `IntrinsicError: materialize_directory() request to sandboxer process failed: transport error`.
Digging into the details, I can see in `.pants.d/workdir/sandboxer/sandboxer.log`:
6559 2025-07-02T14:24:45.933Z [INFO] Starting up sandboxer with RUST_LOG=INFO and these options: Opt {
socket_path: "/tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python/.pants.d/workdir/sandboxer/sandboxer.sock",
store_options: StoreCliOpt {
local_store_path: Some(
"/home/ecsb/.cache/pants/lmdb_store",
),
cas_server: None,
remote_instance_name: None,
cas_root_ca_cert_file: None,
cas_client_certs_file: None,
cas_client_key_file: None,
cas_oauth_bearer_token_path: None,
upload_chunk_bytes: 3145728,
store_rpc_retries: 3,
store_rpc_concurrency: 128,
store_batch_api_size_limit: 4194304,
header: [],
},
}
Error: Error { kind: InvalidInput, message: "path must be shorter than SUN_LEN" }
I could not figure out how to get a backtrace for the error, but from the paucity of search results for the error string I believe it is coming from: https://github.com/rust-lang/rust/blame/master/library/std/src/os/unix/net/addr.rs#L43
`SUN_LEN` is not defined there, but from `unix(7)` on Linux: "The sun_family field always contains AF_UNIX. On Linux, sun_path is 108 bytes in size; see also BUGS, below." (And I think the limit on BSD and friends is 104: https://man.freebsd.org/cgi/man.cgi?unix(4))
Outline of a reproduction:
Using example-python with:
diff --git a/pants.toml b/pants.toml
index 3edccd3..a7d259f 100644
--- a/pants.toml
+++ b/pants.toml
@@ -2,7 +2,7 @@
# Licensed under the Apache License, Version 2.0 (see LICENSE).
[GLOBAL]
-pants_version = "2.26.0"
+pants_version = "2.27.0"
backend_packages.add = [
"pants.backend.build_files.fmt.black",
"pants.backend.python",
@@ -12,6 +12,7 @@ backend_packages.add = [
"pants.backend.python.lint.isort",
"pants.backend.python.typecheck.mypy",
]
+sandboxer=true
[anonymous-telemetry]
enabled = true
I can run `pants fmt` if the repo is in /tmp/sand/example-python but not if it is in /tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python:
$ pwd
/tmp/sand/oooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo/example-python
$ pants fmt ::
10:45:27.27 [INFO] Completed: Scheduling: Test binary /usr/bin/bash.
10:45:27.29 [ERROR] 1 Exception encountered:
Engine traceback:
in `fmt` goal
IntrinsicError: materialize_directory() request to sandboxer process failed: transport error
Pants version
2.27.0
OS
Linux.
Additional info
`.pants.d/workdir/sandboxer/sandboxer.sock` is already 41 characters, so there is not a lot of room left, particularly in CI environments that often have long checkout paths (actions/runner#1676).
Initial suggestion: move the `sandboxer.sock` to something like `/var/run/pants/<fixed-len-hash-of-buildroot>/sandboxer.sock`.
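A sketch of that suggestion (the base path and hash length are illustrative assumptions, not Pants behavior):

```python
import hashlib

# Key the socket path on a fixed-length hash of the build root, so its
# length no longer depends on how deep the checkout lives.
def sandboxer_sock_path(buildroot: str) -> str:
    digest = hashlib.sha256(buildroot.encode()).hexdigest()[:16]
    return f"/var/run/pants/{digest}/sandboxer.sock"

short = sandboxer_sock_path("/tmp/sand/example-python")
deep = sandboxer_sock_path("/tmp/sand/" + "o" * 150 + "/example-python")
print(short)
print(len(short) == len(deep))  # constant length, always under sun_path
```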
pantsbuild/pants
cool-easter-32542
07/02/2025, 3:23 PM
$ cat helloworld/initboom/hello.py
print('hello')
$ cat helloworld/initboom/__init__.py
import colors
$ cat helloworld/initboom/BUILD
python_sources(
    sources=[*python_sources.sources.default, "!hello.py"],
)
python_sources(
    name="hello",
    sources=["hello.py"],
    overrides={
        "hello.py": {"dependencies": ["!./__init__.py"]},
    },
)
Demo Commit: cburroughs/example-python@dde22b4
Then Pants will agree that `hello.py` does not depend on `__init__.py`:
$ pants peek helloworld/initboom/hello.py
[
{
"address": "helloworld/initboom/hello.py:hello",
"target_type": "python_source",
"dependencies": [],
"dependencies_raw": [
"!./__init__.py"
],
"description": null,
"goals": [
"run"
],
"interpreter_constraints": null,
"resolve": null,
"restartable": false,
"run_goal_use_sandbox": null,
"skip_black": false,
"skip_docformatter": false,
"skip_flake8": false,
"skip_isort": false,
"skip_mypy": false,
"source_raw": "hello.py",
"sources": [
"helloworld/initboom/hello.py"
],
"sources_fingerprint": "85cdfddd3b32f75322b7109c802e7b6e57f0d6d1e2bec997f871f1a48ff5fbbb",
"tags": null
}
]
But `pants run` will then go boom with:
$ pants run helloworld/initboom/hello.py
Traceback (most recent call last):
File "/tmp/pants-sandbox-qLd9dw/./.cache/pex_root/venvs/1/0af9aa852e07539345a74140f1edf47e78828232/108a3ddc84230ab282ea6312e06cb68f51008ce5/pex", line 358, in <module>
boot(
File "/tmp/pants-sandbox-qLd9dw/./.cache/pex_root/venvs/1/0af9aa852e07539345a74140f1edf47e78828232/108a3ddc84230ab282ea6312e06cb68f51008ce5/pex", line 341, in boot
runpy.run_module(module_name, run_name="__main__", alter_sys=True)
File "/usr/lib/python3.9/runpy.py", line 221, in run_module
mod_name, mod_spec, code = _get_module_details(mod_name)
File "/usr/lib/python3.9/runpy.py", line 111, in _get_module_details
__import__(pkg_name)
File "/tmp/pants-sandbox-qLd9dw/./helloworld/initboom/__init__.py", line 1, in <module>
import colors
ModuleNotFoundError: No module named 'colors'
This is due to `prepare_python_sources` unconditionally adding `__init__.py` files to the sandbox:
pants/src/python/pants/backend/python/util_rules/python_sources.py, line 108 at ffac373:
missing_init_files = await find_ancestor_files(
originally added in #10166.
If the exclusion isn't going to work, Pants ought to at least throw a warning or error.
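The runtime failure mode is independent of Pants; a minimal standalone sketch (hypothetical temp package, with a deliberately unimportable name in `__init__.py`):

```python
import os
import sys
import tempfile

# Python always executes a package's __init__.py before any module inside
# it, regardless of what the build graph claims about dependencies.
d = tempfile.mkdtemp()
pkg = os.path.join(d, "initboom")
os.makedirs(pkg)
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("import colors_that_do_not_exist\n")
with open(os.path.join(pkg, "hello.py"), "w") as f:
    f.write("print('hello')\n")

sys.path.insert(0, d)
boom = None
try:
    import initboom.hello  # never reaches hello.py
except ModuleNotFoundError as e:
    boom = str(e)
print("boom:", boom)
```

So even if the sandbox-level exclusion worked, running `hello.py` as a module inside the package would still execute `__init__.py` first.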
Pants version
2.26.0
Additional info
Why am I trying to do this crazy thingy? I'm using Pants and Python to dynamically generate CI pipelines in a monorepo. Some process looks for files matching `ci_provider_name_pipeline.py` and then imports them. This is nice because I get to generate pipelines in a real programming language instead of yaml glop. But it leaves me in a weird state with regards to Pants: I want them to be "regular python modules" for the purposes of linting, type checking, and all those good things, and it is convenient for a project's pipeline to be co-located, but I don't want the pipelines and the projects themselves to share dependencies. (No using tensorflow in your ci pipeline generator!)
With Pants' file based model, these sorts of exclusions make sense, but they are pretty wacky from a "how are python modules expected to work" perspective. So I'm waffling on if this case should be supported at all, but the silently ignoring exclusions behavior feels like a bug regardless.
(A slightly less wacky variant that I think would run into the same issue is python sources from multiple resolves in the same directory)
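A plain-Python sketch of the suspected failure mode, under the assumption (hypothetical, names invented) that ancestor `__init__.py` files are derived from the source paths alone, so BUILD-level `!`-exclusions never enter the computation:

```python
from pathlib import PurePosixPath

def ancestor_init_files(source: str) -> list[str]:
    """Hypothetical model: collect __init__.py from every ancestor directory
    of a source file, with no knowledge of dependency exclusions."""
    return [
        str(parent / "__init__.py")
        for parent in PurePosixPath(source).parents
        if str(parent) != "."
    ]

# The excluded helloworld/initboom/__init__.py still lands in the sandbox:
print(ancestor_init_files("helloworld/initboom/hello.py"))
# ['helloworld/initboom/__init__.py', 'helloworld/__init__.py']
```

Under this model there is no point at which an exclusion could take effect, which would explain why it is silently ignored.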
pantsbuild/pants
cool-easter-32542
07/02/2025, 6:26 PM
dirname:dirname
) like so:
python_source(
name='foo', # no error if this is 'resolve'
source='foo.py')
python_source(
name='bar',
source='bar.py',
dependencies=['!./foo.py'] # no error if this line is removed
)
pantsbuild/example-python@ec833b2
Then peek
, --changed-dependents=transitive
, and visibility rules, all throw ResolveErrors.
$ PANTS_SOURCE=/home/ecsb/src/o/2alt-pants pants --print-stacktrace lint --changed-since=origin/main --changed-dependents=transitive
Pantsd has been turned off via Env.
Exception caught: (pants.engine.internals.scheduler.ExecutionError)
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 133, in <module>
main()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 129, in main
PantsLoader.main()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 125, in main
cls.run_default_entrypoint()
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_loader.py", line 106, in run_default_entrypoint
exit_code = runner.run(start_time)
^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/pants_runner.py", line 150, in run
runner = LocalPantsRunner.create(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/bin/local_pants_runner.py", line 152, in create
specs = calculate_specs(
^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/init/specs_calculator.py", line 105, in calculate_specs
(changed_addresses,) = session.product_request(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 601, in product_request
return self.execute(request)
^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 542, in execute
self._raise_on_error([t for _, t in throws])
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/scheduler.py", line 526, in _raise_on_error
raise ExecutionError(
Exception message: 1 Exception encountered:
Engine traceback:
in root
..
in pants.vcs.changed.find_changed_owners
..
Traceback (most recent call last):
File "/home/ecsb/src/o/2alt-pants/src/python/pants/vcs/changed.py", line 79, in find_changed_owners
dependents = await find_dependents(
^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/dependents.py", line 45, in map_addresses_to_dependents
dependencies_per_target = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 170, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 1700, in resolve_dependencies
await _fill_parameters(
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 1589, in _fill_parameters
parametrizations = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 170, in __await__
result = yield self.gets
^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 443, in resolve_target_parametrizations
adaptor_and_type = await _determine_target_adaptor_and_type(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/graph.py", line 182, in _determine_target_adaptor_and_type
target_adaptor = await find_target_adaptor(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/build_files.py", line 543, in find_target_adaptor
target_adaptor = _get_target_adaptor(address, address_family, request.description_of_origin)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/build_files.py", line 524, in _get_target_adaptor
raise ResolveError.did_you_mean(
pants.build_graph.address.ResolveError: The address helloworld/resolve:resolve from the `dependencies` field of the target helloworld/resolve:bar does not exist.
The target name ':resolve' is not defined in the directory helloworld/resolve. Did you mean one of these target names?
* :bar
* :foo
$ PANTS_SOURCE=/home/ecsb/src/o/2alt-pants pants --print-stacktrace peek helloworld/resolve:bar
Pantsd has been turned off via Env.
14:26:01.64 [ERROR] 1 Exception encountered:
Engine traceback:
in root
..
in pants.backend.project_info.peek.peek
`peek` goal
Traceback (most recent call last):
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/peek.py", line 428, in peek
tds = await get_target_data(targets, **implicitly())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/rules.py", line 71, in wrapper
return await call
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 78, in __await__
result = yield self
^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/backend/project_info/peek.py", line 307, in get_target_data
dependencies_per_target = await concurrently(
^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants/src/python/pants/engine/internals/selectors.py", line 363, in MultiGet
return await _MultiGet(tuple(__arg0))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ecsb/src/o/2alt-pants…
pantsbuild/pants
cool-easter-32542
07/04/2025, 9:33 AM
helloworld/translator/translator.py
depends on rich
, which is defined in helloworld/translator/BUILD
and helloworld/greet/BUILD
. With ambiguity_resolution = "by_source_root"
, I would expect this not to be ambiguous, but the one in helloworld/translator/BUILD
to be used. However, when I run
pants --no-local-cache peek helloworld/translator/translator.py
, I get an unresolvable ambiguity:
09:29:59.13 [WARN] The target helloworld/translator/translator.py:lib imports `rich`, but Pants cannot safely infer a dependency because more than one target owns this module, so it is ambiguous which to use: ['helloworld/greet:greet', 'helloworld/translator:translator'].
Please explicitly include the dependency you want in the `dependencies` field of helloworld/translator/translator.py:lib, or ignore the ones you do not want by prefixing with `!` or `!!` so that one or no targets are left.
Alternatively, you can remove the ambiguity by deleting/changing some of the targets so that only 1 target owns this module. Refer to <https://www.pantsbuild.org/2.26/docs/using-pants/troubleshooting-common-issues#import-errors-and-missing-dependencies>.
09:29:59.13 [WARN] Pants cannot infer owners for the following imports in the target helloworld/translator/translator.py:lib:
* rich (line: 9)
If you do not expect an import to be inferable, add `# pants: no-infer-dep` to the import line. Otherwise, see <https://www.pantsbuild.org/2.26/docs/using-pants/troubleshooting-common-issues#import-errors-and-missing-dependencies> for common problems.
[
{
"address": "helloworld/translator/translator.py:lib",
"target_type": "python_source",
"dependencies": [],
"dependencies_raw": null,
"description": null,
"goals": [
"run"
],
"interpreter_constraints": null,
"resolve": null,
"restartable": false,
"run_goal_use_sandbox": null,
"skip_black": false,
"skip_docformatter": false,
"skip_flake8": false,
"skip_isort": false,
"skip_mypy": false,
"source_raw": "translator.py",
"sources": [
"helloworld/translator/translator.py"
],
"sources_fingerprint": "e71e261f5a1f4adbc68b626ad12c6e1de9376ac9945e204b1041388f6a52c705",
"tags": null
}
]
Pants version
2.26.0
OS
Linux
Additional info
I have a suspicion this bug is due to os.path.commonpath
as used here. This function, because of the missing trailing slash, cannot tell the two owners apart:
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/greet:greet"]
) == "helloworld"
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/translator:translator"]
) == "helloworld"
If there were a trailing slash, then it would be able to see a difference:
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/translator/:translator"]
) == "helloworld/translator"
assert os.path.commonpath(
    ["helloworld/translator/translator.py", "helloworld/greet/:greet"]
) == "helloworld"
pantsbuild/pants
cool-easter-32542
07/08/2025, 2:21 AM
cool-easter-32542
07/10/2025, 6:54 AM
foo.d.ts
, that contain just type declarations, similar to .pyi
in Python. One can import from them like normal, e.g. import { SomeType } from './foo';
.
Pants doesn't currently infer these.
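A minimal sketch of what inference could do here, with hypothetical helper names: treat a relative import specifier as satisfiable by declaration-only files as well as regular sources.

```python
def candidate_files(specifier: str) -> list[str]:
    """Hypothetical resolution: './foo' may be backed by foo.ts, foo.tsx,
    or a declaration-only foo.d.ts (extension list illustrative)."""
    base = specifier.removeprefix("./")
    return [f"{base}{ext}" for ext in (".ts", ".tsx", ".d.ts")]

print(candidate_files("./declaration"))
# ['declaration.ts', 'declaration.tsx', 'declaration.d.ts']
```

The point is only that `.d.ts` belongs in the candidate list; the real resolver would also have to consult the on-disk files and tsconfig settings.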
Reproducer, using this typescript file:
import { SomeType } from './declaration';
const f = (x: SomeType) => {};
cd $(mktemp -d)
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.29.0.dev0"
backend_packages = [
"pants.backend.experimental.javascript",
"pants.backend.experimental.typescript",
]
EOF
echo 'typescript_sources(name="ts")' > BUILD
echo 'export interface SomeType {}' > declaration.d.ts
cat > main.ts <<EOF
import { SomeType } from './declaration';
const f = (x: SomeType) => {};
EOF
# BUG: "Pants cannot infer owners for the following imports in the target //main.ts:ts ... ./declaration"
pants dependencies main.ts
# Baseline: tsc main.ts runs successfully
Pants version
2.29.0.dev0
OS
macOS
Additional info
N/A
pantsbuild/pants
cool-easter-32542
07/10/2025, 7:42 AM
fs
, net
. They can be imported either directly, or with a node:
disambiguating prefix:
// direct import
import { access } from 'fs';
import { connect } from 'net';
// qualified import
import { arch } from 'node:os';
import { basename } from 'node:path';
Pants will give dependency-inference warnings about the first two, which lack the node:
prefix. This seems undesirable, and will, I imagine, require updating swathes of real-world code to adopt pants.
(This is driven by the _is_node_builtin_module
function. For comparison, the Python BE has a long hard-coded list of stdlib modules in _STDLIB_MODULES
, to be able to understand that import <some.stdlib.module>
will be fine.)
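A sketch of a fix, assuming a hard-coded builtin set analogous to Python's _STDLIB_MODULES (the set below is an illustrative subset, not the full list):

```python
# Illustrative subset of Node builtin module names (assumption: the real
# implementation would carry the complete list, like _STDLIB_MODULES does).
NODE_BUILTINS = frozenset({"fs", "net", "os", "path", "url", "util"})

def is_node_builtin(module: str) -> bool:
    # Accept both the bare name ('fs') and the qualified form ('node:fs').
    return module.removeprefix("node:") in NODE_BUILTINS

assert is_node_builtin("fs")
assert is_node_builtin("node:os")
assert not is_node_builtin("lodash")
```

With such a check in place, bare builtin imports would simply be skipped by inference instead of producing unownable-import warnings.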
Reproducer:
cd $(mktemp -d)
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.29.0.dev0"
backend_packages = [
"pants.backend.experimental.javascript",
]
EOF
echo 'javascript_sources(name="js")' > BUILD
cat > main.mjs <<EOF
// direct import
import { access } from 'fs';
import { connect } from 'net';
// qualified import
import { arch } from 'node:os';
import { basename } from 'node:path';
EOF
# BUG: "[WARN] Pants cannot infer owners for the following imports in the target //main.mjs:js: ... fs ... net"
pants dependencies main.mjs
# Baseline: this runs fine
node main.mjs
Pants version
2.29.0.dev0
OS
macOS
Additional info
n/a
pantsbuild/pants
cool-easter-32542
07/11/2025, 4:43 AM
tsconfig.json
/ jsconfig.json
specifies options for TypeScript and IDEs, per tsconfig.py
, and can be layered via extends
referring to 'parent' configurations
pants/src/python/pants/backend/typescript/tsconfig.py, lines 3 to 8 in 9e9a8c2:
"""tsconfig.json is primarily used by the typescript compiler in order to resolve types during
compilation. The format is also used by IDE:s to provide intellisense. The format is used for
projects that use only javascript, and is then named jsconfig.json.

See https://code.visualstudio.com/docs/languages/jsconfig
"""
For instance, these tsconfig.json
and package.json
files:
{
"extends": "@tsconfig/node20/tsconfig.json"
}
{
...,
"dependencies": {
"@tsconfig/node20": "^20.1.4",
}
}
However, Pants fails to understand this, resulting in a warning:
14:35:00.38 [WARN] pants could not locate tsconfig.json's 'extends' at @tsconfig/node20/tsconfig.json. Found: ['tsconfig.json'].
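Per the tsconfig documentation, a non-relative extends value is a Node package specifier, resolved through node_modules. A hedged sketch of the lookup (function name and details hypothetical):

```python
import os.path

def resolve_extends(extends: str, config_dir: str) -> str:
    """Hypothetical: relative values resolve against the config's own
    directory; bare specifiers resolve under node_modules, following
    Node-style module resolution."""
    if extends.startswith(("./", "../")):
        path = os.path.normpath(os.path.join(config_dir, extends))
    else:
        path = os.path.join("node_modules", extends)
    # tsconfig allows omitting the .json suffix in 'extends'.
    return path if path.endswith(".json") else path + ".json"

print(resolve_extends("@tsconfig/node20/tsconfig.json", "."))
# node_modules/@tsconfig/node20/tsconfig.json
```

The warning suggests Pants only searches its known tsconfig files, and never falls back to a node_modules lookup like this for bare specifiers.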
Reproducer:
cd $(mktemp -d)
cat > pants.toml <<EOF
[GLOBAL]
pants_version = "2.29.0.dev0"
backend_packages = [
"pants.backend.experimental.javascript",
"pants.backend.experimental.typescript",
]
EOF
cat > BUILD <<EOF
package_json(name="pkg")
typescript_sources(name="ts")
EOF
touch main.ts
cat > tsconfig.json <<EOF
{
"extends": "@tsconfig/node20/tsconfig.json"
}
EOF
cat > package.json <<EOF
{
"name": "example",
"version": "1.0.0",
"main": "index.js",
"scripts": {},
"author": "",
"description": "",
"dependencies": {
"@tsconfig/node20": "^20.1.6"
}
}
EOF
cat > package-lock.json <<EOF
{
"name": "example",
"version": "1.0.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "example",
"version": "1.0.0",
"dependencies": {
"@tsconfig/node20": "^20.1.6"
}
},
"node_modules/@tsconfig/node20": {
"version": "20.1.6",
"resolved": "https://registry.npmjs.org/@tsconfig/node20/-/node20-20.1.6.tgz",
"integrity": "sha512-sz+Hqx9zwZDpZIV871WSbUzSqNIsXzghZydypnfgzPKLltVJfkINfUeTct31n/tTSa9ZE1ZOfKdRre1uHHquYQ=="
}
}
}
EOF
# BUG: "[WARN] pants could not locate tsconfig.json's 'extends' at @tsconfig/node20/tsconfig.json. Found: ['tsconfig.json']."
pants dependencies main.ts
# Baseline: tsc itself is happy
# npm install
# tsc -p ./tsconfig.json
Pants version
2.29.0.dev0
OS
macOS
Additional info
N/A
pantsbuild/pantscool-easter-32542
07/11/2025, 6:41 PM
macos-13 is closing down
The macOS 13 hosted runner image is closing down, following our N-1 OS support policy. This process will begin September 1, 2025, and the image will be fully retired on November 14, 2025. We recommend updating workflows to use macos-14 or macos-15.
pantsbuild/pants
cool-easter-32542
07/11/2025, 11:30 PM
pants --keep-sandboxes=always test projects/test_foo.py:tests -- -l
16:25:31.92 [INFO] Preserving local process execution dir /Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk for Run Pytest for projects/test_foo.py:tests
16:25:35.94 [INFO] Completed: Scheduling: Run Pytest for projects/test_foo.py:tests
16:25:35.94 [ERROR] Completed: Run Pytest - projects/test_foo.py:tests - failed (exit code 1).
============================= test session starts ==============================
platform darwin -- Python 3.11.11, pytest-7.4.4, pluggy-1.6.0
rootdir: /Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk
configfile: pytest.ini
plugins: asyncio-0.23.8, docker-3.2.3, xdist-3.7.0, cov-6.2.1, mock-3.14.1, opentelemetry-1.1.0, hypothesis-6.127.9
asyncio: mode=Mode.AUTO
collected 1 item
projects/test_foo.py F [100%]
=================================== FAILURES ===================================
___________________________________ test_foo ___________________________________
def test_foo():
print(os.getcwd())
extra = f"{os.getcwd()}/extra"
print(extra)
assert os.getcwd() in extra
> assert False, "this was actually successful"
E AssertionError: this was actually successful
E assert False
extra = '/Users/jordan/git/pants.dextra'
projects/test_foo.py:8: AssertionError
----------------------------- Captured stdout call -----------------------------
/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk
/Users/jordan/git/pants.dextra
- generated xml file: /Users/jordan/git/pants.dprojects.test_foo.py.tests.xml -
=========================== short test summary info ============================
FAILED projects/test_foo.py::test_foo - AssertionError: this was actually successful
============================== 1 failed in 0.13s ===============================
✕ projects/test_foo.py:tests failed in 3.92s.
Note that:
• /tmp/pants-sandbox-4DThMk
is successfully printed in the terminal output: the first line in the capture output is /Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk
• /tmp/pants-sandbox-4DThMk/
is stripped from everywhere it should appear: the second line in the capture output is /Users/jordan/git/pants.dextra
, which is not a real path.
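The over-stripping is reproducible with a plain string replacement, which is presumably (an assumption about the sanitizer, not confirmed from its source) close to what happens:

```python
sandbox = "/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk"

raw = f"{sandbox}/extra"
# Stripping the substring "/tmp/pants-sandbox-4DThMk/" (with its trailing
# slash) from anywhere in the line glues the surrounding text together:
print(raw.replace("/tmp/pants-sandbox-4DThMk/", ""))
# /Users/jordan/git/pants.dextra
```

The slash-terminated pattern matches in the middle of the absolute path, producing exactly the bogus `/Users/jordan/git/pants.dextra` seen in the test output.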
When re-running the __run.sh
script in the sandbox, all the terminal output is correct:
/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk/__run.sh
================================================= test session starts ==================================================
platform darwin -- Python 3.11.11, pytest-7.4.4, pluggy-1.6.0
rootdir: /Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk
configfile: pytest.ini
plugins: asyncio-0.23.8, docker-3.2.3, xdist-3.7.0, cov-6.2.1, mock-3.14.1, opentelemetry-1.1.0, hypothesis-6.127.9
asyncio: mode=Mode.AUTO
collected 1 item
projects/test_foo.py F [100%]
======================================================= FAILURES =======================================================
_______________________________________________________ test_foo _______________________________________________________
def test_foo():
print(os.getcwd())
extra = f"{os.getcwd()}/extra"
print(extra)
assert os.getcwd() in extra
> assert False, "this was actually successful"
E AssertionError: this was actually successful
E assert False
extra = '/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk/extra'
projects/test_foo.py:8: AssertionError
------------------------------------------------- Captured stdout call -------------------------------------------------
/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk
/Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk/extra
--- generated xml file: /Users/jordan/git/pants.d/tmp/pants-sandbox-4DThMk/projects.test_foo.py.tests.xml ----
=============================================== short test summary info ================================================
FAILED projects/test_foo.py::test_foo - AssertionError: this was actually successful
================================================== 1 failed in 0.05s ===================================================
pantsbuild/pants
cool-easter-32542
07/13/2025, 8:30 PM
in_scope_types
on `@union`s.
Instead, we could derive the in-scope types from the signature of a polymorphic rule.
# Background
## Polymorphic Dispatch
Polymorphic rules allow dispatch based on runtime types of members of a union.
In the old call-by-type world, this meant that, given:
@union
class Base:
...
class Member1:
...
class Member2:
...
@rule
async def rule1(input: Member1) -> Result:
...
@rule
async def rule2(input: Member2) -> Result:
...
and the following union rule registrations:
UnionRule(Base, Member1),
UnionRule(Base, Member2),
Then await Get(Result, Base, input)
would dispatch to either rule1
or rule2
, depending on the type of input
. Note that Member1
and Member2
do not have to be Python subtypes of Base
, but it is very common that they are.
## In-scope Types
Unlike regular function dispatch, `@rule`s can consume `Param`s available at the callsite, even if not provided explicitly. For example, given
@rule
async def do_something(input: Input, other: Other) -> Result:
...
Then await Get(Result, Input, input)
will invoke do_something(...)
with the Other
argument provided from callsite context (e.g., if it was a parameter of the calling rule, or if some other rule can create it from parameters available to do_something(...)
.
This complicates matters for polymorphic dispatch, since we need the relevant polymorphic rules to have stable APIs, so that the caller knows which params to provide to `@rule`s it may not know about in advance.
This stability is achieved via the in_scope_types
argument to the @union
decorator:
@union(in_scope_types=[Other])
class Base:
...
This is a contract that polymorphic dispatch will provide the union member, and instances of each of the in_scope_types
, as params to the callees.
# Polymorphic Call-by-name
We are currently [migrating](#21065) the codebase from call-by-type to call-by-name. Polymorphic call-by-name is implemented via a base @rule
tagged as polymorphic:
@rule(polymorphic=True)
async def base_rule(input: Base) -> Result:
...
So that a call await base_rule(**implicitly({input: Base}))
dispatches according to the runtime type of input
.
# Proposed Idiom
Once we are entirely call-by-name, we might consider the following changes, to make Pants rule graph polymorphism follow a more natural idiom:
1. Use subtyping instead of UnionRule
registration. Today it is already very common for union members to also be Python subclasses of the union type. So we might as well require it, and consult the MRO instead of explicitly registering unions.
2. Use the `base_rule`'s signature as the stable extension API, instead of in_scope_types
. For example, given
@rule(polymorphic=True)
async def base_rule(input: Base, other: Other) -> Result:
...
we can infer that Base
and Other
are the params that will be made available to the polymorphic variants.
This has the added advantage of being a property of the base rule, and not of the union itself, so that different rules that are polymorphic on the same type can have different in-scope types.
This change will require the `base_rule`'s definition to be available where in_scope_types
are currently consumed, which may require some plumbing.
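Proposal 1 could be sketched as follows (names hypothetical): discover union members by consulting each candidate's MRO instead of an explicit registry.

```python
class Base: ...
class Member1(Base): ...
class Member2(Base): ...

def union_members(base: type, candidates: list[type]) -> list[type]:
    # Subtyping replaces UnionRule registration: any candidate whose MRO
    # includes `base` (other than `base` itself) counts as a member.
    return [t for t in candidates if base in t.__mro__ and t is not base]

assert union_members(Base, [Member1, Member2, int]) == [Member1, Member2]
```

In practice `candidates` would come from the rules registered for the polymorphic rule's return type, rather than being passed in explicitly.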
pantsbuild/pants
cool-easter-32542
07/14/2025, 1:19 AM
21:16:12.68 [DEBUG] Launching 1 roots (poll=true).
every second to the .pants.d/workdir/pants.log
file, which adds no new information. Ideally, pantsd would not write that message and fill up the log unnecessarily. This seems to happen after running a Pants command with -ldebug
.
This is with Pants 2.27.0.
pantsbuild/pants
cool-easter-32542
07/14/2025, 6:00 PM
buildx
to build a docker image, you may have multiple buildx drivers configured on your system where you want to choose one for pants to use.
Describe the solution you'd like
I'd like
• a global configuration for the docker subsystem (overridable in a docker_image configuration) that allows for selecting the buildx driver by name
• this is used to set the --builder
argument used in the docker buildx build
command that pants launches (see https://docs.docker.com/reference/cli/docker/buildx/#builder)
Describe alternatives you've considered
The only alternatives I am aware of include
1. Setting the buildx driver outside of pants configuration by setting the default with https://docs.docker.com/reference/cli/docker/buildx/use/
2. Setting BUILDX_BUILDER=<your driver of choice>
environment variable as a [docker.env_vars]
entry
Additional context
• This builds on the work started in #15199
• See more context in this thread: https://pantsbuild.slack.com/archives/C046T6T9U/p1751926012583459
pantsbuild/pants
cool-easter-32542
07/15/2025, 6:32 AM
run
command.
Describe the solution you'd like
extra_run_args
field on docker_image
target, akin to the extra_build_args
.
Describe alternatives you've considered
CLI mess
Additional context
N/A
pantsbuild/pants
cool-easter-32542
07/15/2025, 12:39 PM
cool-easter-32542
07/15/2025, 6:53 PM
engine
crate is both the primary crate in Pants and also the root of the Rust workspace. I'd like to reorganize the Rust sources to break that conjunction.
src/rust
would become the new workspace root. All crates one level below engine
would be moved to src/rust
. For example, src/rust/engine/process_execution
becomes src/rust/process_execution
. The crates underneath process_execution
(like src/rust/engine/process_execution/remote
) would still live under src/rust/process_execution
since they are arguably thematically related.
src/rust/engine/Cargo.toml
would be split into the workspace-specific part and the engine
-specific part. Currently, it serves dual purposes.
Benefits:
1. Easier to understand engine
since its top-level directory is not polluted with every other crate.
2. The workspace Cargo.toml
will be separate from the engine
Cargo.toml
. It will make it easier for readers to reason about the workspace and engine
.
pantsbuild/pants