fast-school-44220
08/19/2025, 6:36 PM
acoustic-librarian-29560
08/20/2025, 8:35 PM
-i <https://pip.astronomer.io/v2/>
that I want to pull from there, but pex does not seem to respect this in its lockfile generation, or at least not the way pants passes the requirements to pex.
freezing-activity-67835
08/20/2025, 8:44 PM
pex venv
).
How do I best organise this with Pants?
Via adhoc_tool
, I'm able to compile the binaries, or install the venv via conda. These can then be used as a dependency for my python_source
targets. This makes sense for pants test
usage. However, when deploying to a docker image, it seems to make more sense to me to manually move the built artifacts separately from the PEX over to the image, in separate instructions (in which case we don't want to bundle the artifacts inside the PEX).
It seems quite annoying/unintuitive to me that pants is able to tie those things together nicely, but when I'm building a docker image, I need to use an entirely different approach to bundle these artifacts.
Is there a better way, in which we can sensibly have Pants produce a single target which is able to run with pants test
and produces an artifact I can directly deploy to a docker image?
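One way to keep a single set of targets for both pants test and the image build (a sketch with hypothetical target names, not a drop-in config) is to let the docker_image depend on both the PEX and the adhoc-built artifact, so each can be COPY'd in its own Dockerfile instruction:

```python
# BUILD — hypothetical names; the adhoc_tool wiring mirrors whatever you use today.
pex_binary(
    name="app",
    entry_point="main.py",
)

adhoc_tool(
    name="native_artifacts",
    runnable=":build_script",      # your existing compile/conda step
    output_files=["libfoo.so"],    # placeholder output
)

docker_image(
    name="image",
    dependencies=[":app", ":native_artifacts"],
)
```

Whether the adhoc_tool outputs land in the Docker build context as-is can depend on how they are exposed (e.g. re-wrapping them as file targets), so treat this as a starting point rather than a known-good recipe.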
Thanks!
brave-hair-402
08/21/2025, 12:55 PM
./pants test
install packages that don’t match the lockfile?
In my case, the test PEX ends up with two attrs
versions (24.2.0
and 25.3.0
). Python imports the older one first, which breaks cattrs 25.x
with ImportError: cannot import name 'NothingType' from 'attrs'
. When I run ./pants export ::
, the exported venv correctly has only attrs==25.3.0
. I’ve cleaned caches, but the duplicate older attrs
returns during pants test
.
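One generic (non-Pants-specific) way to confirm the duplicate from inside the failing test is to list every attrs distribution visible on sys.path via importlib.metadata — if the test sandbox really contains both wheels, this prints two versions:

```python
from importlib.metadata import distributions

def installed_versions(project: str) -> list[str]:
    """Every version of `project` visible on sys.path, duplicates included."""
    wanted = project.lower().replace("-", "_")
    return [
        dist.version
        for dist in distributions()
        if (dist.metadata["Name"] or "").lower().replace("-", "_") == wanted
    ]

# Drop into the failing test, e.g.:
print(installed_versions("attrs"))
```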
Is this a known issue or a misconfig on my end?
fast-school-44220
08/22/2025, 6:26 PM
cat
on the command line shows me the edit did save) but pants
keeps building using a cached version from somewhere. How do I force it to use the actual file on disk?
acoustic-library-86413
08/27/2025, 11:38 AM
complete_platforms
file for my environment, there are lots of references to Python versions that are not in use by my project. Is there any downside to having these specified in the file, even though they are unused? I.e. cp39-abi3-manylinux_2_36_aarch64
is specified, even though the project uses exclusively Python 3.12.
freezing-appointment-41336
08/27/2025, 12:18 PM
wide-processor-41045
08/27/2025, 4:29 PM
pants check
keeps failing for me with this exception:
12:46:11.01 [ERROR] 1 Exception encountered:
Engine traceback:
in `check` goal
ProcessExecutionFailure: Process 'Building requirements_venv.pex' failed with exit code 1.
stdout:
stderr:
[Errno 13] Permission denied: '/Users/florenciasilvestre/.cache/pants/named_caches/pex_root/installed_wheels/0/02162f9b1151c25ed4f3634769ac80fd4af3dd9bf81074949af47fa6d1af6b27/numpy_typing_compat-1.25.20250730-py3-none-any.whl.lck.work/numpy_typing_compat-1.25.20250730.dist-info/licenses/LICENSE'
I’ve already attempted to clean up the Pants cache, run docker prune, and restart Pants. I’ve even reinstalled Python and my PC.
Usually, when I encounter issues with Pants, one of these steps resolves the problem, but unfortunately, this time, none of them have worked.
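Since the failure is [Errno 13] Permission denied inside ~/.cache/pants/named_caches/pex_root, one more thing worth checking is whether some cache entries have lost write permission for your user (e.g. after a process ran with different ownership), and then deleting or chmod-ing just that subtree. A small diagnostic sketch (the path is taken from the error message above; adjust as needed):

```python
import os

def unwritable_paths(root: str) -> list[str]:
    """Walk `root` and report entries the current user cannot write to."""
    bad = []
    for dirpath, dirnames, filenames in os.walk(root):
        for entry in dirnames + filenames:
            path = os.path.join(dirpath, entry)
            if not os.access(path, os.W_OK):
                bad.append(path)
    return bad

# e.g. unwritable_paths(os.path.expanduser("~/.cache/pants/named_caches/pex_root"))
```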
Can anyone suggest any other potential solutions or troubleshooting steps I could take?
mammoth-dawn-85816
08/27/2025, 9:13 PM
vite
at the moment, because uname
can't be found in the pants sandbox. Details in 🧵 , can anyone offer advice?
acceptable-balloon-2043
08/28/2025, 11:31 AM
jupyter_notebook
target that would allow them to be treated similarly as python_source
for purposes of dependency inference or even linters/checkers/formatters. There already exists a tool (nbconvert) that can output a Python-equivalent file for a notebook, so it sounds like a viable option to run that in the pants backend and redirect everything to the resulting Python file, but it's hard for me to estimate how viable this is as a plugin and what issues I might hit.
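As a data point on feasibility: .ipynb files are plain JSON, so the core extraction such a plugin would do is small (nbconvert additionally handles magics, outputs, etc.). A minimal stdlib sketch of turning a notebook's code cells into a Python module:

```python
import json

def notebook_code(nb_text: str) -> str:
    """Concatenate the code cells of a .ipynb (plain JSON) into one Python module."""
    nb = json.loads(nb_text)
    cells = [
        "".join(cell["source"])
        for cell in nb.get("cells", [])
        if cell.get("cell_type") == "code"
    ]
    return "\n\n".join(cells)

# A tiny synthetic notebook:
nb = json.dumps({"cells": [
    {"cell_type": "markdown", "source": ["# notes"]},
    {"cell_type": "code", "source": ["import os\n", "print(os.sep)"]},
]})
print(notebook_code(nb))
```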
Anyone tried something like this, or any advice from pants maintainers on the feasibility of such an approach (redirecting linters/dependency inference to an ephemerally generated file as part of the run)?
mammoth-dawn-85816
08/28/2025, 1:39 PM
pants package src/lambdas/whatever
I get a little bit of metadata:
14:37:05.54 [INFO] Wrote dist/src.lambdas.whatever/whatever.zip
Runtime: python3.12
Architecture: x86_64
Handler: lambda_function.handler
Is there any way to have that metadata output to a file instead? (My bash is weaker than I thought it was, I don’t seem to be able to pipe it to a file from stdout or stderr either!)
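Those lines are log output, so if you can capture them to a file (e.g. pants --no-dynamic-ui package ... 2> pkg.log — the flag name comes from the Pants global options and is worth double-checking for your version), a small script can turn them into JSON that Terraform can read with jsondecode(file(...)). A sketch:

```python
import json
import re

def parse_package_log(text: str) -> dict:
    """Pull the Runtime/Architecture/Handler lines out of captured pants output."""
    fields = {}
    for key in ("Runtime", "Architecture", "Handler"):
        match = re.search(rf"^\s*{key}:\s*(\S+)", text, re.MULTILINE)
        if match:
            fields[key.lower()] = match.group(1)
    return fields

log = """14:37:05.54 [INFO] Wrote dist/src.lambdas.whatever/whatever.zip
Runtime: python3.12
Architecture: x86_64
Handler: lambda_function.handler"""
print(json.dumps(parse_package_log(log), indent=2))
```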
I’d like to be able to configure my terraform to set up my lambda with the runtime, arch, and handler as configured in pants, so I don’t have duplicate info — if I can get this into a JSON/similar file, I can configure Terraform to read it and use those values 💪
dry-market-87920
08/28/2025, 3:42 PM
wide-country-2470
08/28/2025, 8:49 PM
acoustic-library-86413
08/29/2025, 6:59 AM
__defaults__
propagate to all subdirectories? I have my tests organized in a manner like this:
tests/
    module_1/
    module_2/
    ...
    module_n/
    BUILD
All of these modules share the same extra_env_vars
and I wanted to include a BUILD
at the tests/
level, but the environment variables are not injected into the tests if I do this. The contents of the BUILD file is:
__defaults__(
    all=dict(
        extra_env_vars=[
            "SOME_VAR=some_value",
            ...
        ]
    )
)
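For reference, __defaults__ is documented to apply to the declaring BUILD file's directory and everything below it, so propagation itself should work — but defaults only affect targets that are actually declared in the subdirectories' own BUILD files. One variation worth trying (a sketch) is scoping the default to the test target type instead of all:

```python
# tests/BUILD
__defaults__({
    python_tests: dict(
        extra_env_vars=[
            "SOME_VAR=some_value",
        ],
    ),
})
```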
adamant-hospital-78231
09/01/2025, 4:37 PM
freezing-activity-67835
09/01/2025, 6:11 PM
purple-hydrogen-4740
09/02/2025, 12:07 PM
boundless-monitor-67068
09/02/2025, 1:37 PM
.proto
files or is my only option to define these explicitly? Thank you 🙏
hundreds-carpet-28072
09/02/2025, 1:39 PM
interpreter_constraints
match the ones used to generate the lockfile, and I don’t have any other config overriding this.
InvalidLockfileError: You are consuming `black~=24.3.0`, `coverage[toml]==7.2.7`, and 11 other requirements from the `tools` lockfile at requirements-tools.lock with incompatible inputs.
- The inputs use interpreter constraints (`CPython>=3.8`) that are not a subset of those used to generate the lockfile (`CPython<3.11,>=3.9`).
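The message says the *inputs* resolved to CPython>=3.8, so some requirement or tool is still contributing that wider constraint even if the targets set narrower ones. If tools is a named resolve, pinning its interpreter constraints in pants.toml is one way to force agreement with the lockfile (option name from the Pants docs; the values here are an example, not your real constraints):

```toml
[python]
interpreter_constraints = [">=3.9,<3.11"]

[python.resolves_to_interpreter_constraints]
tools = [">=3.9,<3.11"]
```

After changing this, regenerating the lockfile (e.g. pants generate-lockfiles --resolve=tools) keeps the two sides consistent.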
fast-school-44220
09/02/2025, 8:54 PM
purple-hydrogen-4740
09/03/2025, 9:13 AM
clean-alligator-41449
09/05/2025, 10:50 AM
pants --no-pantsd generate-lockfiles
Expected sha256 hash of 7b70f5e6a41e52e48cfc087436c8a28c17ff98db369447bcaff3b887a3ab4467 when downloading triton but hashed to e2b0afe420d202d96f50b847d744a487b780567975455e56f64b061152ee9554.
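A mismatch like this usually means the index (or a cache/mirror in between) served different bytes than the ones the lock was generated against. One way to narrow it down is to download the triton artifact manually from your index and hash it yourself, then compare against both values in the error. A sketch:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file through sha256 so multi-GB wheels don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# e.g. sha256_of("triton-<version>-<tag>.whl") after downloading the wheel manually
```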
happy-kitchen-89482
09/07/2025, 8:04 PM
clean-alligator-41449
09/09/2025, 12:02 PM
pants check ::
on remote (though not locally).
11:56:23.62 [ERROR] 1 Exception encountered:
Engine traceback:
in `check` goal
TypeError: Session.set_credentials() missing 1 required positional argument: 'secret_key'
Is this a pyright or pants issue?
average-breakfast-91545
09/09/2025, 2:20 PM
prehistoric-motorcycle-70707
09/09/2025, 3:30 PM
hundreds-carpet-28072
09/11/2025, 1:23 PM
<dir-path>:<target-name>
is output to dist/<dir-path>/<target-name>.pex
, but is there a way I can use this logic directly?
enough-painting-56758
09/12/2025, 8:01 AM
application/
    src/
        service_a/
            BUILD
        service_b/
            BUILD
    pants.toml
    pyproject.toml
Each BUILD
file currently looks like this:
# Include all Python source files in this directory
python_sources()

# Define a Docker image target
docker_image(
    image_tags=["latest"],
    name="docker_image",
    repository="service_a",
)

# Define a PEX (Python Executable) binary target
pex_binary(
    name="app",
    entry_point="app.py",
)
Right now I’m pushing Docker images with the latest
tag using this command:
pants --changed-since=origin/main --changed-dependents=transitive list publish
I’d like to generate independent semantic versions for each service.
To do that, I added a vcs_version
target in each service’s BUILD
file:
# Generate service-specific version
vcs_version(
    generate_to="src/service_a/version.py",
    name="version",
    template="__version__ = '{version}'",
)
This creates a version.py
file per service.
My questions are:
• How do I tag the Docker image with this generated version?
• Will this setup automatically produce an independent version for each service based on changes in that service only?
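On the first question: docker_image tags support interpolating Docker build args, so one hedged pattern (the names here are assumptions) is to pass the version in as a build arg rather than reading it out of the generated version.py:

```python
docker_image(
    name="docker_image",
    repository="service_a",
    image_tags=["latest", "{build_args.VERSION}"],
    extra_build_args=["VERSION"],
)
```

Invoked e.g. as pants --docker-build-args="VERSION=$(git describe --tags)" publish src/service_a:docker_image — worth double-checking the interpolation details against the Pants Docker docs for your version. On the second question, note that vcs_version derives its value from git describe over the whole repository's tags, so out of the box it is not scoped to changes in a single service.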
Thanks in advance for any help!
hundreds-carpet-28072
09/12/2025, 10:01 AM
--skip-existing
flag isn’t supported for PyPI repos in Google Artifact Registry; does Pants’ usage of Twine in pants publish
add any functionality to this end? Or is it just invoking the tool directly? https://www.pantsbuild.org/dev/docs/python/goals/publish
abundant-tent-27407
09/12/2025, 11:33 AM
poetry_requirements
I need to create two pyproject files. I'd like to create just one, and have each of the poetry_requirements
use a dependency group inside the same pyproject file. Is this crazy talk, or perhaps an idea?