# general
  • fast-school-44220
    08/19/2025, 6:36 PM
    Hi everyone. I'm evaluating Pants for suitability with our custom toolchains. Looking at the samples, I don't really see anything that follows the "source -> codegen -> compiler -> artifact" build idiom. Does anyone know of some samples I can look at to get a feel for how to implement our flow?
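    A minimal sketch of that idiom using the experimental adhoc backend (pants.backend.experimental.adhoc); the script, compiler wrapper, and output names here are all placeholders, not a published sample:
    # BUILD -- hedged sketch: source -> codegen -> compile -> artifact
    python_source(name="codegen", source="codegen.py")  # placeholder codegen script

    adhoc_tool(
        name="generated",
        runnable=":codegen",
        args=["--out", "gen"],
        output_directories=["gen"],  # captured as the codegen output
    )

    adhoc_tool(
        name="artifact",
        runnable=":compiler_wrapper",           # placeholder wrapper around your compiler
        execution_dependencies=[":generated"],  # consumes the generated sources
        output_files=["out/artifact.bin"],      # the final artifact
    )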
  • acoustic-librarian-29560
    08/20/2025, 8:35 PM
    This is partially a pex question, but is there a way to force Pants/pex to resolve a dependency from a specific index? My use case: I have Airflow code that is deployed on Astronomer, and their base docker image forces you to install any dependencies that exist in their own Python index from that index. The problem is, sometimes their index is a few versions behind. I'm wondering if there's a way to only search that index when generating the lockfile for certain dependencies. What I currently have is one requirements file that just pulls from PyPI, and one with the header
    -i <https://pip.astronomer.io/v2/>
    for the dependencies I want to pull from there, but pex does not seem to respect this in its lockfile generation, or at least not the way Pants passes the requirements to pex.
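    For what it's worth, the knob I know of is [python-repos] in pants.toml, which applies to the whole resolve rather than to individual dependencies -- a hedged sketch using the index URL from the message above:
    # pants.toml -- hedged sketch: search the Astronomer index alongside PyPI.
    # Note this is resolve-wide; it does not pin specific dependencies to one index.
    [python-repos]
    indexes = ["https://pypi.org/simple/", "https://pip.astronomer.io/v2/"]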
  • freezing-activity-67835
    08/20/2025, 8:44 PM
    Hi! I'm trying to have my Python code communicate (via subprocesses) with • binaries (possibly built from source) and • other Python virtual envs (e.g. installed via conda). I'd like this to happen during development, (CI) testing, and deployment via PEX in docker images (using pex venv). How do I best organise this with Pants? Via adhoc_tool, I'm able to compile the binaries, or install the venv via conda. These can then be used as a dependency for my python_source targets. This makes sense for pants test usage. However, when deploying to a docker image, it seems to make more sense to me to move the built artifacts over to the image separately from the PEX, in separate instructions (in which case we don't want to bundle the artifacts inside the PEX). It seems quite annoying/unintuitive that Pants is able to tie those things together nicely, but when I'm building a docker image, I need to use an entirely different approach to bundle these artifacts. Is there a better way, in which we can sensibly have Pants produce a single target which is able to run with pants test and also produces an artifact I can deploy directly to a docker image? Thanks!
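    One hedged pattern for the Docker half, assuming the experimental adhoc backend's wrapper targets and placeholder target names -- expose the adhoc_tool outputs as files so the docker_image build context sees them next to the PEX:
    # BUILD -- hedged sketch; ":compiled_binaries" is a placeholder for your adhoc_tool target
    experimental_wrap_as_files(name="built_artifacts", targets=[":compiled_binaries"])

    pex_binary(name="app", entry_point="main.py")  # placeholder entry point

    docker_image(
        name="image",
        dependencies=[":app", ":built_artifacts"],
        # The Dockerfile can then COPY the PEX and the artifacts in separate layers.
    )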
  • brave-hair-402
    08/21/2025, 12:55 PM
    Hi all — has anyone seen ./pants test install packages that don’t match the lockfile? In my case, the test PEX ends up with two attrs versions (24.2.0 and 25.3.0). Python imports the older one first, which breaks cattrs 25.x with ImportError: cannot import name 'NothingType' from 'attrs'. When I run ./pants export ::, the exported venv correctly has only attrs==25.3.0. I’ve cleaned caches, but the duplicate older attrs returns during pants test. Is this a known issue or a misconfig on my end?
  • fast-school-44220
    08/22/2025, 6:26 PM
    This is probably a super basic question, but it's driving me nuts... I make edits to my goals.py file and save it (cat on the command line shows me the edit did save), but pants keeps building using a cached version from somewhere. How do I force it to use the actual file on disk?
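    A hedged checklist for this, using standard Pants flags and cache paths (not a guaranteed fix): rule out the pantsd daemon first, then local caches.
    # bypass the daemon for a single run
    pants --no-pantsd <goal> ...
    # clear local caches (Pants rebuilds them)
    rm -rf ~/.cache/pants/lmdb_store ~/.cache/pants/named_caches
    # also check pants.toml for a pants_ignore pattern that could hide the file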
  • acoustic-library-86413
    08/27/2025, 11:38 AM
    When I generate the complete_platforms file for my environment, there are lots of references to Python versions that are not in use by my project. Is there any downside to having these specified in the file, even though they are unused? E.g. cp39-abi3-manylinux_2_36_aarch64 is specified, even though the project uses exclusively Python 3.12.
  • freezing-appointment-41336
    08/27/2025, 12:18 PM
    Hi all 👋 , I have a question which I've not been able to find an answer to in the docs or the GitHub issue threads. Is there a supported approach for running pytest against actual Pants BUILD files? For context, I'm working in a monorepo with multiple subprojects (each with their own dedicated resolve), and I want to have a small collection of unit tests which enforce certain invariants and conventions for specific target parameters. I'm currently able to do this on a per-subproject basis, but that would require duplicated test definitions per subproject.
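    One hedged approach (not an official API): run pants peek from a plain pytest suite outside of Pants itself and assert invariants over the JSON it emits; the asserted field here is an example only.
    import json
    import subprocess


    def test_every_python_test_declares_a_resolve():
        # `pants peek ::` prints a JSON array of target metadata.
        result = subprocess.run(
            ["pants", "peek", "::"], capture_output=True, text=True, check=True
        )
        for tgt in json.loads(result.stdout):
            if tgt.get("target_type") == "python_test":
                assert tgt.get("resolve"), f"{tgt['address']} is missing a resolve"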
  • wide-processor-41045
    08/27/2025, 4:29 PM
    Hi all! Running pants check keeps failing for me with this exception:
    12:46:11.01 [ERROR] 1 Exception encountered:
    
    Engine traceback:
      in `check` goal
    
    ProcessExecutionFailure: Process 'Building requirements_venv.pex' failed with exit code 1.
    stdout:
    
    stderr:
    [Errno 13] Permission denied: '/Users/florenciasilvestre/.cache/pants/named_caches/pex_root/installed_wheels/0/02162f9b1151c25ed4f3634769ac80fd4af3dd9bf81074949af47fa6d1af6b27/numpy_typing_compat-1.25.20250730-py3-none-any.whl.lck.work/numpy_typing_compat-1.25.20250730.dist-info/licenses/LICENSE'
    I’ve already attempted to clean up the Pants cache, run docker prune, and restart Pants. I’ve even reinstalled Python and my PC. Usually, when I encounter issues with Pants, one of these steps resolves the problem, but unfortunately, this time, none of them have worked. Can anyone suggest any other potential solutions or troubleshooting steps I could take?
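    A hedged suggestion: the EACCES is inside the shared pex_root named cache, which usually means ownership drift (e.g. a run performed as root). Re-owning or deleting that cache is safe; Pants rebuilds it.
    sudo chown -R "$(whoami)" ~/.cache/pants/named_caches
    # or, more drastically:
    rm -rf ~/.cache/pants/named_caches/pex_root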
  • mammoth-dawn-85816
    08/27/2025, 9:13 PM
    Hey folks! I discovered Pants today, and I'm really enjoying it! I'm trying to stretch it to fit all of my python/pnpm-vite-typescript/terraform monorepo, but I'm struggling with vite at the moment, because uname can't be found in the Pants sandbox. Details in 🧵 , can anyone offer advice?
  • acceptable-balloon-2043
    08/28/2025, 11:31 AM
    I have a jupyter notebook question 🙂 Searching through the Slack, a lot of people have used or written plugins to be able to run them with the right venv/files, but I don't find a lot of people trying to make them first-class citizens. In my company we have a lot of notebooks that are used in scripts or tests, and running them through Pants hits a lot of problems from the lack of automatic dependency inference (you don't know which packages or code the notebook relies on, so the sandbox is lacking). I'm considering writing a plugin that would expose a jupyter_notebook target, allowing notebooks to be treated similarly to python_source for purposes of dependency inference and even linters/checkers/formatters. There already exists a tool (nbconvert) that can output a Python file equivalent to a notebook, so it sounds like a viable option to run that in the Pants backend and redirect everything to the resulting Python file, but it's hard for me to estimate how viable this is as a plugin and what issues I might hit. Has anyone tried something like this, or do any Pants maintainers have advice on the feasibility of such an approach (redirecting linters/dependency inference to an ephemerally generated file as part of the run)?
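    On feasibility: the target definition itself is the small part. A hedged sketch of where a plugin's register.py could start (standard plugin API, invented names), with the inference/linting redirection rules still to be written:
    from pants.engine.target import (
        COMMON_TARGET_FIELDS,
        Dependencies,
        SingleSourceField,
        Target,
    )


    class JupyterNotebookSourceField(SingleSourceField):
        expected_file_extensions = (".ipynb",)


    class JupyterNotebookTarget(Target):
        alias = "jupyter_notebook"
        core_fields = (*COMMON_TARGET_FIELDS, Dependencies, JupyterNotebookSourceField)
        help = "A Jupyter notebook, converted to Python for inference and linting."


    def target_types():
        return [JupyterNotebookTarget]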
  • mammoth-dawn-85816
    08/28/2025, 1:39 PM
    Hey folks; when I run pants package src/lambdas/whatever I get a little bit of metadata:
    14:37:05.54 [INFO] Wrote dist/src.lambdas.whatever/whatever.zip
        Runtime: python3.12
        Architecture: x86_64
        Handler: lambda_function.handler
    Is there any way to have that metadata output to a file instead? (My bash is weaker than I thought it was; I don’t seem to be able to pipe it to a file from stdout or stderr either!) I’d like to be able to configure my Terraform to set up my lambda with the runtime, arch, and handler as configured in Pants, so I don’t have duplicate info — if I can get this into a JSON/similar file, I can configure Terraform to read it and use those values 💪
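    A hedged workaround rather than a supported output: read the same fields from the target definition with pants peek and emit JSON for Terraform; the target address and field names here are assumptions about the setup (requires jq).
    pants peek src/lambdas/whatever: \
      | jq '.[0] | {runtime, handler, architecture}' \
      > dist/whatever.metadata.json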
  • dry-market-87920
    08/28/2025, 3:42 PM
    Hi all, just did a quick search for Rust in the Slack, and it seems to confirm there's still no meaningful support. Evaluating Pants for our team, and this would be a deal-killer, so I'm trying to make sure I have an accurate understanding.
  • wide-country-2470
    08/28/2025, 8:49 PM
    Hi, I have a monorepo with three sub-projects and a library that all three sub-projects use. The library parameterizes its resolves to include the resolve for each sub-project. The requirements for the three sub-projects are similar but not identical. For some reason, when running pants test for the library, the tests pass for two of the three resolves, and the third throws an error saying that it can’t find the pydantic_extra_types library, even though pydantic-extra-types is in the third-party lockfiles for all three resolves. What am I missing?
  • acoustic-library-86413
    08/29/2025, 6:59 AM
    Is there a way to make __defaults__ propagate to all subdirectories? I have my tests organized in a manner like this:
    tests/
       module_1/
       module_2/
       ...
       module_n/
       BUILD
    All of these modules share the same extra_env_vars, and I wanted to include a BUILD at the tests/ level, but the environment variables are not injected into the tests if I do this. The contents of the BUILD file are:
    __defaults__(
        all=dict(
            extra_env_vars=[
              "SOME_VAR=some_value",
              ...
            ]
        )
    )
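    For reference, __defaults__ does apply recursively to subdirectories. A hedged variant worth trying: scope the defaults to the target types that actually have the field (python_test/python_tests here are assumptions about the targets in use), since all= only applies where the field exists.
    # tests/BUILD -- hedged sketch
    __defaults__(
        {
            (python_test, python_tests): dict(
                extra_env_vars=["SOME_VAR=some_value"],
            ),
        },
        extend=True,  # merge with defaults inherited from parent BUILD files
    )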
  • adamant-hospital-78231
    09/01/2025, 4:37 PM
    Hey, I wanted to ask if someone can point me to an example project using the (experimental) cc backend to a nontrivial extent. I'm completely new to Pants, and so far I haven't been able to suss out which build goal would produce e.g. an executable from cc_sources(). I'm trying to evaluate Pants for something quite a bit more complex than that :)
  • freezing-activity-67835
    09/01/2025, 6:11 PM
    Hi! Is there any way to define a docker_environment from a docker_image target? This would be very useful to run python tests in our production runtime. Thanks!
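    For reference, docker_environment is defined from an image reference rather than a docker_image target, so a hedged workaround is to point it at wherever that target's image gets pushed (the image name and platform here are assumptions), then register it under [environments-preview.names] in pants.toml:
    # BUILD -- hedged sketch
    docker_environment(
        name="prod_runtime",
        image="mycorp/prod-runtime:latest",  # assumption: your published image
        platform="linux_x86_64",
    )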
  • purple-hydrogen-4740
    09/02/2025, 12:07 PM
    Hi folks! Does Pants reuse dependencies, or re-download them, when building multiple pytest_runner.pex files to run tests for multiple packages?
  • boundless-monitor-67068
    09/02/2025, 1:37 PM
    hello friends 👋 A question on the protobuf backend: can Pants infer dependencies between .proto files, or is my only option to define these explicitly? Thank you 🙏
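    For reference, the protobuf backend does infer dependencies from import statements between .proto files, provided each directory containing them has a protobuf_sources() target -- a minimal sketch with assumed paths:
    # models/BUILD and api/BUILD each contain:
    protobuf_sources()
    # then `import "models/user.proto";` inside api/service.proto is enough
    # for Pants to infer the api -> models dependency; no explicit dependencies= needed.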
  • hundreds-carpet-28072
    09/02/2025, 1:39 PM
    Do tool libs internally use a different version of Python for resolves? My interpreter_constraints match the ones used to generate the lockfile, and I don’t have any other config overwriting this.
    InvalidLockfileError: You are consuming `black~=24.3.0`, `coverage[toml]==7.2.7`, and 11 other requirements from the `tools` lockfile at requirements-tools.lock with incompatible inputs.
    - The inputs use interpreter constraints (`CPython>=3.8`) that are not a subset of those used to generate the lockfile (`CPython<3.11,>=3.9`).
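    The error says the consuming side used the default CPython>=3.8 constraints rather than the lockfile's. A hedged sketch of pinning the resolve's constraints in pants.toml (resolve name taken from the error message):
    # pants.toml -- hedged sketch
    [python.resolves_to_interpreter_constraints]
    tools = ["CPython>=3.9,<3.11"]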
  • fast-school-44220
    09/02/2025, 8:54 PM
    Can anyone point me at the code that makes pants go "oh, hey, I can use protobuf to generate that module. let's run it!"?
  • purple-hydrogen-4740
    09/03/2025, 9:13 AM
    Does Pants allow setting the pip request timeout? We are dealing with a slow network, and building multiple pytest runners while testing our monorepo always fails because of timeouts.
  • clean-alligator-41449
    09/05/2025, 10:50 AM
    I'm getting a sha256 mismatch when upgrading my requirements. How do I start resolving this? I have run pants --no-pantsd generate-lockfiles:
    Expected sha256 hash of 7b70f5e6a41e52e48cfc087436c8a28c17ff98db369447bcaff3b887a3ab4467 when downloading triton but hashed to e2b0afe420d202d96f50b847d744a487b780567975455e56f64b061152ee9554.
  • happy-kitchen-89482
    09/07/2025, 8:04 PM
    https://pantsbuild.slack.com/archives/C0D7TNJHL/p1757275465210039
  • clean-alligator-41449
    09/09/2025, 12:02 PM
    I get this when I run pants check :: on remote (though not locally):
    11:56:23.62 [ERROR] 1 Exception encountered:
    Engine traceback:
      in `check` goal
    TypeError: Session.set_credentials() missing 1 required positional argument: 'secret_key'
    Is this a pyright or pants issue?
  • average-breakfast-91545
    09/09/2025, 2:20 PM
    I'm sorry to ask, but what's the current state of the art for managing torch across architectures? Is there a decent example somewhere? I have some engineers running Linux with/without gpu, and some engineers running recent macs.
  • prehistoric-motorcycle-70707
    09/09/2025, 3:30 PM
    Is there any subsystem for formatting HTML files? I see prettier, but I'm having trouble getting it working on HTML files.
  • hundreds-carpet-28072
    09/11/2025, 1:23 PM
    What’s the cleanest way to go from a target address to the output package file path? Essentially, so that I can utilise Pants’ internal check for “does the .pex already exist in the place where I would expect it to be output?” It seems the mapping is simply: <dir-path>:<target-name> is output to dist/<dir-path>/<target-name>.pex, but is there a way I can use this logic directly?
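    A hedged sketch of the default mapping (it matches the dist/src.lambdas.whatever/whatever.zip output shown earlier in this channel: directory separators become dots). The target's output_path field can override it, so treat this as a convention, not an API:
    def default_pex_output_path(address: str) -> str:
        # e.g. "src/foo/bar:app" -> "dist/src.foo.bar/app.pex"
        # assumes an explicit ":" target name in the address
        spec_path, _, target_name = address.partition(":")
        return f"dist/{spec_path.replace('/', '.')}/{target_name}.pex"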
  • enough-painting-56758
    09/12/2025, 8:01 AM
    Hi everyone, I’m new to Pants and need some guidance on setting up independent semantic versioning for services inside a monorepo. My current project structure looks like this:
    application/
      src/
        service_a/
          BUILD
        service_b/
          BUILD
      pants.toml
      pyproject.toml
    Each BUILD file currently looks like this:
    # Include all Python source files in this directory
    python_sources()
    
    # Define a Docker image target
    docker_image(
        image_tags=["latest"],
        name="docker_image",
        repository="service_a",
    )
    
    # Define a PEX (Python Executable) binary target
    pex_binary(
        name="app",
        entry_point="app.py",
    )
    Right now I’m pushing Docker images with the latest tag using this command:
    pants --changed-since=origin/main --changed-dependents=transitive list publish
    I’d like to generate independent semantic versions for each service. To do that, I added a vcs_version target in each service’s BUILD file:
    # Generate service-specific version
    vcs_version(
        generate_to="src/service_a/version.py",
        name="version",
        template="__version__ = '{version}'",
    )
    This creates a version.py file per service. My questions are:
    • How do I tag the Docker image with this generated version?
    • Will this setup automatically produce an independent version for each service, based only on changes in that service?
    Thanks in advance for any help!
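    For the first question, one hedged approach: the docker backend lets image_tags interpolate Docker build args, so a version computed beforehand (e.g. in CI, or read from the generated version.py) can be passed through at publish time; VERSION here is an assumed build-arg name:
    # BUILD -- hedged sketch
    docker_image(
        name="docker_image",
        repository="service_a",
        image_tags=["latest", "{build_args.VERSION}"],
    )
    Invoked roughly as pants --docker-build-args="VERSION=1.2.3" publish src/service_a: — though wiring the vcs_version output into that flag is a step you would script yourself.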
  • hundreds-carpet-28072
    09/12/2025, 10:01 AM
    Twine’s --skip-existing flag isn’t supported for PyPI repos in Google Artifact Registry. Does Pants’ usage of Twine in pants publish add any functionality to this end, or is it just invoking the tool directly? https://www.pantsbuild.org/dev/docs/python/goals/publish
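    For what it's worth, Pants invokes twine more or less directly but exposes passthrough args via the [twine] subsystem, so flags can be forwarded (a hedged sketch; it won't change how Artifact Registry itself treats the flag):
    # pants.toml -- hedged sketch
    [twine]
    args = ["--skip-existing"]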
  • abundant-tent-27407
    09/12/2025, 11:33 AM
    This might be a feature request. Say I want to split my dependencies over two resolves. To do that with, say, poetry_requirements, I need to create two pyproject files. I'd like to create just one, and have each of the poetry_requirements use a dependency group inside the same pyproject file. Is this crazy talk, or perhaps an idea?