pantsbuild.org Slack archive
# general
  • late-breakfast-42820 (10/13/2025, 9:51 PM)
    I killed the pantsd process manually, as I couldn't stop it any other way; it was running a server in Claude that I shut down without exiting safely
  • late-breakfast-42820 (10/13/2025, 9:59 PM)
    false alarm! was some docker files from a db lol
    😅 1
  • happy-family-315 (10/14/2025, 7:12 AM)
    Hi, I got weird import behaviour from my pex: I created a multi-stage docker image (deps + srcs) for my pants-built pex file. Everything works except for an import bug: my code imports geopandas, which imports Python's stdlib `typing` module, but instead it picks up the `typing` module from OpenCV. After that I get an exception because I don't have all the dependencies installed that OpenCV requires, which is how I found the error in the first place. I'll give you the error log and the Dockerfile. Hope someone can explain.
    File "/usr/lib/python3.12/multiprocessing/process.py", line 314, in _bootstrap
      self.run()
    File "/usr/lib/python3.12/multiprocessing/process.py", line 108, in run
      self._target(*self._args, **self._kwargs)
    File "/usr/lib/python3.12/concurrent/futures/process.py", line 251, in _process_worker
      call_item = call_queue.get(block=True)
    File "/usr/lib/python3.12/multiprocessing/queues.py", line 122, in get
      return _ForkingPickler.loads(res)
    File "/workspaces/job/update_layer/lib/python3.12/site-packages/geofileops/__init__.py", line 10, in <module>
      import geopandas._compat as gpd_compat
    File "/workspaces/job/update_layer/lib/python3.12/site-packages/geopandas/__init__.py", line 3, in <module>
      from geopandas.geoseries import GeoSeries
    File "/workspaces/job/update_layer/lib/python3.12/site-packages/geopandas/geoseries.py", line 3, in <module>
      import typing
    File "/workspaces/job/update_layer/lib/python3.12/site-packages/cv2/typing/__init__.py", line 61, in <module>
      import cv2.mat_wrapper
    ImportError: libGL.so.1: cannot open shared object file: No such file or directory
    FROM ubuntu:${VARIANT} AS base
    ARG PYTHON_MAJOR_VERSION
    
    FROM base AS layer-deps
    # each build stage must redeclare the ARGs it uses
    ARG PYTHON_MAJOR_VERSION
    COPY jobs/update_layer-deps.pex /deps.pex
    RUN PEX_TOOLS=1 python${PYTHON_MAJOR_VERSION} /deps.pex venv --scope=deps --collisions-ok --compile --rm all /workspaces/jobs/update_layer
    
    FROM base AS layer-srcs
    ARG PYTHON_MAJOR_VERSION
    COPY jobs/update_layer-srcs.pex /srcs.pex
    RUN PEX_TOOLS=1 python${PYTHON_MAJOR_VERSION} /srcs.pex venv --scope=srcs --compile --rm all /workspaces/jobs/update_layer
    
    FROM base
    COPY --from=layer-deps /workspaces/jobs/update_layer /workspaces/jobs/update_layer
    COPY --from=layer-srcs /workspaces/jobs/update_layer /workspaces/jobs/update_layer
    
    WORKDIR /workspaces/jobs_geo_import/
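    A likely mechanism for the shadowing (an assumption read off the traceback, not a confirmed diagnosis): cv2 ships a `cv2/typing` subpackage, so if the `site-packages/cv2` directory itself lands on `sys.path` (for example via a `.pth` file or a collision during the venv merge), `import typing` resolves to cv2's subpackage instead of the stdlib. A minimal check to run inside the container:

```python
# Ask the import system where `typing` actually resolves from. In the broken
# image this would print a path under site-packages/cv2/typing/; on a healthy
# interpreter it prints the stdlib's typing.py.
import importlib.util
import sys

spec = importlib.util.find_spec("typing")
print(spec.origin)

# Any sys.path entry pointing inside a site-packages subdirectory is a red
# flag for this kind of stdlib shadowing.
suspicious = [p for p in sys.path if "site-packages/cv2" in p]
print(suspicious)
```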
  • worried-piano-22913 (10/14/2025, 10:40 AM)
    Hi everyone, is there someone who has a clue about this: https://pantsbuild.slack.com/archives/C046T6T9U/p1759762500455099
  • worried-piano-22913 (10/14/2025, 11:39 AM)
    I think I just found the cause myself, but I'm not sure if it's the intended behaviour. I'll address it in the thread
  • mammoth-dawn-85816 (10/14/2025, 4:49 PM)
    Hey folks, I’m struggling to understand how to define a `python_distribution` that pulls the package name and version from `pyproject.toml` — this slack mentions this in a few places, and there’s an open issue that seems to cover the same thing, but I don’t know if there’s some recommended way to use `pyproject.toml` as the authoritative source for the package being built — does anyone have advice?
  • brief-engine-92399 (10/16/2025, 2:36 AM)
    Is there a way to take a pex_binary target and create a requirements.txt file from it? It's a follow-up to a previous question: I'm trying to cross-platform build docker images and it is really, really slow (100x+ slower when building arm from x86). I'm wondering if it would be easier to download all packages at runtime instead, but I want the packages from a pex compile to do that. Will take any advice.
  • thousands-plumber-33255 (10/16/2025, 7:15 PM)
    How can I turn off deprecation warnings for a pex_binary target? I tried this:
    pex_binary(
        name='deps',
        execution_mode='venv',
        include_requirements=True,
        include_sources=False,
        include_tools=True,
        layout='packed',
        env={"PYTHONWARNINGS": "ignore"},
    )
    
    FROM base AS production-deps
    ARG PYTHON_MAJOR_VERSION
    COPY django/production-deps.pex /deps.pex
    RUN PEX_TOOLS=1 python${PYTHON_MAJOR_VERSION} /deps.pex venv --scope=deps --collisions-ok --compile --rm all /bin/production
    But that still gives me all those warnings when I run the pex:
    /bin/production/lib/python3.12/site-packages/pydantic/_internal/_config.py:323: PydanticDeprecatedSince20: Support for class-based `config` is deprecated, use ConfigDict instead. Deprecated in Pydantic V2.0 to be removed in V3.0. See Pydantic V2 Migration Guide at <https://errors.pydantic.dev/2.11/migration/>
      warnings.warn(DEPRECATION_MESSAGE, DeprecationWarning)
    I can even see in the /bin/production/pex file that it injects the envs. But why is that ignored at all?
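    One thing worth keeping in mind (this is generic Python behaviour, not a statement about where pex injects `env=`): `PYTHONWARNINGS` is only read at interpreter startup, so it must be present in the environment of the process that actually runs your code. If you invoke the venv's python directly after `pex venv`, anything the PEX wrapper would have injected is likely bypassed. A self-contained demonstration of the variable itself:

```python
# Run the same child snippet twice: once plainly (the DeprecationWarning is
# shown, since warnings in __main__ code are displayed by default) and once
# with PYTHONWARNINGS=ignore set in the child's environment (suppressed).
import os
import subprocess
import sys

child = "import warnings; warnings.warn('old API', DeprecationWarning); print('ran')"

plain = subprocess.run([sys.executable, "-c", child],
                       capture_output=True, text=True)
quiet = subprocess.run([sys.executable, "-c", child],
                       capture_output=True, text=True,
                       env={**os.environ, "PYTHONWARNINGS": "ignore"})

print("without filter:", "DeprecationWarning" in plain.stderr)
print("with filter:", "DeprecationWarning" in quiet.stderr)
```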
  • proud-planet-36170 (10/16/2025, 8:45 PM)
    We have a project with a git submodule, but it doesn't seem to be picked up by `--changed-since`:
    ➜  platform git:(main) ✗ git diff HEAD~1 | cat
    diff --git a/attendee/attendee-repo b/attendee/attendee-repo
    index 8385791..672fa8c 160000
    --- a/attendee/attendee-repo
    +++ b/attendee/attendee-repo
    @@ -1 +1 @@
    -Subproject commit 8385791c33bdef6c17953f5ba356413b5d329013
    +Subproject commit 672fa8c7f007562ee030cc3ec1f954a007995672
    
    ➜  platform git:(main) ✗ git diff --name-only HEAD~1 | cat
    attendee/attendee-repo
    
    ➜  platform git:(main) ✗ pants \
                  --changed-since=HEAD~1 \
                  --changed-dependents=transitive \
                  list
    circleci-etl:docker-image
    circleci-etl/circleci_etl/job_tests.py:../image-files
    circleci-etl/circleci_etl/job_tests.py
  • numerous-pharmacist-91083 (10/16/2025, 11:51 PM)
    I have a python build that uses pytest and a global `conftest.py` in my root directory. That sets up some custom markers and associated command line options like `--run_large`. It was all working just fine, but then I upgraded to Python 3.12, which wasn't compatible with the pytest version Pants uses by default, so I set up a resolve for the test tool and specified a newer version of pytest like:
    [pytest]
    args = ['--log-cli-level=INFO']
    install_from_resolve = "tools"
    That is now using the newer version of pytest, but it's no longer running my `conftest.py` even if I add:
    [python-infer]
    conftests = true
    I can verify that because (1) I added print statements to things like my `pytest_addoption` method and (2) the command line arguments like `--run_large` are no longer recognized. How can I fix it?
  • hundreds-carpet-28072 (10/17/2025, 1:20 PM)
    What’s my best option for detecting changes to target versions in BUILD files, in a way that is general across target types?
  • great-river-19742 (10/19/2025, 9:45 AM)
    How is the final config assembled if I have multiple config files (e.g. `PANTS_CONFIG_FILES=pants.ci.toml` in addition to pants.toml)? Say I have a `[blah]` section in both files. Does `[blah]` from ci.toml completely override the default's `[blah]`, or does some smart merging of config entries happen?
  • happy-kitchen-89482 (10/19/2025, 10:17 PM)
    https://pantsbuild.slack.com/archives/C0D7TNJHL/p1760912241192899
  • brainy-airline-59624 (10/20/2025, 6:30 AM)
    I'm attempting to add a requirement to a requirements.txt file. This requirement is a raw GitHub URL of the form git+https://github.com/...@v0.x.xxx. When I add this to the requirements file, it breaks:
    mbp293@MBP293-Z zero-ai % pants check ::
    151840.09 [ERROR] 1 Exception encountered:
    Engine traceback:
      in `check` goal
      in Find targets from input specs
    ValueError: Invalid requirement 'git+https://github.com/...@v0.x.xxx in third_party/zero-api_requirements.txt at line 9: Expected end or semicolon (after name and no valid version specifier)
        git+https://github.com/...@v0.x.xxx
    One thing I found recommended that I put this into a `python_requirement` block, and that seems to grab and build this repo fine. The problem is I want to include this third_party in my default resolve so that it's part of the lockfile I use with my IDE. Perhaps there's some simple way to do this, but I'm having a hard time solving this.
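    For reference, the error text ("Expected end or semicolon (after name and no valid version specifier)") is what PEP 508 parsing produces for a bare VCS URL: a direct reference needs a project name in front. A sketch of the requirements.txt line, reusing the name and URL that appear in a later message in this channel (they may not match the actual distribution name):

```text
# third_party/zero-api_requirements.txt (illustrative)
zero-api-python @ git+https://github.com/zeals-co-ltd/zero-api.git@v0.1.631
```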
  • hundreds-carpet-28072 (10/20/2025, 4:12 PM)
    I’m moving my specified dependencies from a requirements.txt to pyproject.toml to enable using `uv` lockfiles and converting the output into Pex lockfiles. Does Pants support the `[dependency-groups]` fields in pyproject.toml in any way? It would be useful to be able to generate separate resolves from these, as I have in the past for dependencies related to test, tools, docs, etc.
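    For context, the `[dependency-groups]` table is PEP 735; a minimal pyproject.toml sketch of what these fields look like (group names illustrative; whether Pants can consume them directly is exactly the open question here):

```toml
[dependency-groups]
test = ["pytest>=8.0"]
docs = ["sphinx"]
# PEP 735 groups can also include other groups:
dev = [{include-group = "test"}, {include-group = "docs"}]
```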
  • ancient-beard-76775 (10/20/2025, 8:17 PM)
    Hey everyone, I’m trying out Pants to build my Scala projects and hitting an error because the version of scalameta used for dependency inference doesn’t support some of the syntax I’m using. Is there a way I can override the version of scalameta that’s used, or do I need to wait for Pants itself to upgrade the version?
  • brainy-airline-59624 (10/21/2025, 8:47 AM)
    I continue to get this error when trying to run a build on GitHub:
    stderr:
    There was 1 error downloading required artifacts:
    1. zero-api-python 1 from git+https://github.com/zeals-co-ltd/zero-api.git@v0.1.631
        pid 1487 -> /home/runner/.cache/pants/named_caches/pex_root/venvs/1/d881a8f5dff01abf8f60d8ca212d5cf43381f85e/302b21f4c5ac9f243cbb9cb5d57e98957a60ea02/bin/python
    I cut out a bunch of code, but the basic problem is we have a repo that installs from a private GitHub repo. We want to pass the credentials in. Every other repo in our codebase has a standardized way of doing it that basically looks like:
    - name: Configure git for private modules
      run: |
        git config --global url."https://${{ secrets.ORG_PROJECTS_TOKEN }}@github.com/".insteadOf "https://github.com/"
    Fairly standard. I think I'm failing to pass this into PantsBuild. Any ideas on what I'm missing? No other repos use PantsBuild and so no one else at the company knows anything about this.
  • ancient-beard-76775 (10/22/2025, 12:53 PM)
    Making progress in building my Scala projects, but I’ve got a unique setup that I’m not sure how to get working with Pants. TL;DR: Is it possible to have a target depend on another target that uses a different `resolve`? I’m getting a `NoCompatibleResolve` error.
    In sbt I have a project called `predef` that defines an object with the full package path `example.Predef`. Every other project then gets an additional scalac arg:
    -Yimports:java.lang,scala,scala.Predef,example.Predef
    This makes it so that everything defined in `example.Predef` is available without needing any imports. I’m trying to mirror this in my Pants config. In `pants.toml` I have:
    [jvm.resolves]
    jvm-default = "3rdparty/jvm/deps.lock"
    uses-predef = "3rdparty/jvm/deps.lock"
    
    [scala.version_for_resolve]
    jvm-default = "3.7.3"
    uses-predef = "3.7.3"
    
    [scalac]
    args = ["-deprecation", "-encoding", "utf8"]
    
    [scalac.args_for_resolve]
    uses-predef = ["-Yimports:java.lang,scala,scala.Predef,example.Predef"]
    In `3rdparty/jvm/BUILD` I have:
    jvm_artifact(
      name = "jvm-default-org.scala-lang_scala3-library_3",
      group = "org.scala-lang",
      artifact = "scala3-library_3",
      version = "3.7.3",
      resolve = "jvm-default",
    )
    
    jvm_artifact(
      name = "uses-predef-org.scala-lang_scala3-library_3",
      # ...same group, artifact, and version as above...
      resolve = "uses-predef",
    )
    And in a target (`shared/src/main/scala/example/BUILD`) that should get the extra scalac arg I have:
    scala_sources(
      resolve = "uses-predef",
      dependencies = ["predef/src/main/scala/example"],
    )
    When I run `pants check ::` I get this error:
    NoCompatibleResolve: The selected targets did not have a resolve in common:
    
    jvm-default:
      * 3rdparty/jvm:jvm-default-org.scala-lang_scala3-library_3
      * predef/src/main/scala/example/Predef.scala
    
    uses-predef:
      * 3rdparty/jvm:uses-predef-org.scala-lang_scala3-library_3
      * shared/src/main/scala/example/Example.scala
    
    Targets which will be merged onto the same classpath must share a resolve (from the `resolve` field: https://www.pantsbuild.org/2.29/reference/targets/deploy_jar#resolve).
  • brief-engine-92399 (10/23/2025, 2:23 AM)
    May be more of a pex question, but any idea why, if I try to run pex --compile locally after a pants package of a pex_binary, what I see in the venv makes sense, but when I try to run it in a `docker_image`, packages are missing?
  • gray-apple-58935 (10/24/2025, 11:40 AM)
    I ran into a similar issue: the debugger misses all breakpoints in `pants test --debug-adapter ...`, and if I run `pants run --debug-adapter ...`, top-level code is executed but not the test body. I put more context here: https://github.com/pantsbuild/pants/discussions/22788. Is this a bug, or have I missed some setting in `launch.json` or `pants.toml`?
  • silly-queen-7197 (10/24/2025, 8:30 PM)
    Does pants change `pants test foo#bar` into `pytest foo -k bar`? I realize I don’t know how to do `pytest foo::bar`.
  • fast-school-44220 (10/24/2025, 8:35 PM)
    May I ask a history question? How did pants come to use the term "targets" for primary sources? This is causing some Abbott & Costello style communication issues on my team (coming from Make-style tools) sometimes.
  • fast-school-44220 (10/24/2025, 11:27 PM)
    annoying: GitHub identifies pants BUILD files as being written in Starlark.
  • mammoth-dawn-85816 (10/27/2025, 9:25 AM)
    Hey folks! I’m having trouble with in-monorepo (source layout) dependencies declared with poetry in pyproject.toml when running `pants test ::`. The structure of the dependency isn’t flattened, so the references to it fail (while `pytest` works correctly outside of pants). What am I missing?
    • `src/libs/package_1` depends on `src/libs/package_2`,
      ◦ declared with a line like `package_2 = { path = "../package_2", develop = true }` in the `[tool.poetry.dependencies]` section of my `pyproject.toml`
      ◦ Code for package 2 lives in `src/libs/package_2/src/package_2/*.py`
      ◦ both have a single BUILD file in the package root, with `python_sources`, `python_tests` and `poetry_requirements` (sources depends on poetry, tests on both)
    • `pytest`, run outside of pants, executes tests in `src/libs/package_1/tests/test_*.py`
      ◦ It correctly loads `package_2` source files and the tests succeed
      ◦ I believe this is because of the line `packages = [{ include = "package_2", from = "src" }]` inside `[tool.poetry]` of my package_2 `pyproject.toml`.
    • `pants test src/libs/package_1::` fails with `ModuleNotFoundError: No module named 'package_2'`
      ◦ When I keep & inspect the sandbox, the `src/libs/package_2/` dir is empty except for `src/package_2/*.py`
      ◦ I believe `package_1` is unable to read the source in `../package_2/*.py` because it’s actually in `../package_2/src/package_2/*.py`, but there’s no `pyproject.toml` there to include that — does this seem correct?
    • `pants run tools/whatever::` (which depends on `package_1`) successfully runs using `package_1` (and its dependency on `package_2`)
    I’ve tried including a `resources(name="pyproject", sources=["pyproject.toml"])` in `package_2` (with both the package_2 `python_sources` depending on it, and package_1’s `python_tests`), with no change in the outcome, despite the package_2 `pyproject.toml` being present in the sandbox. I think I’m either missing something stupid, or the approach I’m taking with using `poetry` for dependency management is confusing matters. Can anyone suggest debug steps?
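    One hedged guess based on the sandbox layout described above: with a nested src layout, each package's inner `src/` directory must be a Pants source root, otherwise `package_2` is not importable under that name. A pants.toml sketch (the glob pattern is an assumption about the `src/libs/<pkg>/src` layout from the message):

```toml
[source]
# Make each inner src/ dir a source root so modules import as `package_2`,
# not `src.package_2`.
root_patterns = ["/src/libs/*/src"]
```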
  • quaint-piano-62770 (10/27/2025, 2:32 PM)
    I am trying to generate lockfiles where my requirements have `torch`/`torchvision` but with the `+cpu` wheels. For this I add the `find_links` option to the `[python-repos]` section in the `pants.toml` file:
    [python-repos]
    indexes = [
      "https://pypi.org/simple",
      "%(env.PIP_EXTRA_INDEX_URL)s"
    ]
    find_links = [
      "https://download.pytorch.org/whl/torch/",
      "https://download.pytorch.org/whl/torchvision/"
    ]
    But I get this error when trying to run `pants generate-lockfiles`:
    File "/Users/narayan/.cache/pants/named_caches/pex_root/venvs/1/32a2d4a7e846b6328d7526fa44a431609f09ae7c/2fa36aa9e14d5074d008690effb919f0552cfb0a/lib/python3.11/site-packages/pip/_vendor/msgpack/fallback.py", line 968, in _pack_map_pairs
    pip:     self._pack(v, nest_limit - 1)
    pip:   File "/Users/narayan/.cache/pants/named_caches/pex_root/venvs/1/32a2d4a7e846b6328d7526fa44a431609f09ae7c/2fa36aa9e14d5074d008690effb919f0552cfb0a/lib/python3.11/site-packages/pip/_vendor/msgpack/fallback.py", line 819, in _pack
    pip:     raise ValueError("Memoryview is too large")
    pip: ValueError: Memoryview is too large
    If I run one resolve at a time it works (`pants generate-lockfiles --resolve=[resolve-name]`), but obviously this is inconvenient. Any solutions?
  • hundreds-carpet-28072 (10/28/2025, 10:56 AM)
    Could somebody provide a link to the exact spec that Pex uses for its lockfiles? I can’t seem to find it searching docs/GitHub.
  • billowy-tiger-59247 (10/28/2025, 4:57 PM)
    Hey folks! We are dealing with an interesting behavior that got me trying to understand how Pants deals with test dependencies at test runtime. What happens in this case:
    • service resolve: dependency A==1.0.0
    • pytest resolve (using `pytest.install_from_resolve`): dependency A==2.0.0
    Which dependency will be picked up during test execution? The one from the original resolve or the one from the pytest resolve?
  • numerous-pharmacist-91083 (10/28/2025, 11:03 PM)
    I'm facing some very slow Python build times. Any time any dependency is changed, the build takes forever, primarily on steps like `Building 21 requirements for requirements.pex`, which I think is building an isolated pex environment for unit tests and/or mypy runs. I think the main culprit is a few big libraries like PyTorch and torchvision but, for obvious reasons, I can't eliminate those. It's gotten so bad that I just added a single dependency and my CI is now timing out after 2 hours, even though I'm caching the pants cache in my CI. Is there anything I can do to speed this up? I'm looking at options like maybe running with `execution_mode = 'venv'` or `run_against_entire_lockfile`, but I don't have a good enough understanding of how Pants, pex, testing, pip, etc. all interact, and since each experiment takes about 2 hours, some advice would be greatly appreciated.
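    For reference, the second option mentioned is a single flag in pants.toml. Whether it helps depends on the workload, since it trades bigger per-process environments for reusing one cached PEX across changing per-target requirement subsets:

```toml
[python]
# Build test/check environments from the entire lockfile instead of the
# per-target requirement subset, improving cache reuse when deps churn.
run_against_entire_lockfile = true
```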
  • ripe-architect-1001 (10/31/2025, 1:59 PM)
    Maybe a stupid question, but for using `ruff`, for example: if I just want to use it and don't need any additional configuration, I just need to add the backend and don't need a config section?
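    For what it's worth, enabling a linter is normally just a backend entry in pants.toml, and a `[ruff]` section is only needed for overrides. A sketch (backend paths as found in recent Pants releases; check the docs for your exact version, as older releases used a single `pants.backend.experimental.python.lint.ruff` backend):

```toml
[GLOBAL]
backend_packages = [
  "pants.backend.experimental.python.lint.ruff.check",
  "pants.backend.experimental.python.lint.ruff.format",
]
```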
  • curved-manchester-66006 (10/31/2025, 2:12 PM)
    🧪 There are recent RCs on all current branches. If these resolve an issue you were experiencing, please take them for a spin.
    • https://github.com/pantsbuild/pants/releases/tag/release_2.27.1rc0
    • https://github.com/pantsbuild/pants/releases/tag/release_2.28.1rc0
    • https://github.com/pantsbuild/pants/releases/tag/release_2.29.1rc1