# plugins
  • e

    elegant-park-52418

    11/13/2024, 2:23 PM
    in my plugin, i'd like to be able to generate some sources and have them be pulled into the docker build context for a
    docker_image
    target. is there an existing pattern i can follow to achieve this?
    c
    g
    • 3
    • 65
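For context, `docker_image` typically pulls the sources and packaged artifacts of its (transitive) dependencies into the build context, so listing a codegen target under `dependencies` is usually the pattern. A hedged BUILD sketch (the `my_codegen_target` name is hypothetical, standing in for the custom generator):

```
# BUILD (sketch; assumes a custom codegen target type is registered)
my_codegen_target(name="generated")

docker_image(
    name="img",
    dependencies=[":generated"],
)
```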
  • l

    lively-school-24147

    11/15/2024, 3:57 PM
    I’m writing a plugin in which I need to run a program (could be Scala, Python or Go) which lives in the same repo. It will produce a metadata file which I then need to do further processing with (run a bunch of validations, so essentially a check goal). I’m wondering what the best pattern is? I could make my field set accept a target which can be packaged, and then execute the packaged binary conditionally (depending on what language it is) and grabbing its output. Or is it better to treat it as some sort of codegen where the output is the generated file, and then have language specific targets which can compile a binary, run it and then output codegen’d files?
    g
    • 2
    • 8
  • g

    gentle-sugar-52379

    11/21/2024, 5:23 PM
    i'm not able to get my very basic macro to run 😞 pants is not finding it. is this really the right path and configuration? i tried to follow the docs as closely as possible. if i copy the macro code at the beginning of my build file manually, it works fine
    ✅ 1
    e
    • 2
    • 3
  • s

    square-elephant-85438

    12/02/2024, 6:44 PM
    I see that a goal_rule is always uncacheable, but are the rules invoked by a goal_rule cacheable?
    ✅ 1
    f
    • 2
    • 3
  • h

    happy-psychiatrist-90774

    12/05/2024, 3:51 PM
    Hey guys, this is a bit of a continuation of this discussion over on GH. I'm trying to go around the partitioning system for linters: I tried to create my own lint request class, and I'm always hitting a wall with something missing. Is there ANY way to do linting without going through a "formal" request?
    h
    • 2
    • 7
  • w

    wooden-policeman-10903

    12/06/2024, 9:14 PM
    Hey, guys! I've just started using pants and it is awesome. Currently learning to do plugin integration. I have two questions, mainly: 1. The first one is about integrating https://prisma-client-py.readthedocs.io into pantsbuild -- how would one approach a system like this? On the one hand, it is a codegen tool that transforms a spec into the Python client used to interact with the DB; on the other hand, it is a migration engine, which creates committable SQL files inside the service directory. 2. I have my own (just published) project (https://github.com/g-usi/asyncapi-python) that takes an AsyncAPI (https://www.asyncapi.com) spec and generates an Application python module from it. I am able to trigger
    export-codegen
    and it works correctly. It is also able to run, when I add it as a dependency of
    python_source
    , however, the generated modules do not appear in the generated
    export
    resolve, as protobuf sources would, and thus are invisible to my IDE. I was following https://www.pantsbuild.org/stable/docs/writing-plugins/common-plugin-tasks/add-codegen, is there anything I am missing?
    g
    • 2
    • 5
  • p

    plain-author-67175

    12/13/2024, 2:38 AM
    Hi, I am trying to build packages for 3rd-party repos that use setup.py. But, they have build-time dependencies. I could package them using manually created
    pyproject.toml
    like:
    [build-system]
    requires = ["setuptools", "wheel", "torch==2.5.1+cu121"]
    build-backend = "setuptools.build_meta"
    defining them as
    resource
    and adding as a dependency to
    python_distribution
    . Now, I would like to generate this
    pyproject.toml
    using a custom target that takes the build-time dependencies as an argument, and use this generated resource as dependency to
    python_distribution
    . For example, using a
    BUILD
    like this:
    pyproject_build_config(
        name = "pyproject",
        build_requirements = [
            "setuptools",
            "wheel",
            "torch==2.5.1+cu121",
        ],
    )
    
    python_distribution(
        name = "pytorch3d",
        dependencies = [
            ":pytorch3d-source",
            ":pyproject",
        ],
        generate_setup = False,
        provides = python_artifact(),
        sdist = False,
    )
    However, I am unsure how to get the plugin right. This is what I have so far, but it doesn't seem to work (although there's no error either):
    # register.py
    from pants.core.target_types import ResourceSourceField
    from pants.engine.fs import CreateDigest, FileContent, Snapshot
    from pants.engine.rules import Get, collect_rules, rule
    from pants.engine.target import (
        COMMON_TARGET_FIELDS,
        GeneratedSources,
        GenerateSourcesRequest,
        StringSequenceField,
        Target,
    )
    from pants.engine.unions import UnionRule
    
    
    class PyprojectBuildConfigRequirements(StringSequenceField):
        alias = "build_requirements"
        help = "List of requirements to build the package"
    
    
    class PyprojectBuildConfigSourceField(ResourceSourceField):
        # The pyproject.toml is generated, so no source file needs to exist on disk.
        required = False
        default = None
    
    
    class PyprojectBuildConfigTarget(Target):
        alias = "pyproject_build_config"
        core_fields = (
            *COMMON_TARGET_FIELDS,
            PyprojectBuildConfigSourceField,
            PyprojectBuildConfigRequirements,
        )
        help = "Generate PEP-517/518 compliant pyproject.toml for setup.py-based builds"
    
    
    class GeneratePyprojectBuildConfigRequest(GenerateSourcesRequest):
        # `input`/`output` must be SourcesField subclasses, not target types.
        input = PyprojectBuildConfigSourceField
        output = ResourceSourceField
    
    
    @rule
    async def generate_pyproject_build_config(
        request: GeneratePyprojectBuildConfigRequest,
    ) -> GeneratedSources:
        # The rule takes the request; the matched target is on `protocol_target`.
        build_requirements = request.protocol_target[PyprojectBuildConfigRequirements].value or ()
        dependencies_list = ", ".join(f'"{b}"' for b in build_requirements)
        content = f"""
    [build-system]
    requires = [{dependencies_list}]
    build-backend = "setuptools.build_meta"
    """
        # GeneratedSources wraps a Snapshot, not a bare Digest.
        snapshot = await Get(Snapshot, CreateDigest([FileContent("pyproject.toml", content.encode())]))
        return GeneratedSources(snapshot)
    
    
    def target_types():
        return [PyprojectBuildConfigTarget]
    
    
    def rules():
        return [
            *collect_rules(),
            # Codegen requests must be registered against the GenerateSourcesRequest union.
            UnionRule(GenerateSourcesRequest, GeneratePyprojectBuildConfigRequest),
        ]
    What should I do?
    b
    • 2
    • 9
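Separately from the Pants wiring above, the rendering step itself is easy to unit test on its own. A minimal sketch (the `render_pyproject` helper is hypothetical, not part of any Pants API):

```python
def render_pyproject(build_requirements: list[str]) -> str:
    """Render a PEP 518 [build-system] table from a list of requirement strings."""
    deps = ", ".join(f'"{r}"' for r in build_requirements)
    return (
        "[build-system]\n"
        f"requires = [{deps}]\n"
        'build-backend = "setuptools.build_meta"\n'
    )

print(render_pyproject(["setuptools", "wheel", "torch==2.5.1+cu121"]))
```

Keeping the string-building in a plain function like this lets the Pants rule stay a thin wrapper around logic you can test without a RuleRunner.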
  • r

    rough-room-65027

    12/13/2024, 11:15 AM
    @rough-room-65027 has left the channel
  • h

    helpful-librarian-76940

    12/13/2024, 12:04 PM
    👋 Hello! Quite new to pants (we're currently using 2.21), and working on a plugin to generate a constraints file for our requirements. The plugin works quite well (retrieves a poetry pex, reads the requirements and generates a lockfile, and uses it to generate a constraints.txt), however I'm hitting a wall while testing it. Specifically, I'm using
    run_goal_rule
    to unit test my
    Constraints
    goal, but this fails because I can't find a working way to mock the rule that downloads a poetry pex. I tried using
    run_rule_with_mocks
    to mock this rule via
    MockGet
    , but I'm not quite sure how to pass the right requirements to test my goal. The ideal scenario would be
    run_goal_rule_with_mocks
    , but I don't think that exists atm 😅 Would really appreciate some guidance, thanks!
    b
    • 2
    • 2
  • a

    acoustic-librarian-29560

    12/20/2024, 4:56 PM
    Hey, I am trying to test a recursive rule in a plugin and have a
    MockGet
    for a
    DependenciesRequest -> Targets
    call that each iteration calls. However, it looks like only the first call hits my Mock and subsequent calls just return
    Targets([])
    . Is there a way to do this kind of mocking that I'm unaware of or should I open an issue?
    h
    • 2
    • 8
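One thing worth checking: the mock behind a MockGet is just a callable from input to output, so if it closes over one-shot state it will only answer usefully once. A stateless, input-keyed callable (plain-Python sketch, all names hypothetical) answers every iteration of a recursive rule:

```python
def make_dependencies_mock(graph: dict[str, list[str]]):
    # Build a mock that derives its answer from the input every time it
    # is called, rather than consuming one-shot state on the first call.
    def mock(address: str) -> list[str]:
        return graph.get(address, [])
    return mock

mock = make_dependencies_mock({"a": ["b"], "b": ["c"]})
```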
  • c

    cold-cricket-31243

    01/09/2025, 7:58 PM
    I'm developing a plugin and I'd like to test installing it from test.pypi.org before I officially publish it. I'm having a hard time figuring out where I can set the url to look at test.pypi.org instead of pypi.org. Is anyone familiar with how to do this?
    ✅ 1
    h
    • 2
    • 22
  • c

    cold-cricket-31243

    01/13/2025, 5:45 PM
    I'm writing a test for this uv-sync plugin, which uses
    await run_interactive_process(InteractiveProcess(...)
    . From the comments across the channels, it's looking like I need to use mocking with
    rule_runner.run_interactive_process
    like we see in this example to test the plugin. Can someone confirm if this is the correct approach? The issue I'm facing right now is attached. TLDR
    native_engine.IntrinsicError: Error executing interactive process: No such file or directory (os error 2)
    . From the log error, I know the issue is related to the files being written in one directory, but the pants goal being run in a different directory.
    Untitled
    ✅ 1
    h
    • 2
    • 11
  • a

    able-school-92027

    01/22/2025, 3:02 PM
    Hey all 👋 We have a custom plugin repo that was initially written to support pants 2.19, but now we want to add support for pants 2.24 and I'm seeing some weird
    mypy
    behavior when running
    pants check ::
    Partition #1 - pants-2-19, ['CPython==3.9.*']:
    <my_file.py>: error: Unexpected keyword argument "paths" for "Sandbox"; did you mean "path"?  [call-arg]
    Found 1 error in 1 file (checked 1 source file)
    
    Partition #2 - pants-2-24, ['CPython==3.9.*']:
    <my_file.py>: error: Unexpected keyword argument "path" for "Sandbox"; did you mean "paths"?  [call-arg]
    My
    my_file.py
    looks like:
    from pants.core.util_rules.adhoc_process_support import ExtraSandboxContents
    from pants.engine.process import Process
    from pants.util.frozendict import FrozenDict
    from pants.version import PANTS_SEMVER, Version
    
    class Sandbox(ExtraSandboxContents):
    
        @staticmethod
        def from_process(process: Process):
            env_with_out_path: dict[str, str] = dict(process.env)
            env_with_out_path.pop("PATH", None)
    
            if PANTS_SEMVER < Version("2.23.0"):
                return Sandbox(
                    digest=process.input_digest,
                    path=process.env.get("PATH", None),  # the field name on pants<2.23.0
                    immutable_input_digests=process.immutable_input_digests,
                    append_only_caches=process.append_only_caches,
                    extra_env=FrozenDict(env_with_out_path),
                )
    
            return Sandbox(
                digest=process.input_digest,
                paths=(process.env.get("PATH", None),),  # the new field name on pants>=2.23.0
                immutable_input_digests=process.immutable_input_digests,
                append_only_caches=process.append_only_caches,
                extra_env=FrozenDict(env_with_out_path),
            )
    I had to add the
    if
    for the version because the
    path
    property of
    Sandbox
    was replaced with
    paths
    on pants 2.23 via https://github.com/pantsbuild/pants/commit/3b1ffb64bc2332e3925ffba387ce496143a4284d#diff-c440a5c58ef3255b75e7fb16e[…]7e2c3aba4567a95f44fa1edf8a4L139, but it looks like
    mypy
    is not able to take the pants version into account, as it fails on pants 2.19 with
    paths
    doesn't exist in
    Sandbox
    , and fails on pants 2.24 with
    path
    doesn't exist in
    Sandbox
    . I do have the
    resolves
    configured for each pants version, so that I can run lint, tests, check, etc, for both pants versions. Does anyone know how to correctly run
    mypy
    with multiple pants versions?
    b
    • 2
    • 3
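One way to keep a single call site type-checking under both resolves is to build the keyword arguments dynamically, so neither partition's stubs ever see the other version's literal keyword. A sketch with stand-in classes (not the real `Sandbox`; this only illustrates the technique):

```python
from __future__ import annotations

from typing import Any


class OldSandbox:  # stand-in for the pants<2.23 signature
    def __init__(self, path: str | None = None) -> None:
        self.path = path


class NewSandbox:  # stand-in for the pants>=2.23 signature
    def __init__(self, paths: tuple[str | None, ...] = ()) -> None:
        self.paths = paths


def make_sandbox(use_new: bool, path_value: str) -> Any:
    # The keyword names only appear as dict keys, so mypy cannot match
    # them against either version's signature; both partitions accept
    # this call site.
    kwargs: dict[str, Any] = (
        {"paths": (path_value,)} if use_new else {"path": path_value}
    )
    cls: Any = NewSandbox if use_new else OldSandbox
    return cls(**kwargs)
```

The trade-off is that mypy no longer checks those keywords at all, so a test exercising both branches is worth having.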
  • c

    cold-cricket-31243

    01/24/2025, 5:34 PM
    I have developed several plugins for running
    uv sync
    and
    uv build
    . I want to create a goal that wraps the other goals. Does anyone have a reference for how to do this? I'm finding this surprisingly difficult. Here's a ref to my current plugins: https://github.com/TechnologyBrewery/pants-uv-lifecycle-plugin/tree/dev/src/pants_uv_lifecycle_plugin
    h
    • 2
    • 12
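Goals generally can't invoke other goals directly; the usual pattern is to factor the shared work into ordinary rules that each goal (including the wrapper) requests itself. Stripped of the Pants APIs, the shape is roughly this (all names hypothetical):

```python
def sync_sources(project: str) -> str:
    # Shared work factored into an ordinary function (a rule, in Pants).
    return f"synced {project}"


def build_wheel(project: str) -> str:
    return f"built {project}"


def lifecycle(project: str) -> list[str]:
    # The wrapping "goal" calls the same underlying helpers directly,
    # instead of trying to shell out to the other goals by name.
    return [sync_sources(project), build_wheel(project)]
```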
  • l

    late-lifeguard-85949

    02/07/2025, 12:58 AM
    Hello πŸ‘‹! Looking for a little guidance with a plugin I wrote internally, I think I've addressed my problem for now but hoping to get some insight into properly fixing my code (maybe I'm just approaching it all wrong to begin with). After updating from 2.21 to 2.24 (identified error from 2.23) I'm getting this error:
    native_engine.IntrinsicError: Get(InterpreterConstraints, InterpreterConstraintsRequest, InterpreterConstraintsRequest(addresses=Addresses([Address(src/******/__init__.py:lib), Address(src/******/app.py:lib), Address(src/******/conf.py:lib), Address(src/******/schemas.py:lib), Address(src/******/__init__.py:lib), Address(src/******/cli.py:lib), Address(src/******/core.py:lib), Addr
    ess(src/******/cors.py:lib), Address(src/******/models.py:lib)]), hardcoded_interpreter_constraints=None)) was not detected in your @rule body at rule compile time. Was the `Get` constructor called in a non async-function, or was it inside an async function defined after the @rule? Make sure the `Get` is defined before or inside the @rule body.
    w
    • 2
    • 34
  • h

    happy-psychiatrist-90774

    02/09/2025, 4:37 PM
    Is there a recommended way to create a common 1st party library that's shared between plugins?
    h
    • 2
    • 8
  • h

    happy-psychiatrist-90774

    02/10/2025, 8:00 AM
    Another question unrelated to the above Is there a way to run a goal from another goal? Specifically, I'm looking at conditionally (re-)generating a lockfile as part of another plugin
    c
    • 2
    • 2
  • i

    incalculable-toothbrush-17758

    02/10/2025, 12:40 PM
    I'm having trouble just getting off the ground when writing my plugin. Generally I'm trying to install a Go tool and then utilize it to generate some code. However I can't really get past even downloading the sources. More details in the thread.
    b
    • 2
    • 7
  • c

    calm-librarian-70577

    02/14/2025, 10:03 AM
    I'm trying to make a plugin that makes a Docker-wrapped Pex deterministic. Pex will happily timestamp all the files inside the
    .pex
    file from the
    SOURCE_DATE_EPOCH
    envvar correctly. However, the timestamp of the
    .pex
    file itself will always update.
    COPY
    -ing it in a Dockerfile will therefore create a new sha every time, making it uncacheable. I need to somehow fix up the timestamp of the
    .pex
    file post-pex-build and pre-docker-build within the
    package
    goal. My problem is that I don't even know how I'd approach this problem. Any ideas? (Convincing Pex to timestamp the file correctly would of course also fix the problem.)
    g
    c
    • 3
    • 9
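The post-build fixup itself is small if you can hook in between the pex build and the docker build: clamp the file's timestamp to SOURCE_DATE_EPOCH so downstream hashing sees a stable input. A sketch (the `pin_mtime` helper is hypothetical; wiring it into the `package` goal is the open question):

```python
import os


def pin_mtime(path: str) -> None:
    # Reset the file's access/modification times to SOURCE_DATE_EPOCH
    # (falling back to 0) so that repeated builds produce an identical
    # Docker build context and the COPY layer stays cacheable.
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", "0"))
    os.utime(path, (epoch, epoch))
```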
  • g

    gorgeous-winter-99296

    02/18/2025, 3:32 PM
    Has anyone here played around with building a plugin/custom goals for "local workflows"? We have a fairly sizeable collection of test-runners, utility scripts, etc, that wrap pants, but it's kinda awkward and fiddly. Instead one of my colleagues asked if we could have a prefix like
    pants the-repo ...
    . It seems technically doable, but I'm not 100% sure it's wise.
    f
    e
    +2
    • 5
    • 11
  • b

    brave-smartphone-45640

    02/24/2025, 1:06 PM
    Hi there, I've successfully written a pants plugin to set custom docker image tags if an env var is present during
    pants package
    . Now I'd like to test it using the
    RuleRunner
    setup. Since I'm completely new to writing custom pants code I'm having a hard time debugging the error message. Any help is much appreciated, thanks! 🙏 😊 See code in 🧵
    w
    • 2
    • 10
  • g

    gorgeous-winter-99296

    02/28/2025, 12:20 PM
    Should
    migrate-call-by-name
    work on standalone plugins? I'm trying to migrate my plugins now but it fails out immediately:
    $ pants --source-root-patterns='["pants-plugins"]' migrate-call-by-name pants-plugins/::
    13:00:06.23 [ERROR] '/home/ts/.cache/nce/68f5608a60df9b97aab453d453817a4ded400d1d8ec7ede7ec14bcac83421a7b/bindings/venvs/2.24.0/lib/python3.9/site-packages/pants/option/subsystem.py' is not in the subpath of '/home/ts/Repositories/pants-backends' OR one path is relative and the other is absolute.
    w
    • 2
    • 9
  • a

    ambitious-actor-36781

    03/24/2025, 1:28 AM
    Is there a canonical example of: • Running the bulk (all) of your product code in an (old) version of Python • Running your pants plugin code with whatever version of Python Pants requires (3.9 prior to 2.25, 3.11 after) • Running python codegen plugin code against the version of Python that your product code needs • Running pytest and mypy against it all? We kinda got lucky that our vendor required Py3.9. But they upgraded to py3.10, which pants completely skipped. And now we're struggling with resolves and interpreter constraints and stuff
    b
    • 2
    • 3
  • a

    average-breakfast-91545

    03/24/2025, 9:41 PM
    Not sure how to even ask this question, but suppose I have a backend with an
    experimental-deploy
    goal. That goal runs a python file in a sandbox. How can a user of my backend pass command line arguments to the script? I did try just running
    pants experimental-deploy src/mything -- --foo=bar --baz=quux
    but those flags don't make it through to sys.argv in the process, which makes sense. My deploy rule is just constructing a Process and wrapping it up as a DeployProcess, but I'm unclear about how I would obtain additional, arbitrary, command line arguments for that Process. Is this a thing?
    g
    • 2
    • 3
  • a

    acoustic-librarian-29560

    04/22/2025, 7:14 PM
    @fast-nail-55400 What's the deal with
    AdhocProcessRequest
    and
    AdhocProcessResult
    - what's different about them vs regular process and process result? Trying to build an adhoc packaging tool to do some filepath manipulation before the packaged output of a dependency is passed downstream. As far as I can tell, there's no existing way to do this.
    f
    • 2
    • 11
  • e

    elegant-park-52418

    04/29/2025, 2:59 PM
    is it possible to batch/partition arbitrary rules, not just things like test/lint/fix goals? if so, is there a good example in core that someone can point out to me?
    f
    • 2
    • 13
  • e

    elegant-park-52418

    04/29/2025, 2:59 PM
    e.g. let's say i have a bunch of targets, and i want to run a rule on some number of them in parallel
  • e

    elegant-park-52418

    04/29/2025, 3:00 PM
    doesn't have to be a target, whatever input
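Inside a single rule, fan-out over arbitrary inputs is usually expressed as one concurrent multi-get, which the engine is then free to evaluate in parallel. The shape, in plain asyncio terms (`MultiGet` is the Pants analogue of `gather` here; the rest is a stand-in sketch):

```python
import asyncio


async def process_one(item: str) -> str:
    # Stand-in for a single `Get` on one input.
    return item.upper()


async def process_all(items: list[str]) -> list[str]:
    # Stand-in for `MultiGet`: all requests are issued together, so the
    # engine can evaluate them concurrently rather than one at a time.
    return list(await asyncio.gather(*(process_one(i) for i in items)))


results = asyncio.run(process_all(["a", "b", "c"]))
```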
  • w

    worried-glass-66985

    05/05/2025, 12:33 PM
    Hello ✋ Can I install pants as a python package somehow, so that pycharm can see its modules and provide hints? Right now it just says "Unresolved reference 'pants'"
    e
    • 2
    • 5
  • c

    cold-mechanic-10814

    05/12/2025, 1:13 PM
    Hi, I'm attempting to implement a couple of targets/rules that implement basic codegen (generate a JSON file from a YAML source, etc.). I've got the basics working, but I'm not sure how I can "chain" the codegen targets. Here's an illustrative BUILD file:
    yaml_file(
        name="my_yaml",
        source=http_source(
            "<https://raw.githubusercontent.com/codefresh-io/yaml-examples/refs/heads/master/codefresh-build-1.yml>",
            len=197,
            sha256="4f0f073a576fc44d1ad670bf886fb24f394f63b6eabb7677eb197d427f5db7b0",
        ),
        convert_to_json=True
    )
    
    json_file(
        name="my_json",
        source="codefresh-build-1.json",
        dependencies=[":my_yaml"]
    )
    The
    yaml_file
    target works fine. The rule is of the form
    @rule
    async def generate_yaml_from_yaml_source(
        request: GenerateYAMLSourcesRequest,
    ) -> GeneratedSources:
    Where
    GenerateYAMLSourcesRequest
    is subclass of
    GenerateSourcesRequest
    .
    It's when I try to trigger the generation for the
    json_file
    target that the issue occurs. The target is also backed by a rule based on a subclass of
    GenerateSourcesRequest
    , and in addition has code to resolve the dependencies and provide them as a snapshot during the generation. The code fails, seemingly before my rule is triggered, with the following error:
    native_engine.IntrinsicError: Unmatched glob from src/python/libs/portland/connectors/my_connector:my_json's `source` field: "src/python/libs/portland/connectors/my_connector/codefresh-build-1.json"
    What's the appropriate way to chain these codegen steps - with the caveat that the
    json_file
    target won't always be used in a chain; it should also be usable independently.
    f
    • 2
    • 5
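Conceptually, the second generator has to consume the first generator's output snapshot rather than glob the workspace, because the intermediate JSON never exists on disk. A plain-Python sketch of that chaining (a dict stands in for a parsed YAML document and for the materialized snapshot; the function names are hypothetical):

```python
import json


def yaml_step(parsed: dict) -> str:
    # Stage 1: the yaml_file generator emits JSON text from parsed YAML.
    return json.dumps(parsed, sort_keys=True)


def json_step(generated: dict[str, str], source: str) -> str:
    # Stage 2: the json_file generator looks its input up in the
    # materialized outputs of its dependencies (a snapshot, in Pants
    # terms) instead of the filesystem -- globbing the workspace is
    # exactly what produces the "Unmatched glob" error.
    if source not in generated:
        raise FileNotFoundError(f"Unmatched glob: {source}")
    return generated[source]


artifacts = {"codefresh-build-1.json": yaml_step({"version": "1.0"})}
```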