elegant-park-52418
11/13/2024, 2:23 PM
docker_image target. is there an existing pattern i can follow to achieve this?

lively-school-24147
11/15/2024, 3:57 PM

gentle-sugar-52379
11/21/2024, 5:23 PM

square-elephant-85438
12/02/2024, 6:44 PM

happy-psychiatrist-90774
12/05/2024, 3:51 PM

wooden-policeman-10903
12/06/2024, 9:14 PM
export-codegen
and it works correctly. It is also able to run when I add it as a dependency of python_source. However, the generated modules do not appear in the generated export resolve, as protobuf sources would, and thus are invisible to my IDE. I was following https://www.pantsbuild.org/stable/docs/writing-plugins/common-plugin-tasks/add-codegen, is there anything I am missing?

plain-author-67175
12/13/2024, 2:38 AM
pyproject.toml like:
[build-system]
requires = ["setuptools", "wheel", "torch==2.5.1+cu121"]
build-backend = "setuptools.build_meta"
defining them as resource and adding it as a dependency to python_distribution.
Now, I would like to generate this pyproject.toml using a custom target that takes the build-time dependencies as an argument, and use this generated resource as a dependency of python_distribution.
For example, using a BUILD like this:
pyproject_build_config(
    name = "pyproject",
    build_requirements = [
        "setuptools",
        "wheel",
        "torch==2.5.1+cu121",
    ],
)

python_distribution(
    name = "pytorch3d",
    dependencies = [
        ":pytorch3d-source",
        ":pyproject",
    ],
    generate_setup = False,
    provides = python_artifact(),
    sdist = False,
)
However, I am unsure how to write the plugin correctly. This is what I have so far, but it doesn't seem to work (although there's no error either):
# register.py
from pants.core.target_types import ResourceSourceField, ResourceTarget
from pants.engine.fs import CreateDigest, Digest, FileContent
from pants.engine.rules import Get, rule
from pants.engine.target import (
    COMMON_TARGET_FIELDS,
    GeneratedSources,
    GenerateSourcesRequest,
    StringSequenceField,
)


class PyprojectBuildConfigRequirements(StringSequenceField):
    alias = "build_requirements"
    help = "List of requirements to build the package"


class PyprojectBuildConfigTarget(ResourceTarget):
    alias = "pyproject_build_config"
    core_fields = (*COMMON_TARGET_FIELDS, PyprojectBuildConfigRequirements)
    help = "Generate PEP-517/518 compliant pyproject.toml for setup.py-based builds"


class GeneratePyprojectBuildConfigRequest(GenerateSourcesRequest):
    input = PyprojectBuildConfigTarget  # This is the resource target type that we handle.


@rule
async def generate_pyproject_build_config(target: PyprojectBuildConfigRequirements) -> GeneratedSources:
    # Dynamically create content for the resource.
    build_requirements = target[PyprojectBuildConfigRequirements].value
    dependencies_list = ",".join(f'"{b}"' for b in build_requirements)
    content = f"""
[build-system]
requires = [{dependencies_list}]
build-backend = "setuptools.build_meta"
"""
    file_name = "pyproject.toml"
    # Create the file as a digest.
    digest = await Get(Digest, CreateDigest([FileContent(file_name, content.encode())]))
    # Return the generated sources.
    return GeneratedSources(digest)


def target_types():
    return [PyprojectBuildConfigTarget]


def rules():
    return [generate_pyproject_build_config]
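As a side note, the string-building step of the rule above can be exercised as plain Python, outside the Pants engine, which makes it easy to sanity-check the generated file content in isolation (render_build_system is a hypothetical standalone helper, not part of the plugin):

```python
# Standalone sketch of the rule's content-generation step (hypothetical
# helper, not part of the plugin above): build the [build-system] table
# from a list of requirement strings.
def render_build_system(build_requirements):
    # Quote each requirement and join them, as the rule does.
    dependencies_list = ", ".join(f'"{b}"' for b in build_requirements)
    return (
        "[build-system]\n"
        f"requires = [{dependencies_list}]\n"
        'build-backend = "setuptools.build_meta"\n'
    )

print(render_build_system(["setuptools", "wheel", "torch==2.5.1+cu121"]))
```

This only checks the content, not the engine wiring, but it separates "is my TOML right?" from "is my rule registered?".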
What should I do?

rough-room-65027
12/13/2024, 11:15 AM

helpful-librarian-76940
12/13/2024, 12:04 PM
run_goal_rule to unit test my Constraints goal, but this fails because I can't find a working way to mock the rule that downloads a poetry pex. I tried using run_rule_with_mocks to mock this rule via MockGet, but I'm not quite sure how to pass the right requirements to test my goal.
The ideal scenario would be run_goal_rule_with_mocks, but I don't think that exists atm.
Would really appreciate some guidance, thanks!

acoustic-librarian-29560
12/20/2024, 4:56 PM
MockGet for a DependenciesRequest -> Targets call that each iteration calls. However, it looks like only the first call hits my Mock and subsequent calls just return Targets([]). Is there a way to do this kind of mocking that I'm unaware of, or should I open an issue?

cold-cricket-31243
01/09/2025, 7:58 PM

cold-cricket-31243
01/13/2025, 5:45 PM
await run_interactive_process(InteractiveProcess(...)). From the comments across the channels, it's looking like I need to use mocking with rule_runner.run_interactive_process like we see in this example to test the plugin. Can someone confirm if this is the correct approach? The issue I'm facing right now is attached. TL;DR: native_engine.IntrinsicError: Error executing interactive process: No such file or directory (os error 2). From the log error, I know the issue is related to the files being written in one directory, but the pants goal being run in a different directory.

able-school-92027
01/22/2025, 3:02 PM
mypy behavior when running pants check ::

Partition #1 - pants-2-19, ['CPython==3.9.*']:
<my_file.py>: error: Unexpected keyword argument "paths" for "Sandbox"; did you mean "path"? [call-arg]
Found 1 error in 1 file (checked 1 source file)

Partition #2 - pants-2-24, ['CPython==3.9.*']:
<my_file.py>: error: Unexpected keyword argument "path" for "Sandbox"; did you mean "paths"? [call-arg]

My my_file.py looks like:
from pants.core.util_rules.adhoc_process_support import ExtraSandboxContents
from pants.engine.process import Process
from pants.util.frozendict import FrozenDict
from pants.version import Version, PANTS_SEMVER


class Sandbox(ExtraSandboxContents):
    @staticmethod
    def from_process(process: Process):
        env_with_out_path: dict[str, str] = dict()
        env_with_out_path.update(process.env)
        env_with_out_path.pop("PATH", None)
        if PANTS_SEMVER < Version("2.23.0"):
            return Sandbox(
                digest=process.input_digest,
                path=process.env.get("PATH", None),  # this is the prop when using pants<2.23.0
                immutable_input_digests=process.immutable_input_digests,
                append_only_caches=process.append_only_caches,
                extra_env=FrozenDict(env_with_out_path),
            )
        return Sandbox(
            digest=process.input_digest,
            paths=(process.env.get("PATH", None),),  # this is the new property on pants>=2.23.0
            immutable_input_digests=process.immutable_input_digests,
            append_only_caches=process.append_only_caches,
            extra_env=FrozenDict(env_with_out_path),
        )
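The runtime gate itself is straightforward to check in isolation. Note that mypy type-checks both branches of an if statement statically, since it does not understand a PANTS_SEMVER comparison the way it understands sys.version_info, which is why each partition flags the branch written for the other version. A pure-Python sketch of the gate (hypothetical helper using plain version tuples instead of PANTS_SEMVER, so it runs without Pants installed):

```python
# Hypothetical sketch (not Pants code): choose the Sandbox keyword for a
# given Pants version, mirroring the PANTS_SEMVER gate in the snippet
# above. Plain version tuples stand in for pants.version.PANTS_SEMVER.
def sandbox_path_kwargs(pants_version, path):
    if pants_version < (2, 23, 0):
        # Pants < 2.23: Sandbox takes a single `path` string.
        return {"path": path}
    # Pants >= 2.23: Sandbox takes a `paths` tuple instead.
    return {"paths": (path,)}

print(sandbox_path_kwargs((2, 19, 0), "/usr/bin"))
```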
I had to add the if for the version because the path property of Sandbox was replaced with paths on pants 2.23 via https://github.com/pantsbuild/pants/commit/3b1ffb64bc2332e3925ffba387ce496143a4284d#diff-c440a5c58ef3255b75e7fb16e[…]7e2c3aba4567a95f44fa1edf8a4L139, but it looks like mypy is not able to run correctly taking the pants version into consideration: it fails on pants 2.19 saying paths doesn't exist in Sandbox, and fails on pants 2.24 saying path doesn't exist in Sandbox.
I do have the resolves configured for each pants version, so that I can run lint, tests, check, etc., for both pants versions.
Does anyone know how to correctly run mypy with multiple pants versions?

cold-cricket-31243
01/24/2025, 5:34 PM
uv sync and uv build. I want to create a goal that wraps the other goals. Does anyone have a reference for how to do this? I'm finding this surprisingly difficult. Here's a ref to my current plugins: https://github.com/TechnologyBrewery/pants-uv-lifecycle-plugin/tree/dev/src/pants_uv_lifecycle_plugin

late-lifeguard-85949
02/07/2025, 12:58 AM
native_engine.IntrinsicError: Get(InterpreterConstraints, InterpreterConstraintsRequest, InterpreterConstraintsRequest(addresses=Addresses([Address(src/******/__init__.py:lib), Address(src/******/app.py:lib), Address(src/******/conf.py:lib), Address(src/******/schemas.py:lib), Address(src/******/__init__.py:lib), Address(src/******/cli.py:lib), Address(src/******/core.py:lib), Address(src/******/cors.py:lib), Address(src/******/models.py:lib)]), hardcoded_interpreter_constraints=None)) was not detected in your @rule body at rule compile time. Was the `Get` constructor called in a non async-function, or was it inside an async function defined after the @rule? Make sure the `Get` is defined before or inside the @rule body.

happy-psychiatrist-90774
02/09/2025, 4:37 PM

happy-psychiatrist-90774
02/10/2025, 8:00 AM

incalculable-toothbrush-17758
02/10/2025, 12:40 PM

calm-librarian-70577
02/14/2025, 10:03 AM
.pex file from the SOURCE_DATE_EPOCH envvar correctly. However, the timestamp of the .pex file itself will always update. COPY-ing it in a Dockerfile will therefore create a new sha every time, making it uncachable. I need to somehow fix up the timestamp of the .pex file post-pex-build and pre-docker-build within the package goal. My problem is that I don't even know how I'd approach this problem. Any ideas? (Convincing Pex to timestamp the file correctly would of course also fix the problem.)

gorgeous-winter-99296
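One possible post-processing approach, sketched outside of Pants: reset the built artifact's mtime to SOURCE_DATE_EPOCH with os.utime, e.g. from a wrapper script run after pants package and before docker build. The helper below is hypothetical (wiring it into the package goal itself would require a plugin); the 1980-01-01 fallback epoch is an arbitrary assumption:

```python
import os


def normalize_mtime(path, default_epoch=315532800):
    # Hypothetical post-package step: set the file's atime/mtime to
    # SOURCE_DATE_EPOCH so a Dockerfile COPY layer hashes identically
    # across rebuilds. 315532800 is 1980-01-01, an arbitrary
    # reproducible fallback when the envvar is unset.
    epoch = int(os.environ.get("SOURCE_DATE_EPOCH", default_epoch))
    os.utime(path, (epoch, epoch))
    return epoch
```

Usage would be something like normalize_mtime("dist/myapp.pex") in the wrapper, before invoking docker build.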
02/18/2025, 3:32 PM
pants the-repo ... It seems technically doable, but I'm not 100% sure if it's wise.

brave-smartphone-45640
02/24/2025, 1:06 PM
pants package. Now I'd like to test it using the RuleRunner setup. Since I'm completely new to writing custom pants code I'm having a hard time debugging the error message. Any help is much appreciated, thanks!
See code in 🧵

gorgeous-winter-99296
02/28/2025, 12:20 PM
Does migrate-call-by-name work on standalone plugins? I'm trying to migrate my plugins now but it fails out immediately:

$ pants --source-root-patterns='["pants-plugins"]' migrate-call-by-name pants-plugins/::
13:00:06.23 [ERROR] '/home/ts/.cache/nce/68f5608a60df9b97aab453d453817a4ded400d1d8ec7ede7ec14bcac83421a7b/bindings/venvs/2.24.0/lib/python3.9/site-packages/pants/option/subsystem.py' is not in the subpath of '/home/ts/Repositories/pants-backends' OR one path is relative and the other is absolute.
ambitious-actor-36781
03/24/2025, 1:28 AM

average-breakfast-91545
03/24/2025, 9:41 PM
experimental-deploy goal. That goal runs a python file in a sandbox. How can a user of my backend pass command line arguments to the script?
I did try just running pants experimental-deploy src/mything -- --foo=bar --baz=quux but those flags don't make it through to sys.argv in the process, which makes sense.
My deploy rule is just constructing a Process and wrapping it up as a DeployProcess, but I'm unclear about how I would obtain additional, arbitrary, command line arguments for that Process. Is this a thing?

acoustic-librarian-29560
04/22/2025, 7:14 PM
AdhocProcessRequest and AdhocProcessResult - what's different about them vs a regular process and process result? Trying to build an adhoc packaging tool to do some filepath manipulation before the packaged output of a dependency is passed downstream. As far as I can tell, there's no existing way to do this.

elegant-park-52418
04/29/2025, 2:59 PM

elegant-park-52418
04/29/2025, 2:59 PM

elegant-park-52418
04/29/2025, 3:00 PM

worried-glass-66985
05/05/2025, 12:33 PM

cold-mechanic-10814
05/12/2025, 1:13 PM
yaml_file(
    name="my_yaml",
    source=http_source(
        "https://raw.githubusercontent.com/codefresh-io/yaml-examples/refs/heads/master/codefresh-build-1.yml",
        len=197,
        sha256="4f0f073a576fc44d1ad670bf886fb24f394f63b6eabb7677eb197d427f5db7b0",
    ),
    convert_to_json=True,
)

json_file(
    name="my_json",
    source="codefresh-build-1.json",
    dependencies=[":my_yaml"],
)
The yaml_file target works fine. The rule is of the form:

@rule
async def generate_yaml_from_yaml_source(
    request: GenerateYAMLSourcesRequest,
) -> GeneratedSources:

where GenerateYAMLSourcesRequest is a subclass of GenerateSourcesRequest.
It's when I try to trigger the generation for the json_file target that the issue occurs. That target is also backed by a rule based on a subclass of GenerateSourcesRequest, and in addition has code to resolve the dependencies and provide them as a snapshot during the generation. The code fails, seemingly before my rule is triggered, with the following error:
native_engine.IntrinsicError: Unmatched glob from src/python/libs/portland/connectors/my_connector:my_json's `source` field: "src/python/libs/portland/connectors/my_connector/codefresh-build-1.json"
What's the appropriate way to chain these codegen steps, with the caveat that the json_file target won't always be used in a chain; it should also be usable independently.