cold-cricket-31243
01/24/2025, 5:34 PM
uv sync and uv build. I want to create a goal that wraps the other goals. Does anyone have a reference for how to do this? I'm finding this surprisingly difficult. Here's a ref to my current plugins: https://github.com/TechnologyBrewery/pants-uv-lifecycle-plugin/tree/dev/src/pants_uv_lifecycle_plugin
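
A minimal sketch of what a custom wrapper goal can look like with the standard plugin API; the LifecycleSubsystem/Lifecycle names are illustrative and not taken from the linked plugin:
# Hedged sketch: a custom goal registered by a plugin. Inside the
# @goal_rule you would request the products of the steps you want to
# wrap (via Get), rather than trying to invoke other goals by name.
from pants.engine.console import Console
from pants.engine.goal import Goal, GoalSubsystem
from pants.engine.rules import collect_rules, goal_rule


class LifecycleSubsystem(GoalSubsystem):
    name = "lifecycle"
    help = "Wraps the sync and build steps."


class Lifecycle(Goal):
    subsystem_cls = LifecycleSubsystem
    environment_behavior = Goal.EnvironmentBehavior.LOCAL_ONLY


@goal_rule
async def run_lifecycle(console: Console) -> Lifecycle:
    console.print_stdout("lifecycle: sync + build complete")
    return Lifecycle(exit_code=0)


def rules():
    return collect_rules()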

late-lifeguard-85949
02/07/2025, 12:58 AM
native_engine.IntrinsicError: Get(InterpreterConstraints, InterpreterConstraintsRequest, InterpreterConstraintsRequest(addresses=Addresses([Address(src/******/__init__.py:lib), Address(src/******/app.py:lib), Address(src/******/conf.py:lib), Address(src/******/schemas.py:lib), Address(src/******/__init__.py:lib), Address(src/******/cli.py:lib), Address(src/******/core.py:lib), Address(src/******/cors.py:lib), Address(src/******/models.py:lib)]), hardcoded_interpreter_constraints=None)) was not detected in your @rule body at rule compile time. Was the `Get` constructor called in a non async-function, or was it inside an async function defined after the @rule? Make sure the `Get` is defined before or inside the @rule body.
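
For context, this error usually means the Get call lives in a helper defined outside (or after) the @rule. A minimal sketch of the shape the engine expects, with illustrative request/result types; the import path for the interpreter-constraints classes may differ between Pants versions:
# Hedged sketch: the engine statically scans the async @rule body for
# Get calls, so the Get must appear inside the rule (or in an async
# helper defined *before* the rule), not in a plain function.
from pants.engine.rules import Get, rule


@rule
async def constraints_for_my_targets(request: MyRequest) -> MyResult:  # illustrative types
    constraints = await Get(
        InterpreterConstraints,
        InterpreterConstraintsRequest(addresses=request.addresses),
    )
    return MyResult(constraints)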

happy-psychiatrist-90774
02/09/2025, 4:37 PM

happy-psychiatrist-90774
02/10/2025, 8:00 AM

incalculable-toothbrush-17758
02/10/2025, 12:40 PM

calm-librarian-70577
02/14/2025, 10:03 AM
.pex file from the SOURCE_DATE_EPOCH envvar correctly. However, the timestamp of the .pex file itself will always update. COPY-ing it in a Dockerfile will therefore create a new sha every time, making it uncacheable. I need to somehow fix up the timestamp of the .pex file post-pex-build and pre-docker-build within the package goal. My problem is that I don't even know how I'd approach this problem. Any ideas? (Convincing Pex to timestamp the file correctly would of course also fix the problem.)
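
A low-tech sketch of a workaround, done outside Pants rather than inside the package goal: clamp the built .pex's mtime to SOURCE_DATE_EPOCH between pants package and docker build. The dist/ path below is hypothetical.
# Hedged sketch: normalize the .pex mtime so the Docker COPY layer
# hashes consistently. Run after `pants package`, before `docker build`.
import os

epoch = int(os.environ["SOURCE_DATE_EPOCH"])
pex_path = "dist/my_app/my_app.pex"  # hypothetical packaged output path
os.utime(pex_path, (epoch, epoch))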

gorgeous-winter-99296
02/18/2025, 3:32 PM
pants the-repo .... It seems technically doable, but I'm not 100% sure it's wise.

brave-smartphone-45640
02/24/2025, 1:06 PM
pants package. Now I'd like to test it using the RuleRunner setup. Since I'm completely new to writing custom Pants code, I'm having a hard time debugging the error message. Any help is much appreciated, thanks!
See code in 🧵
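
For reference, a minimal RuleRunner test for a custom package plugin usually looks roughly like this; the my_plugin module, MyTarget target type, and MyPackageFieldSet are placeholders, not the poster's actual code:
# Hedged sketch: wire the plugin's rules plus a QueryRule for the
# output type you want to request, then ask the RuleRunner for it.
from pants.core.goals.package import BuiltPackage
from pants.engine.addresses import Address
from pants.testutil.rule_runner import QueryRule, RuleRunner

from my_plugin import rules as my_plugin_rules       # hypothetical module
from my_plugin.rules import MyPackageFieldSet        # hypothetical field set
from my_plugin.target_types import MyTarget          # hypothetical target type


def test_package_my_target() -> None:
    rule_runner = RuleRunner(
        rules=[
            *my_plugin_rules.rules(),
            QueryRule(BuiltPackage, [MyPackageFieldSet]),
        ],
        target_types=[MyTarget],
    )
    rule_runner.write_files({"src/BUILD": "my_target(name='demo')"})
    tgt = rule_runner.get_target(Address("src", target_name="demo"))
    built = rule_runner.request(BuiltPackage, [MyPackageFieldSet.create(tgt)])
    assert built.artifacts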

gorgeous-winter-99296
02/28/2025, 12:20 PM
Does migrate-call-by-name work on standalone plugins? I'm trying to migrate my plugins now, but it fails out immediately:
$ pants --source-root-patterns='["pants-plugins"]' migrate-call-by-name pants-plugins/::
13:00:06.23 [ERROR] '/home/ts/.cache/nce/68f5608a60df9b97aab453d453817a4ded400d1d8ec7ede7ec14bcac83421a7b/bindings/venvs/2.24.0/lib/python3.9/site-packages/pants/option/subsystem.py' is not in the subpath of '/home/ts/Repositories/pants-backends' OR one path is relative and the other is absolute.

ambitious-actor-36781
03/24/2025, 1:28 AM

average-breakfast-91545
03/24/2025, 9:41 PM
experimental-deploy goal. That goal runs a Python file in a sandbox. How can a user of my backend pass command-line arguments to the script?
I did try just running pants experimental-deploy src/mything -- --foo=bar --baz=quux, but those flags don't make it through to sys.argv in the process, which makes sense.
My deploy rule is just constructing a Process and wrapping it up as a DeployProcess, but I'm unclear about how I would obtain additional, arbitrary command-line arguments for that Process. Is this a thing?
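
One workaround, sketched under the assumption that passthrough args are not plumbed into deploy rules: carry the extra arguments on the deploy target via a custom field and append them to the Process argv. The extra_deploy_args field name is invented for illustration.
# Hedged sketch: a custom field for user-supplied deploy arguments.
from pants.engine.target import StringSequenceField


class ExtraDeployArgsField(StringSequenceField):
    alias = "extra_deploy_args"
    default = ()
    help = "Extra CLI arguments appended to the deploy script invocation."


# Then, inside the deploy rule, once the field set exposes the field:
#   process = Process(
#       argv=(python_exe, deploy_script, *field_set.extra_deploy_args.value),
#       ...,
#   )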

acoustic-librarian-29560
04/22/2025, 7:14 PM
AdhocProcessRequest and AdhocProcessResult - what's different about them vs. a regular Process and ProcessResult? Trying to build an adhoc packaging tool to do some filepath manipulation before the packaged output of a dependency is passed downstream. As far as I can tell, there's no existing way to do this.

elegant-park-52418
04/29/2025, 2:59 PM

elegant-park-52418
04/29/2025, 2:59 PM

elegant-park-52418
04/29/2025, 3:00 PM

worried-glass-66985
05/05/2025, 12:33 PM

cold-mechanic-10814
05/12/2025, 1:13 PM
yaml_file(
    name="my_yaml",
    source=http_source(
        "https://raw.githubusercontent.com/codefresh-io/yaml-examples/refs/heads/master/codefresh-build-1.yml",
        len=197,
        sha256="4f0f073a576fc44d1ad670bf886fb24f394f63b6eabb7677eb197d427f5db7b0",
    ),
    convert_to_json=True,
)
json_file(
    name="my_json",
    source="codefresh-build-1.json",
    dependencies=[":my_yaml"],
)
The yaml_file target works fine. The rule is of the form
@rule
async def generate_yaml_from_yaml_source(
    request: GenerateYAMLSourcesRequest,
) -> GeneratedSources:
where GenerateYAMLSourcesRequest is a subclass of GenerateSourcesRequest.
It's when I try to trigger the generation for the json_file target that the issue occurs. That target is also backed by a rule based on a subclass of GenerateSourcesRequest, and in addition has code to resolve the dependencies and provide them as a snapshot during the generation. The code fails, seemingly before my rule is triggered, with the following error:
native_engine.IntrinsicError: Unmatched glob from src/python/libs/portland/connectors/my_connector:my_json's `source` field: "src/python/libs/portland/connectors/my_connector/codefresh-build-1.json"
What's the appropriate way to chain these codegen steps, with the caveat that the json_file target won't always be used in a chain; it should also be usable independently?
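
Not a full answer to the unmatched-glob error (that fires while hydrating the json_file's own source field, before any rule runs), but for the chaining half of the question: the hook Pants provides for consuming possibly-generated upstream sources inside a codegen rule is HydrateSourcesRequest with enable_codegen=True. A sketch, with illustrative request and field names:
# Hedged sketch: inside the JSON generation rule, hydrate the
# dependencies with codegen enabled so generated YAML is available.
from pants.engine.rules import Get, MultiGet, rule
from pants.engine.target import (
    Dependencies,
    DependenciesRequest,
    HydratedSources,
    HydrateSourcesRequest,
    SourcesField,
    Targets,
)


@rule
async def generate_json_from_json_source(request: GenerateJSONSourcesRequest) -> GeneratedSources:
    deps = await Get(Targets, DependenciesRequest(request.protocol_target[Dependencies]))
    hydrated = await MultiGet(
        Get(
            HydratedSources,
            HydrateSourcesRequest(
                tgt[SourcesField],
                for_sources_types=(YAMLSourceField,),  # illustrative field type
                enable_codegen=True,
            ),
        )
        for tgt in deps
        if tgt.has_field(SourcesField)
    )
    # ...merge the hydrated snapshots, run the conversion, and
    # return GeneratedSources(snapshot)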

happy-psychiatrist-90774
05/20/2025, 10:53 PM
pants.toml
[GLOBAL]
pants_version = "2.26.0"
pythonpath = ["%(buildroot)s/pants-plugins"]
backend_packages = [
    "pants.backend.python",
    "pants.backend.plugin_development",
    "myplugin",
]
[python]
interpreter_constraints = ["==3.9.*"]
enable_resolves = true
[python.resolves]
pants-plugins = "pants-plugins/lock.json"
pants-plugins/BUILD
pants_requirements(resolve='pants-plugins')
python_requirement(
    requirements=['requests'],
    resolve='pants-plugins',
    name='requests',
)
pants-plugins/myplugin/register.py
import requests
def rules():
    return []
Running basically any pants command at this point I get
ModuleNotFoundError: No module named 'requests'
What am I doing wrong? I'm pretty sure that I'm following all the instructions to the letter...
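
One thing worth noting, hedged since the full setup isn't visible here: the pants-plugins resolve covers type-checking and testing of the plugin code, but at runtime an in-repo plugin executes inside Pants' own venv, so third-party imports like requests typically also have to be declared via the GLOBAL plugins option so Pants installs them into that venv, roughly:
[GLOBAL]
plugins = ["requests==2.32.3"]  # the version pin here is illustrative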

happy-psychiatrist-90774
05/29/2025, 3:51 PM
PexProcess) that doesn't immediately quit?
I'm looking into implementing the "serve" option for mkdocs, so it needs to remain open until explicitly stopped.
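
A rough sketch of the usual pattern, hedged because the exact API has shifted between Pants versions: cached Process/PexProcess executions have to exit, so a long-running server is normally launched as an InteractiveProcess from a @goal_rule. The goal and subsystem names below are illustrative, and the Effect import path may differ by version.
# Hedged sketch: serve docs via an InteractiveProcess from a goal rule.
from pants.engine.console import Console
from pants.engine.goal import Goal, GoalSubsystem
from pants.engine.process import InteractiveProcess, InteractiveProcessResult
from pants.engine.rules import Effect, collect_rules, goal_rule


class DocsServeSubsystem(GoalSubsystem):
    name = "docs-serve"  # illustrative goal name
    help = "Run `mkdocs serve` until interrupted."


class DocsServe(Goal):
    subsystem_cls = DocsServeSubsystem
    environment_behavior = Goal.EnvironmentBehavior.LOCAL_ONLY


@goal_rule
async def docs_serve(console: Console) -> DocsServe:
    process = ...  # build the mkdocs Process/PexProcess as you already do
    result = await Effect(
        InteractiveProcessResult,
        InteractiveProcess,
        InteractiveProcess.from_process(process),
    )
    return DocsServe(exit_code=result.exit_code)


def rules():
    return collect_rules()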

happy-psychiatrist-90774
06/05/2025, 7:53 PM

adorable-psychiatrist-59834
06/05/2025, 10:16 PM
@rule
def resolve_build_args(args: DockerBuildArgs, sub: MySubSystem) -> DockerBuildArgs:
    return args.extended(["flag-1=X"])

proud-dentist-22844
08/05/2025, 10:51 PM
PythonToolBase subsystem (in the pants repo), how do you generate the initial lockfile? I tried adding my new subsystem to build_support/bin/generate_builtin_lockfiles.py and then running pants run build-support/bin/generate_builtin_lockfiles.py -- --debug elfdeps (where elfdeps is the option scope of my new subsystem).
But it complains that the lockfile can't be found (I know. That's why I asked to generate it.) Using --keep-sandboxes is not helpful because of the secondary sandbox that gets created by the generate_builtin_lockfiles.py script.
Do I need to copy some other random lockfile and use it as a seed?

fast-school-44220
08/28/2025, 8:32 PM

fast-school-44220
09/02/2025, 5:47 PM
output_snapshot = []
if result.exit_code == 0:
    logger.info(f"Generated {output_file_name} from {source_file_name}")
    output_snapshot = await Get(Snapshot, Digest, result.output_digest)
else:
    logger.error(f"Failed to generate {output_file_name}: {result.stderr}")
    output_snapshot = await Get(Snapshot, Digest, EMPTY_DIGEST)
return GeneratedSources(output_snapshot)

witty-furniture-6665
09/09/2025, 10:30 AM
adhoc_tool to run tach on a code base.
Supposing I have a BUILD file:
pex_binary(
    name="tach",
    entry_point="tach",
    dependencies=[
        "python-default#actually_shared",  # resolves Tach from PyPI
    ],
)
adhoc_tool(
    name="tach_check",
    runnable=":tach",
    args=[
        "check",  # Run `tach check`
        ".",      # On the whole repo (or a subdir if you want)
    ],
    log_output=True,
    output_directories=[],
    execution_dependencies=[
        ":src",       # Source code
        "tach.toml",  # Config file
    ],
    root_output_directory=".",
)
I point to that with check-tach = "run pants-plugins/cre/tach:tach_check".
When I run pants check-tach ::, I get a list of all subdirs and a prompt as follows:
* src/tools/sales/ingest.py
* src/tools/sales/ingest_stages/__init__.py
* src/tools/sales/ingest_stages/r2s.py
* src/tools/sales/ingest_stages/s2m.py
* src/tools/sales/ingest_stages/s2m_test.py:tests
* src/tools/sales/load_data.py
* src/tools/sales/plants/__init__.py
* src/tools/sales/plants/adr.py
* src/tools/sales/plants/adr_test.py:tests
* src/tools/security/review.py
* src/tools/venv:create
Please select one of these targets to run.
I am trying to run tach check .; what am I doing wrong?
Any help would be greatly appreciated on this. Thanks in advance!
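
A hedged observation for comparison: the run goal accepts exactly one runnable target, so once the alias expands to run pants-plugins/cre/tach:tach_check, the trailing :: is treated as extra target arguments and triggers the selection prompt. Running the PEX directly with passthrough arguments would look something like:
pants run pants-plugins/cre/tach:tach -- check .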

average-breakfast-91545
09/09/2025, 5:18 PM
tach check uses a toml file to check the dependencies of declared modules, and make sure that we haven't accidentally imported a gajillion ML libraries into the wrong place. It would be nice to have it work under pants check.

fast-school-44220
09/17/2025, 8:53 PM
.foo files into .bar format. It does dependency inference and everything. Now I've written a second plugin that converts .bar files into .baz files. The .foo files are my primary sources. But when I try to HydrateSources with enable_codegen and for_sources_types set to the .baz field type, none of the rules are invoked, nor are the dependency inference functions. What is the secret to getting it to chain the two rules together?
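
A sketch of why this usually happens, assuming the standard codegen API: each GenerateSourcesRequest declares a single input field type and a single output field type, and hydration only fires requests whose input matches the target's own sources field. There is no automatic multi-hop chaining, so a .foo target asked for .baz output generally needs a request that goes straight from the .foo field to the .baz field, whose rule can itself await the .foo -> .bar generation. Field and request names below are invented:
# Hedged sketch: a single-hop codegen request from primary sources to
# the final output type, registered alongside the existing requests.
from pants.engine.target import GenerateSourcesRequest
from pants.engine.unions import UnionRule


class GenerateBazFromFooRequest(GenerateSourcesRequest):
    input = FooSourceField   # the on-disk primary sources (invented name)
    output = BazSourceField  # what for_sources_types asks for (invented name)


def rules():
    return [
        UnionRule(GenerateSourcesRequest, GenerateBazFromFooRequest),
        # ...plus the @rule that actually performs the .foo -> .baz conversion
    ]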

limited-twilight-83823
09/25/2025, 8:22 PM
pants check to avoid managing Node.js for pure Python projects. https://github.com/jacoblearned/pants-basedpyright
I'd like to add optional support for its baseline feature, which would require writing the output files from a VenvPexProcess back to disk, but it seems there's no way to get a handle to the workspace during CheckRequest executions to write the baseline file safely (for good reason).
Is my main option here to implement a custom @goal_rule that runs separately from pants check, collects output digests from CheckResults, and then writes those to disk?
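
A minimal sketch of that escape hatch, assuming a separate goal is acceptable: only @goal_rules receive a Workspace, so a small goal that merges the collected digests and materializes them is the usual pattern. The goal name and the way the digests are gathered are illustrative.
# Hedged sketch: a dedicated goal that writes previously collected
# output digests (e.g. a basedpyright baseline) back into the repo.
from pants.engine.console import Console
from pants.engine.fs import Digest, MergeDigests, Workspace
from pants.engine.goal import Goal, GoalSubsystem
from pants.engine.rules import Get, collect_rules, goal_rule


class WriteBaselineSubsystem(GoalSubsystem):
    name = "write-baseline"  # illustrative goal name
    help = "Materialize the basedpyright baseline file into the workspace."


class WriteBaseline(Goal):
    subsystem_cls = WriteBaselineSubsystem
    environment_behavior = Goal.EnvironmentBehavior.LOCAL_ONLY


@goal_rule
async def write_baseline(console: Console, workspace: Workspace) -> WriteBaseline:
    baseline_digests: list[Digest] = []  # gather these from your check results
    merged = await Get(Digest, MergeDigests(baseline_digests))
    workspace.write_digest(merged)
    console.print_stdout("Baseline written.")
    return WriteBaseline(exit_code=0)


def rules():
    return collect_rules()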

fast-school-44220
10/06/2025, 2:57 PM

silly-queen-7197
10/17/2025, 6:34 PM