# community-support
s
Isn't it considered best practice by now to not explicitly use
dependsOn()
for task dependencies, but to just consume the output of task dependencies as input in another task? I believe that's what I'm doing with
Copy code
tasks.register<GeneratePluginDocsTask>("generatePluginDocs") {
    val kspKotlinTasks = getTasksByName("kspKotlin", /* recursive = */ true)
    val outputFiles = kspKotlinTasks.flatMap { it.outputs.files }
    inputFiles = files(outputFiles).asFileTree.matching { include("**/META-INF/plugin/*.json") }
}
but that code is not executing the
kspKotlin
tasks whose output I'm consuming... 🤷‍♂️
v
Isn't it considered best practice by now to not explicitly use
dependsOn()
for task dependencies
Unless the left-hand side is a lifecycle task, correct. In other cases, task output should be wired to task inputs. But you do not properly wire task output to task inputs. To start with,
getTasksByName("kspKotlin", /* recursive = */ true)
is a particularly bad idea, for several reasons.
• It gets tasks from other projects, so while it is not cross-project configuration, you still reach into the model of other projects to get the task instances, which is likewise bad and also falls under the "Don't reference other project tasks directly!" warning of https://docs.gradle.org/current/userguide/cross_project_publications.html#considerations_and_possible_solutions - an unsafe way to share task outputs cross-project.
• As it returns a plain Set<Task>, it also breaks task-configuration avoidance for those tasks.
• And it only works with tasks that are already registered at the point where you call the method, as it is not a live collection like a TaskContainer.
Those tasks you then flatMap (the Set one, not the Provider one) to it.outputs.files, which breaks the task dependencies that would otherwise have been there. So, to summarize, you should switch to properly sharing build output cross-project as documented at the above link; if you do it properly, you should also get the expected task dependencies.
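If the kspKotlin task were in the same project, that kind of wiring could look roughly like this (a sketch reusing GeneratePluginDocsTask and inputFiles from the snippet above; files() on a task provider resolves to that task's outputs and carries the task dependency):
tasks.register<GeneratePluginDocsTask>("generatePluginDocs") {
    // Wiring the task *provider* (not a resolved Set<Task>) keeps both
    // configuration avoidance and the implicit dependency on kspKotlin.
    inputFiles = files(tasks.named("kspKotlin"))
        .asFileTree.matching { include("**/META-INF/plugin/*.json") }
}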
s
Ugh... thanks for this analysis. So it means there's no way around creating a custom configuration just to share task output across projects?
v
No clean one, no
s
Oh man. Gradle used to be great once...
v
At least a custom configuration, or a custom variant
It still is great 🙂 It just tries to be reliable
s
But the API for that sucks 😉
v
You can of course do such things and you can also preserve the task dependencies, you just need to do it right.
But it would still be "unsafe sharing"
But hey, it is your build, if you want to do it that way, fix it and then live with the consequences 😄
s
Right. My point is, Gradle nowadays makes it too easy to shoot yourself in the foot.
Violating best practices should be impossible or hard to do. Right now it's the opposite: the (to me) most straightforward code violates best practices.
v
Yeah, well, backwards compatibility has a price 😄
The foot-shooting is the same as years ago; it just was not as clear and well known that you were shooting your foot 😄
s
A bit off-topic, but since you pointed out the shortcomings of
getTasksByName()
in general, is it OK to use it when not depending on task output? Like this use case:
Copy code
// Gradle's "dependencies" task selector only executes on a single / the current project [1]. However, sometimes viewing
// all dependencies at once is beneficial, e.g. for debugging version conflict resolution.
// [1]: https://docs.gradle.org/current/userguide/viewing_debugging_dependencies.html#sec:listing_dependencies
tasks.register("allDependencies") {
    group = "Help"
    description = "Displays all dependencies declared in all projects."

    val dependenciesTasks = getTasksByName("dependencies", /* recursive = */ true)
    dependsOn(dependenciesTasks)

    // Ensure deterministic output by requiring to run tasks after each other in always the same order.
    dependenciesTasks.sorted().zipWithNext().forEach { (a, b) ->
        b.mustRunAfter(a)
    }
}
v
I'm currently also fighting ugly bad-practice code I wrote back in the 1.0-milestone-7 days that prevents our main project from upgrading to Gradle 8+ 😄
😱 1
getTasksByName()
is practically always a bad idea.
• If you use true as the second argument, you always reach into the subprojects' models (unless there are no subprojects).
• If you use getTasksByName() at all, you always only get the tasks already registered.
• If you use getTasksByName(), you always break task-configuration avoidance.
For things like
dependencies
where you know it is present in all projects, you can use a different way though, one minute
Copy code
allprojects.map { "${it.path}:dependencies" }.forEach {
    dependsOn(it)
}
allprojects { ... }
and
subprojects { ... }
are bad, as is getting things from their model.
But in the snippet I showed, you just use the project path and construct a task path string on which you depend.
That is OK, as it neither configures another project nor reaches into the model of the other project.
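Applied to the allDependencies task from above, that could look roughly like this (a sketch; the root project is special-cased on the assumption that ":" plus ":dependencies" would otherwise yield an odd "::dependencies" path, and the mustRunAfter ordering is omitted because it would again require the task instances of the other projects):
tasks.register("allDependencies") {
    group = "Help"
    description = "Displays all dependencies declared in all projects."

    // Only task path strings are used here, so no other project is
    // configured and no other project's model is reached into.
    val dependenciesTaskPaths = allprojects.map { p ->
        if (p == rootProject) ":dependencies" else "${p.path}:dependencies"
    }
    dependsOn(dependenciesTaskPaths)
}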
s
Hmm, but still, I just banned
allprojects
and
subprojects
from our project. I'm not going to introduce it again now 😅
v
As I said,
allprojects { ... }
/
subprojects { ... }
are different from using the information directly available from
allprojects
/
subprojects
🙂
s
I understand the semantic difference. But I really just banned "allprojects" / "subprojects" as strings from our build files, as that's an easier check than examining each use case to see whether it's acceptable.
BTW, I guess https://docs.gradle.org/current/samples/sample_cross_project_output_sharing.html is a more concrete example for following the best practice.
v
That should be the sample for the above-linked documentation, yes. It also links to the documentation page I linked you to.
s
Unfortunately, those examples do not show how to do it if you only know the (third-party) task name and there are multiple producer tasks whose project names you do not know.
v
Depends on the concrete details and situation. If you, for example, have a convention plugin that you apply to all projects, create the outgoing configuration in all projects, and use a
pluginManager.withPlugin(...) { ... }
to react to the 3rd-party plugin being applied and then add that task as an artifact to the configuration; on the consumer side, just consume from all projects, for example.
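A rough sketch of what such a convention plugin could contain (the plugin id com.google.devtools.ksp, the configuration name kspPluginDocs, and the KSP output directory are assumptions, not verified details):
// Applied to every project via a convention plugin (sketch only).
val kspPluginDocs by configurations.creating {
    // Producer-only configuration: consumable by other projects, never resolved here.
    isCanBeConsumed = true
    isCanBeResolved = false
}

// React to the third-party plugin instead of assuming it is applied everywhere.
pluginManager.withPlugin("com.google.devtools.ksp") {
    artifacts {
        // The exact output directory of kspKotlin is an assumption;
        // builtBy() wires the task dependency for consumers.
        add(kspPluginDocs.name, layout.buildDirectory.dir("generated/ksp/main/resources")) {
            builtBy(tasks.named("kspKotlin"))
        }
    }
}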
s
Would
Copy code
val kspProducerConfiguration by configurations.creating {
    isCanBeResolved = false
}

tasks.named("kspKotlin") {
    artifacts {
        add(kspProducerConfiguration.name, this)
    }
}
also work for the producer side?
p
You should not add an artifact inside a task configuration, because if you add an artifact, the resulting task will be requested to be configured.
And you should use the scoped configurations if possible.
v
A bit the other way around. Like that, it would only be added as an artifact if the task is actually going to be configured, which is bad, so yes, it shouldn't be done from the task configuration action, just for a different reason 😄
Probably more something like
Copy code
artifacts {
    add(kspProducerConfiguration.name, tasks.named("kspKotlin"))
}
s
Thanks. Do you also have any hints on how the consumer side should look if the project / path is unknown and you don't want to refer to
allprojects
in
Copy code
dependencies {
    sharedConfiguration(project(path = ":producer", configuration = "sharedConfiguration"))
}
p
Well, ideally the project path of the provider is known. At a glance, the resolution of project dependencies and external dependencies is the same. But you could depend on all projects and use an artifact view to filter out failed dependencies, though I think this requires attributes so it does not fail hard on a requested configuration that does not exist. Never tried it without attributes. Also, ideally you should use attributes instead of requesting configurations. Yes, it is complex.
😞 1
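The lenient artifact view part of that could look roughly like this (a sketch assuming a resolvable configuration named sharedConfiguration as in the consumer snippet above; whether it avoids the hard failure without attributes is exactly the open question):
val sharedFiles = configurations["sharedConfiguration"].incoming.artifactView {
    // Ignore artifacts that cannot be resolved instead of failing the build.
    lenient(true)
}.files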
v
For just sharing within one build, using explicit configurations is usually good enough if you do not need more complex scenarios. But to Sebastian's question: if you don't want to use
allprojects
but depend on all projects, you have to list them manually. 🙂
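Listing them manually on the consumer side might then look roughly like this (a sketch; the project paths :plugin-a and :plugin-b and the configuration name kspPluginDocs are assumptions matching the producer sketch further up):
// Consumer-side, resolvable counterpart of the producer configuration (sketch only).
val kspPluginDocsFiles by configurations.creating {
    isCanBeConsumed = false
    isCanBeResolved = true
}

dependencies {
    // Producer projects listed explicitly instead of iterating over allprojects.
    listOf(":plugin-a", ":plugin-b").forEach {
        add(kspPluginDocsFiles.name, project(path = it, configuration = "kspPluginDocs"))
    }
}

tasks.register<GeneratePluginDocsTask>("generatePluginDocs") {
    // The configuration carries the producer task dependencies, so the
    // producing tasks are executed when this task runs.
    inputFiles = kspPluginDocsFiles.asFileTree.matching { include("**/META-INF/plugin/*.json") }
}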
p
Yeah, but don't publish an artifact that depends on a requested configuration :)
v
Yeah, better not. But usually that is not the use case for this; the use case is to process some output of other projects, like in this case generating some documentation from the gathered files. For that it should be fine or even preferable. Otherwise the variant is also published (unless you then disable that).