# community-support
v
Not sure what you mean. If you do `include("foo:bar")`, it indeed adds a subproject named `foo` located in `foo`, and a subproject thereof named `bar` located in `foo/bar`.
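The hierarchy described above can be sketched in a settings script (names here are illustrative):

```kotlin
// settings.gradle.kts — sketch of what include("foo:bar") registers
rootProject.name = "sample"

// Registers project ":foo" (directory ./foo) and ":foo:bar" (directory ./foo/bar)
include("foo:bar")
```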
a
currently i'm just recursively hitting the filesystem looking for build.gradle.kts, but that obviously won't scale long term
v
It kind of does 🤷 we're using it with hundreds of modules with no issues, and I think it should be compatible with configuration cache too. There are open source projects using a similar approach as well, e.g. https://github.com/apollographql/apollo-kotlin/blob/main/settings.gradle.kts#L13-L22
a
when i say scale, i mean in the monorepo with thousands to tens of thousands sense. while i realize that i'm unlikely to hit the upper end of that any time soon, i'm hoping to basically have all my apps and all my other projects eventually migrated all under one repo, and hoping not to have to deal with a build system migration. also considering a hybrid gradle + bazel setup (sadly, bazel is all but completely broken for android, otherwise i'd have stuck with bazel). mostly trying to avoid needing to manage a long list of transitive modules at the root of the repo, since it's also not even clear to me why gradle needs to know about any children as long as they follow the standard module-name = path convention so modules can be resolved when something depends on them (i realize without that convention you'd need to manually configure module to path mappings). i'm basically just trying to define repo-wide configs in settings.gradle and then have each app manage itself relatively self contained, while still being able to invoke gradle from the repo root.
if this was for one project, i'd agree that there's no reason to worry about scale, and even at the numbers i'm talking about, on a decent machine it probably only makes a few seconds difference once filesystem caches are set up, so mostly this is about trying to avoid settings.gradle needing to know about every single module in the repo on principle
v
Sounds like you should not have one mega-build that `include`s all projects, but have multiple standalone builds that define their projects, and then have a composite build that includes the whole builds to be able to trigger the individual projects from the root. This way you just manage the respective subprojects on the individual project level.
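A minimal sketch of that composite-build layout, assuming each app directory is a standalone build with its own settings script (all paths hypothetical):

```kotlin
// root settings.gradle.kts — composes standalone builds instead of include()ing projects
rootProject.name = "monorepo"

// Each of these directories has its own settings.gradle.kts managing its own subprojects
includeBuild("apps/app-one")
includeBuild("apps/app-two")
includeBuild("libs/shared")
```

Tasks in the included builds can then be triggered from the repo root, e.g. `gradle :app-one:assemble` (exact task addressing depends on the included build's project structure).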
a
thanks, so i did see that subproject thing, and those square links were like a retelling of my weekend trying to get bazel to work as well as blaze for android (fool's errand, i know), giving up, and then just trying to make gradle work. just did a lot of the cleanups listed there that i hadn't done already. i still would prefer to have one mega project though, and even if it's unsupported, i'd be curious to know why gradle needs to include all modules upfront instead of just loading them on-demand when they're included as a dependency or some other dep. maybe i'm doing something wrong with subprojects though. i'd still need to include the subprojects manually (which i guess i'm willing to do because they will at most be in the low hundreds), but then i end up with two or three final issues:
• each of those projects still needs to manage its include()s
• each of those projects needs to manage its top level configuration (e.g. maven repos)
• each of those projects needs to includeBuild all the other projects they depend on at the top level
i'm trying to go for that monorepo experience where things are configured globally once, and then all code can depend on all other code anywhere in the repo by just using dependencies() and without needing to update any other shared configs. it sounds like square solved the project init thing by just pre-aggregating and loading a cached set of includes, which might be the way to go in my case long term too
v
If you mean by that, that you want to use `allprojects { ... }` or `subprojects { ... }`, that's a big no-go anyway. This is highly discouraged. It immediately introduces project coupling, which disturbs more sophisticated Gradle features. Instead you should use convention plugins that you then apply directly where you want their effect. This way you can also share build logic in those separate builds easily and do not have to repeat repository configuration and so on. Of course you can do a mega-project if you want, I just advise you not to do it. A big composite build will most probably work very similarly in the end, but it depends on the details of course. If you prefer to do it in one build, do so. But it will most probably also affect performance, because then all projects need to be configured, whereas with a decent composite build structure only the builds actually used are configured.
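A convention plugin replacing an `allprojects { ... }` block might look like this (a sketch; the plugin name and settings are hypothetical):

```kotlin
// build-logic/src/main/kotlin/my.java-conventions.gradle.kts
// A precompiled script convention plugin, applied per-project instead of allprojects { ... }
plugins {
    java
}

java {
    toolchain {
        // Shared toolchain config, written once here instead of in every build script
        languageVersion.set(JavaLanguageVersion.of(17))
    }
}
```

Each subproject then opts in explicitly with `plugins { id("my.java-conventions") }` in its own build script.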
> even if it's unsupported, i'd be curious to know why gradle needs to include all modules upfront instead of just loading them on-demand when they're included as a dependency or some other dep.
Because that's how it works. The settings script defines which projects are part of the current build. The configuration phase then configures these projects and the execution phase executes them. The configuration phase, where dependencies are declared, is much too late to add new projects. Theoretically you could even declare dependencies at execution time, even though it is highly discouraged. Feel free to post a feature request for such a behavior, but I highly doubt you will get positive feedback, but well, who knows. 🙂 If you do not want to declare all projects manually, you can of course do some file-tree walking logic to find all projects that have a build script and only consider those projects (a build script does not have to be present, even though it is good practice). Or you could consider all folders in a specific subfolder as projects. Or any similar logic. Just remember that this logic needs to run on every build (except when reusing a configuration cache entry), so it should be fast.
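The file-tree walking idea can be sketched like this in a settings script (assuming the `build.gradle.kts` naming convention; the skip patterns are illustrative):

```kotlin
// settings.gradle.kts — discover projects by looking for build scripts
import java.io.File

rootDir.walkTopDown()
    .onEnter { it.name != "build" && !it.name.startsWith(".") } // skip outputs and hidden dirs
    .filter { it != rootDir && it.isDirectory && File(it, "build.gradle.kts").isFile }
    .forEach { dir ->
        // Turn a relative path like "apps/foo" into a project path like ":apps:foo"
        include(":" + dir.relativeTo(rootDir).path.replace(File.separatorChar, ':'))
    }
```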
a
thanks! so i used convention plugins to get rid of 90% of my allprojects (though i suspect it slowed down my builds :(), there's only one line left to move the build outputs to /tmp (i guess technically i could probably also move this to a CV, though i'm also occasionally seeing local .gradle directories being created, so something isn't observing that directory). the configuration i'm mostly talking about now is the repositories and plugin repositories from settings.gradle (not sure i can put those into a plugin since they're used to load plugins?) and plugin versions (pluginManager.apply() doesn't expose a way to set the version for a plugin..). the all projects needing to be configured bit is what this thread is about. i don't see a reason for gradle to preemptively configure or know about projects/modules it doesn't need to build/run. it sounds like "that's just the way gradle was built to work" but i'm still not seeing the root "why". i do have one to add dependencies since there's a certain library that needs like 6-9 deps. is there a way to create dependency groups so i can switch that to just a single dep? i'm currently doing some pruned file-tree walking to do include(). curious though how a build script could be avoided. i need to somehow build these things right? i am using the configuration cache, but i think maybe due to convention plugins the logic runs anyway, but didn't explicitly check. basically after switching over to CVs, the configuration step even for a tiny repo increased by a few seconds, and i think the square blogs implied it would happen as such unless you used artifactory/prebuilt CVs
v
> the configuration i'm mostly talking about now is the repositories and plugin repositories from settings.gradle (not sure i can put those into a plugin since they're used to load plugins?)

If you `includeBuild` the build that builds the settings plugin within `pluginManagement { ... }` and not top-level, that build can also contribute settings plugins that can then be applied within the same settings script, yes.
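For example (IDs and paths hypothetical), the including build's settings script might look like:

```kotlin
// settings.gradle.kts of the consuming build
pluginManagement {
    // Included *inside* pluginManagement so it can contribute settings plugins
    includeBuild("build-logic")
}

plugins {
    // A settings plugin produced by build-logic
    id("my.repositories-conventions")
}
```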
> and plugin versions (pluginManager.apply() doesn't expose a way to set the version for a plugin..).

If you use `pluginManager.apply()`, then you are probably doing this in your convention plugin? If so, the plugin should be declared as a dependency of the build building the plugin, and that is where the version is defined.
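Concretely, the version pinning would live in the plugin build's own build script, for example (coordinates and version are illustrative):

```kotlin
// build-logic/build.gradle.kts
plugins {
    `kotlin-dsl`
}

dependencies {
    // Pinning the version here lets convention plugins apply the plugin by ID, version-free
    implementation("org.jetbrains.kotlin:kotlin-gradle-plugin:1.9.24")
}
```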
> it sounds like "that's just the way gradle was built to work" but still not seeing the root "why"

Well, if you want reasoning, you probably have to ask the Gradle folks. This is a user community where mainly users like me help other users like you. 🙂 And I could only guess about most of their design decisions.
> i do have one to add dependencies since there's a certain library that needs like 6-9 deps. is there a way to create dependency groups so i can switch that to just a single dep?

Not fully sure what you mean. You can have a convention plugin that adds these dependencies and then apply that convention plugin, if that is what you meant. Or if you use version catalogs, you can define bundles in the version catalog that include multiple other libraries, so that you can depend on all of them at once using the bundle.
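A version catalog bundle for that case might look like this (Hilt coordinates shown as an example; the version is illustrative):

```toml
# gradle/libs.versions.toml
[versions]
hilt = "2.51.1"

[libraries]
hilt-android = { module = "com.google.dagger:hilt-android", version.ref = "hilt" }
hilt-compiler = { module = "com.google.dagger:hilt-compiler", version.ref = "hilt" }

[bundles]
hilt = ["hilt-android", "hilt-compiler"]
```

Then a single `implementation(libs.bundles.hilt)` pulls in all of them.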
> curious though how a build script could be avoided.

You shouldn't. Back in the "good" old days, when it was common practice to use `allprojects { ... }`, `subprojects { ... }`, and `project(...) { ... }`, you could have all the configuration in, for example, the root project and just leave out the build scripts of the subprojects. But nowadays this is bad practice as you know, and the build scripts should be present and apply the according convention plugin(s).
> i am using the configuration cache, but i think maybe due to convention plugins the logic runs anyway, but didn't explicitly check.

If you have a configuration cache hit, all logic up to the execution phase is skipped, including init scripts and settings scripts, and also including building the convention plugins from included builds. So if anything of that happens, you did not have a CC hit, or there is probably some bug. The only thing that still happens with a CC hit is the evaluation of `ValueSource` values, as there this is explicitly the wanted behavior: they always run, and if what they return is different, the CC entry is not reused but configuration happens again, including evaluating the `ValueSource` a second time.
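A minimal `ValueSource` sketch (the class and its logic are hypothetical; the Gradle API types are real):

```kotlin
import java.io.File
import org.gradle.api.provider.ValueSource
import org.gradle.api.provider.ValueSourceParameters

// Re-evaluated on every build, even with a configuration cache hit;
// a changed result invalidates the CC entry
abstract class GitHeadValueSource : ValueSource<String, ValueSourceParameters.None> {
    override fun obtain(): String =
        File(".git/HEAD").takeIf { it.exists() }?.readText()?.trim() ?: "unknown"
}
```

In a build script the value would be obtained via `providers.of(GitHeadValueSource::class) {}`.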
> basically after switching over to CVs, the configuration step even for a tiny repo increased by a few seconds, and i think the square blogs implied it would happen as such unless you used artifactory/prebuilt CVs

Depends on many factors. When CC is reused, for example, then not. If CC is not reused, it depends on whether `buildSrc` is used or an included build. And if an included build, whether it is all in one project or separated into multiple projects or included builds. Of course, if you build the convention plugin through `buildSrc` or an included build and CC is not reused, it has to check whether the build of the convention plugin is up to date or needs to rebuild, but usually this should not need several seconds; it heavily depends on the machine you are building with and the complexity of that build, though. Using something pre-built if possible is of course always faster than building ad-hoc. That is a trade-off between time and flexibility, as with pre-built you of course have to actually pre-build and publish before you can use it.
a
1. cool, i'll try it out
2. you mean using the old style `plugins { id(foo) version far apply false }` syntax? i hoped to avoid pre-declaring all versions there
3. thanks, forgot about bundles in the VC. that plugin approach is what i am doing since i also need to declare a plugin with those deps (this is for hilt if you're curious)
4. hopefully gradle profiler can help me figure out what is going on. or is there a better tool? i was getting CC hits iirc
5. not using buildSrc since i've heard conflicting comments on whether it is good practice or not
6. for context, i'm running this on WSL/HyperV (so trivial perf hit over raw metal, but limited to 12 cpu threads instead of the full 24 and 20GB RAM instead of 32; swap/file i/o isn't an issue and all the files are cached by linux afaik) on a machine just one tier down from top-of-the-line consumer hardware from < 1 year ago (i wasn't going to pay the markup on that i9-13900kf 😂), so it shouldn't really be an issue. but in terms of absolute numbers, things went from 600ms to 4-6s on a basic app. hopefully the profiler tells me what's up
funnily enough gradle profiler hits this bug: https://github.com/gradle/gradle/issues/18386
so, looks like it's something funky with wsl/ubuntu or something. a cold reset and i'm back to even better speeds with all the gradle.properties tuned for performance.
do you have an example of how an included build can contribute plugin versions and repository configs? the ktdoc says it can but my naive attempt failed to make it work
v
> you mean using the old style plugins { id(foo) version far apply false } syntax? i hoped to avoid pre-declaring all versions there

No, I did not. I assumed you use that statement you mentioned in one of your convention plugins? There you never will use any version, no matter how you apply the plugin. Instead you declare it as a dependency of the plugin, so for example as `implementation(...)` in `gradle/build-logic/build.gradle.kts` or wherever you build it.
> hopefully gradle profiler can help me figure out what is going on. or is there a better tool?

None I'm aware of.
> not using buildSrc since i've heard conflicting comments on whether it is good practice or not

It is mentioned everywhere in the docs, so I'd not particularly call it bad practice. I personally prefer a normal included build though, for various reasons. But for small builds it probably makes not much difference. For really complex builds it can result in better performance to use an included build if done properly. Or of course published convention plugins, as mentioned above.
> do you have an example of how an included build can contribute plugin versions

Assuming the included build builds a plugin that you use, simply declare them as `implementation` or `runtimeOnly` dependencies.
> and repository configs?

Well, have a settings plugin built by the included build that does the config, include the build within `pluginManagement { ... }`, and then apply that settings plugin. This will not work for other settings plugins as that would be too late, but for project plugins it should work. But no, I don't have an example at hand.
> the ktdoc says it can but my naive attempt failed to make it work

Feel free to throw together an MCVE of what you tried, then maybe it becomes obvious what you need to change.
a
yeah, at this point basically everything i ask about is in convention plugins. i mostly just tried putting the includeBuild at different levels, and then in the settings.gradle of the included build putting repository blocks at different levels. i guess from what you said, the thing i was doing wrong was i needed to also call apply from 'settings.gradle' or similar. i'll try that tonight. i think at this point you've basically covered all my issues with gradle. thanks again for all your help!
v
Yeah, what you do in the settings script of the included build is irrelevant. You need to have in `src/main/.....` a settings plugin that does the configuration and that you then apply in the including build's settings script. The settings script of the included build is only relevant for building the included build and has no meaning for the including build.
a
wait, so do i need a gradle plugin (`fooplugin : Plugin<Project>`), or a .kts script that i include into the main project from the included build?
v
`Plugin<Settings>` or `whatever.settings.gradle.kts`, yes
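A sketch of such a precompiled settings plugin (the file name, plugin ID, and repositories are illustrative):

```kotlin
// build-logic/src/main/kotlin/my.repositories-conventions.settings.gradle.kts
// The ".settings.gradle.kts" suffix makes this a Plugin<Settings>
pluginManagement {
    repositories {
        gradlePluginPortal()
        google()
        mavenCentral()
    }
}

dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
    }
}
```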
a
so close, but no cigar: as soon as i do `plugins { id("my_plugin_id") }` at the root of my settings.gradle.kts, it fails with:
```
Unable to load class 'com.android.build.api.dsl.CommonExtension'
com.android.build.api.dsl.CommonExtension

Gradle's dependency cache may be corrupt (this sometimes occurs after a network connection timeout.)
Re-download dependencies and sync project (requires network)

The state of a Gradle build process (daemon) may be corrupt. Stopping all Gradle daemons may solve this problem.
Stop Gradle build processes (requires restart)

Your project may be using a third-party plugin which is not compatible with the other plugins in the project or the version of Gradle requested by the project.
In the case of corrupt Gradle processes, you can also try closing the IDE and then killing all Java processes.
```
removing that plugins block fixes it, and restarting the daemon doesn't solve it either, so i doubt it's actual corruption. the plugin so far is just a no-op. if i move the plugin apply to the pluginManagement, that error goes away and it syncs, but then the repositories aren't applied from the convention plugin
ok, so in the latter it doesn't even run, so back to the top level: i've confirmed the plugin is applied, but then i get classloader issues in my other project plugins
getting closer now: removed plugin versions/applies from build.gradle.kts, and now just trying to add dependencies from the other convention plugins fails
the plugin thing didn't work either. if i switch to runtimeOnly the convention plugins can't see the types. if i switch to implementation i get classloader issues, so i just keep them as compileOnly for now, but that means i need to leave the root build.gradle plugins block as-is to keep version info. on top of that, i also realized i can't declare convention plugins in the version catalog because they have no versions, so sadly none of this ended up working.
i also noticed that not even nowinandroid does any of this stuff, and i'm at the point where i've basically done as much as they did, if not more, in terms of cleanup, so maybe it's not actually possible to do this final deduping of central logic
v
> if i move the plugin apply to the pluginManagement that error goes away and it syncs

Of course. `pluginManagement { plugins { ... } }` does not apply any plugin anywhere and also does not add it to any classpath / class loader. Its purpose is to centrally define plugin versions to use if you apply plugins without a version at other places in the build, to centralize the version definitions for plugins. The predecessor of version catalogs, so to say, as far as plugin versions are concerned.

Without seeing the build, it is hard to say what exactly is the class loader problem. But how this typically happens is, if you have plugin A that depends on plugin B and uses its types, but only using `compileOnly` and reacting to the plugin being applied using `pluginManager.withPlugin`. Then you add A to the root project and B to a subproject classpath, for example by simply applying it, and also apply A in the subproject. This then results in A seeing B being applied, as `pluginManager.withPlugin` works with `String` IDs. But when the classes of A that are on the root project class loader try to access the classes of B which are only on the subproject class loader, it cannot find them.

The settings script classpath is in a parent class loader of the root project class loader, so here the same can happen. You add A to the settings class loader by applying the plugin that has it as a dependency. B is only added to the root project class loader or a subproject class loader, and then the classes in A cannot find those.

The typical mitigation is to make sure those plugins are in the same class loader, so in your case you could for example declare B as a `runtimeOnly` dependency of your plugin, so that its classes also end up on the settings class loader and thus can be found by A. As you do not actually need A in your settings plugin, another and maybe cleaner option would be to split your plugin build into two projects: one that builds the settings plugin and does not have a dependency on A, and one that builds your project plugins where you need the dependency on A. Then the settings plugin will not add A to the settings class loader and you should not run into that problem in the first place.
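The split suggested above could look like this (names hypothetical):

```kotlin
// build-logic/settings.gradle.kts — two plugin projects with separate classpaths
rootProject.name = "build-logic"

include("settings-conventions") // builds the Plugin<Settings>, no AGP dependency
include("project-conventions")  // builds the project plugins that depend on AGP
```

Only the settings plugin from `settings-conventions` gets applied in the consuming settings script, so the AGP classes never land on the settings class loader.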
> on top of that, i also realize i can't declare convention plugins in the version catalog because they have no versions

Sure you can, just use any version. I tend to use `?` as the version, but it is quite irrelevant what you use. As the plugin is coming from an included build, the version is ignored anyway.
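For example (alias and plugin ID hypothetical):

```toml
# gradle/libs.versions.toml
[plugins]
my-conventions = { id = "my.java-conventions", version = "?" } # ignored for included builds
```

applied with `alias(libs.plugins.my.conventions)` inside a build script's `plugins { }` block.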