# dependency-management
c
a/build.gradle
```kotlin
dependencies {
  val projectB = project(":B")
  if (??? == "server") add("server", projectB)
}
```
b/build.gradle
```kotlin
??? = "server"
```
c/build.gradle
```kotlin
??? = "client"
```
we had it done using files in the projects and reading them from the consumer... but I would like to skip accessing consumed projects
v
What should the `???` be in `a`?
c
that's my question
I mean, I wanted it to be something like `projectB.attributes`
let me rephrase the question maybe... is it possible to expose any metadata from one project to its consumers?
(note: `projectB` here is a `ProjectDependency`, not a `Project`)
meh... I think what I'm asking is not possible; to expose anything, it would need to be resolved first
v
yep, think so too
Well, it is within one build, so you could access it, for example by having some extension (or in the worst case an extra property) on `b` that you could check.
Or, depending on the usage of the `server` configuration in `a`, you could use attributes or the higher-level feature variants.
But that would then require that you depend on the feature variant or attribute of both `b` and `c` and resolve it leniently, so that it is not an error if the dependency cannot be resolved.
c
actually that's kind of the idea... I want to create buckets of dependencies with "features", and then later I'll use them to compose the classpath for another task
something like... task..... configure { classpath = serverConfiguration + clientBConfiguration }
but I didn't want to create stub empty variants
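Roughly, that composition could look like the sketch below; the configuration and task names are invented for illustration, not taken from the actual build:
```kotlin
// made-up configuration and task names, just to illustrate composing a classpath
val serverConfiguration by configurations.creating
val clientBConfiguration by configurations.creating

tasks.register<JavaExec>("runComposed") {
    // configurations are FileCollections, so they can be summed directly
    classpath = serverConfiguration + clientBConfiguration
    mainClass.set("com.example.MainKt") // hypothetical entry point
}
```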
v
`serverConfiguration.resolvedConfiguration.lenientConfiguration`?
Or a lenient artifact view?
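For reference, the two lenient options could look something like this in the consumer; the `server` configuration name is just an assumption, and in real code you would wire these into a task input rather than resolving at configuration time:
```kotlin
// assumed: a resolvable configuration named "server" in the consumer project
val server = configurations["server"]

// older API: lenient resolution of the whole configuration
val lenientFiles = server.resolvedConfiguration.lenientConfiguration.files

// newer API: a lenient artifact view; artifacts that cannot be resolved are skipped
val viewFiles = server.incoming.artifactView { lenient(true) }.files
```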
c
I didn't know that existed
I'll try to rethink it to use that, thanks a lot! (even if at the end it doesn't solve my problem I'd have learnt something new and useful)
v
Another idea might be a shared build service that you fill with data from your settings script that could somehow determine the types, and then `a` can use the information of that build service.
c
Scary, I don't know if it would be "right", but it sounds scary anyway
v
Nah, not too scary, just a bit. 😄 You could maybe also do this type determination in the settings script and set it as an extension on `a`; no need for a build service actually if only `a` needs the information. You could probably even do the dependency adding from the settings script, but that would imho be out of its scope.
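A minimal sketch of that settings-script idea, assuming made-up project paths and a made-up `producerRoles` property name:
```kotlin
// settings.gradle.kts (sketch only; paths and the roles map are assumptions)
gradle.beforeProject {
    if (path == ":a") {
        // hand :a the role of each producer so it never has to reach into :b or :c
        extensions.extraProperties.set(
            "producerRoles",
            mapOf(":b" to "server", ":c" to "client")
        )
    }
}
```
Project `a` could then read the map back from its extra properties instead of reaching into the other projects.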
c
as I'm using a `sync` task instead of the classpath directly (I didn't mention it because I wanted to drop it), I've found another approach...
```kotlin
// inside the Sync task configuration; JarFile is java.util.jar.JarFile
into(it.name) {
    eachFile {
        // inspect each copied jar during the execution phase
        JarFile(file).use { jar ->
            if (...) {    // condition elided: checks the jar for the resource being looked for
                exclude() // do not copy this file
            }
        }
    }
    from(it - (configurations["runtimeClasspath"] + jarFiles.get()))
}
```
this way I can check the content of the jar for the resource I need to read, and it's done during the execution phase
not as nice as "not copying files" but it does work
v
Checking a file in the jar pushes another dirty idea to my mind. 😄 You could probably use an artifact transform that looks in the jar and, if the wanted file is there, uses it unchanged, and otherwise transforms it to an empty directory or similar.
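A rough sketch of what such a transform could look like; the class name and the `META-INF/feature.properties` marker path are assumptions, and this is a sketch rather than a production-ready implementation:
```kotlin
import java.util.jar.JarFile
import org.gradle.api.artifacts.transform.InputArtifact
import org.gradle.api.artifacts.transform.TransformAction
import org.gradle.api.artifacts.transform.TransformOutputs
import org.gradle.api.artifacts.transform.TransformParameters
import org.gradle.api.file.FileSystemLocation
import org.gradle.api.provider.Provider

// keeps a jar only if it contains a (hypothetical) marker file; otherwise it
// registers no output, so the artifact disappears from the resolved view
abstract class KeepIfMarkerPresent : TransformAction<TransformParameters.None> {

    @get:InputArtifact
    abstract val inputArtifact: Provider<FileSystemLocation>

    override fun transform(outputs: TransformOutputs) {
        val jar = inputArtifact.get().asFile
        val hasMarker = JarFile(jar).use { it.getJarEntry("META-INF/feature.properties") != null }
        if (hasMarker) {
            outputs.file(inputArtifact) // pass the jar through unchanged
        }
        // else: no output registered for this artifact
    }
}
```
It would still need to be registered with `dependencies.registerTransform(...)` between two artifact-type attribute values, which is left out here.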
c
transform... will it run in the execution phase? I expected it to happen during configuration to calculate the inputs of the task
v
They run on resolution. So if you do not resolve at configuration time, then at execution time.
But either way, if a configuration is resolved, the jars are or will be built, so the transformation will be able to look into the built jar.
c
my approach:
• requires building all artifacts to read the jar content
• duplicates files on disk
• ✅ runs in the execution phase
transform:
• requires building all artifacts
• ¿✅? doesn't duplicate... but creates the empty jars
• ✅ runs in the execution phase
the approach I saw in our current code does... read the file with the attribute at configuration time, but it saves the time of compiling unnecessary artifacts (and I would say... it's accessing consumed project files from the consumer, which is bad)
v
I don't think you need to create empty jars
Either have no output, or if it needs one, an empty directory
c
I think I'll go with `sync` because it's working already, and add a task to replace it with `transform` because it will save some time
it would be easier to do `project(path).dependencyProject.blablabla` but I think that will eventually break with CC or later with project isolation
(maybe even now it doesn't work with CC, haven't tried)
v
CC should probably work as you do it at configuration time. Project isolation will probably break it, yeah.
Using feature variants or attributes and lenient configuration or artifact view would at least work too with the sync approach, not needing to build unnecessary jars
c
the current use case is a blacklist, I should change it to a whitelist to use lenient resolution I think; also it seems to require a lot more changes, and so far I'm trying to replace a ton of complex gradle scripts with a plugin that is at least "correct" in terms of Gradle usage
the current code uses a Sync task; replacing it with a classpath is an optimization (not copying files), and not producing extra jars is desired (a performance decrease in my approach), but above all I need correctness
Adding a task in my backlog to explore those alternatives
JFYI, finally using `compatibilityRules` was the best solution, even if it required a little more work. Thanks again for the ideas!
v
Interesting, mind elaborating?
c
custom attribute, setting the list of values in the producer, and then filtering in the consumer using compatibilityRules (I made both a rule for included and one for excluded values, but I'm using just one), and then consuming it with:
`it.incoming.artifactView { lenient(true) }.files`
I'm not building the artifacts that wouldn't be used, and the solution seems clean and explicit
the only overhead I didn't like is creating a plugin for the producers to add the attribute to the schema, but I needed it to set the values in the gradle file (instead of using the resource file as before)
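For readers landing here, the pattern described above could look roughly like the sketch below; the attribute name, values, configuration names, and the rule are illustrative assumptions, not the actual code:
```kotlin
import org.gradle.api.attributes.Attribute
import org.gradle.api.attributes.AttributeCompatibilityRule
import org.gradle.api.attributes.CompatibilityCheckDetails

// shared custom attribute (registered by a small plugin on both sides)
val features = Attribute.of("com.example.features", String::class.java)

// producer side: declare the feature list on the outgoing variant
configurations.named("runtimeElements") {
    attributes.attribute(features, "one,two")
}

// consumer side: a rule that accepts a producer only if it lists the requested feature
class FeatureRule : AttributeCompatibilityRule<String> {
    override fun execute(details: CompatibilityCheckDetails<String>) {
        val declared = details.producerValue?.split(",") ?: emptyList()
        if (details.consumerValue in declared) details.compatible() else details.incompatible()
    }
}

dependencies.attributesSchema.attribute(features) {
    compatibilityRules.add(FeatureRule::class.java)
}

val serverDeps by configurations.creating {
    attributes.attribute(features, "server")
}

dependencies { "serverDeps"(project(":producer")) }

// lenient view: incompatible producers drop out instead of failing the build,
// and their jars are never built
val serverFiles = serverDeps.incoming.artifactView { lenient(true) }.files
```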
v
Ah, with additionally a lenient artifact view, ok. 👌
c
yeps
I lied... even for excluded artifacts...it's compiling them
I don't understand why
I know you'll tell me to create a reproducer, but I don't know if that's working as expected or if it's a bug (the bug could be in my project, not just gradle)
false alarm
but I have a question... if I depend on `project("producer", configuration = "myConfiguration")` it works fine, the compatibility rule is executed for non-equal values, but if I depend on `project("producer")` it doesn't exclude any dependency. I even tried removing my custom attribute from the producer, and... it resolves the dependency anyway
I have to create a reproducer to see if something in my project is trolling me or if it really doesn't work as I expected
same issue in the reproducer, will reduce the code and push it
even removing the compatibilityRules... it seems `null == ["one", "two"]` according to gradle
or said differently, the producer has:
--------------------------------------------------
Variant myConfiguration
--------------------------------------------------
Capabilities
- org.example:producer:1.0-SNAPSHOT (default capability)
Artifacts
- build/libs/producer-1.0-SNAPSHOT.jar (artifactType = jar)
and it can be consumed from the consumer with:
--------------------------------------------------
Configuration producer
--------------------------------------------------
Attributes
- features = one,two
- org.gradle.category = library
- org.gradle.usage = java-runtime
v
That's expected I'd say
Each candidate’s attribute value is compared to the consumer’s requested attribute value. A candidate is considered compatible if its value matches the consumer’s value exactly, passes the attribute’s compatibility rule or is not provided.
That's probably why feature variants are done through the capability which is always set?
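Requiring a capability on the dependency would look roughly like this; the capability coordinates and the `server` configuration are assumptions:
```kotlin
dependencies {
    "server"(project(":producer")) {
        capabilities {
            // unlike an attribute, a required capability must be provided by some
            // variant of :producer, otherwise resolution fails
            requireCapability("org.example:producer-server")
        }
    }
}
```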