# hamilton-help
s
Hi, I'm very confused about Parallel Execution. I don't understand when the local or remote executor is called, and I can't find docs about the executors (SynchronousLocalTaskExecutor / MultiProcessingExecutor). My understanding is that the parallel parts are executed remotely, and everything before and after runs locally.
yep that’s fairly correct. The local executor runs the “orchestration”; “remote” is what “tasks” (i.e. function calls) are distributed to.
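To make the split concrete, here's a minimal stand-in sketch of the fan-out/fan-in flow in plain Python — not Hamilton's actual API. The function names are hypothetical, and a `ThreadPoolExecutor` plays the role of the remote executor (in Hamilton that could be a `MultiProcessingExecutor` or similar):

```python
# Sketch of local vs. remote execution in a fan-out/fan-in flow.
# Hypothetical stand-in, NOT Hamilton's API: ThreadPoolExecutor models
# the remote executor that tasks get distributed to.
from concurrent.futures import ThreadPoolExecutor

def load_inputs() -> list:
    """Runs before the fan-out -> handled by the local executor."""
    return [1, 2, 3, 4]

def process(item: int) -> int:
    """One fanned-out task -> each call goes to the remote executor."""
    return item * item

def collect(results: list) -> int:
    """Fan-in -> back on the local executor."""
    return sum(results)

items = load_inputs()                         # local ("orchestration")
with ThreadPoolExecutor() as pool:            # "remote" tasks
    results = list(pool.map(process, items))
total = collect(results)                      # local again
print(total)  # prints 30
```

The point is only the division of labor: everything outside the pool is the “before and after” that the local executor handles; each `process` call is what gets shipped out.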
r
Thanks a lot! And by the way, which should I use to cache results: experimental.h_cache or plugins.h_diskcache?
s
I think I might have introduced a bug in the code we were working on with https://github.com/MolCrafts/molexp/blob/master/example/cmdline/slurm_funcs.py Right now the nodes need to be tagged with “cmdline” for the remote executor to run them — see https://github.com/MolCrafts/molexp/blob/master/src/prototype/cmdline.py#L19
Otherwise are you running into a problem? It’s not clear to me where you exactly need help 🙂
r
I am cleaning up the code and introducing `pysqa` to submit it. I hope I can make some progress tonight 😃
s
> Thanks a lot! And by the way, which should I use to cache result, experimental.h_cache or plugins.h_diskcache? Right now https://github.com/MolCrafts/molexp/blob/master/example/cmdline/run.py uses the h_cache.CachingGraphAdapter (along with pickle), and you have to tag the things you want cached. But the h_diskcache one would cache EVERYTHING to pickle… so it depends on what you want to achieve 🙂
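The tag-gated behavior described above can be illustrated with a small stand-in — this is NOT Hamilton's `CachingGraphAdapter`, just a sketch of the idea that only explicitly tagged nodes get their results pickled, while untagged nodes always recompute:

```python
# Stand-in sketch of tag-gated caching (not Hamilton's actual adapter):
# only functions marked with a cache tag have their results pickled.
import pickle
from pathlib import Path

def cache_tag(fmt: str):
    """Mark a function as cacheable (plays the role of a node tag)."""
    def wrap(fn):
        fn.cache = fmt
        return fn
    return wrap

def run_node(fn, cache_dir: Path, *args):
    """Run a node, reusing a pickled result if it is tagged and cached."""
    if getattr(fn, "cache", None) is None:
        return fn(*args)                      # untagged: always recompute
    path = cache_dir / f"{fn.__name__}.pkl"
    if path.exists():
        return pickle.loads(path.read_bytes())
    result = fn(*args)
    path.write_bytes(pickle.dumps(result))    # cache for next run
    return result

@cache_tag("pickle")
def expensive(x: int) -> int:
    return x ** 2
```

On a second `run_node(expensive, cache_dir, 3)` the pickled result is loaded instead of recomputed; an untagged function would run every time.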
r
OK! Thanks!
t
With `h_diskcache.DiskCacheAdapter`, you can actually specify the list of nodes you care about caching via a list of node names (`cache_vars: List[str]`). The default is all nodes. DiskCache is robust for parallel usage of the cache since each transaction is atomic: https://grantjenks.com/docs/diskcache/#comparisons
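The atomicity property that makes parallel cache usage safe can be approximated in plain Python with a write-to-temp-then-rename pattern — a sketch only, not how diskcache itself is implemented (it builds on SQLite):

```python
# Sketch of an atomic cache write: readers never see a half-written
# file, because os.replace is an atomic rename on POSIX and Windows.
# Illustration only; the diskcache library uses SQLite transactions.
import os
import pickle
import tempfile
from pathlib import Path

def atomic_cache_write(cache_dir: Path, key: str, value) -> None:
    """Pickle `value` to a temp file, then atomically move it into place."""
    fd, tmp = tempfile.mkstemp(dir=cache_dir)
    with os.fdopen(fd, "wb") as f:
        pickle.dump(value, f)
    os.replace(tmp, cache_dir / f"{key}.pkl")  # atomic rename

def cache_read(cache_dir: Path, key: str):
    """Load a previously cached value."""
    return pickle.loads((cache_dir / f"{key}.pkl").read_bytes())
```

Because the rename is atomic, a concurrent reader either sees the old cached value or the new one, never a torn write — the same guarantee the diskcache comparison page advertises.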
s
thanks @Thierry Jean! We chatted in DM, and we're going to get something working first, and then decide if the experiment manager fits, or it doesn't, and if something needs to change there.