Small library that allows scheduling scripts asynchronously on different platforms. Think of it as a Make where you can write the dependencies as Python code, and that can run locally, on an HPC, or in the cloud (cloud is not implemented just yet). I will likely be changing the interface as I go.

I have tried to use the following three alternatives, which are all truly excellent! Pydoit is really great, but I couldn't find how to extend it to build my own executor, and I always found myself confused writing new tasks and dealing with dependencies. I didn't like that snakemake relied on file names to trigger rules; I was constantly juggling complicated file names. Finally, there were use cases that I could not implement cleanly in the dataflow model of nextflow.

Life cycle of a task

A unique ID is created for the task from its command and its inputs. At this point, the properties of the task are frozen: they can be read and copied, but not changed. The task can be sent by using the start() method, or it will be sent automatically when awaited. When the task is awaited, execution is blocked until the task is finished. The task then has its output attached to it, and that output can be used in the creation of other tasks. Also note that several tasks can be awaited in parallel by bagging them with sf.bag(.); again, a task is automatically sent at this stage if it has not been sent before.

Then create a local env, activate, install and run!

import scriptflow as sf

# set main options
sf.

Then launch the flow:

scriptflow run sleepit
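The life cycle described above can be sketched as a toy model in plain Python. This is only an illustration of the semantics (unique ID from command and inputs, frozen properties, start() or automatic sending on await, output attached after completion); the Task class and helper names here are made up and are not scriptflow's actual implementation:

```python
import asyncio
import hashlib

class Task:
    """Toy model of the task life cycle (hypothetical, not scriptflow's real API)."""

    def __init__(self, cmd, inputs=()):
        # A unique ID is created from the command and its inputs.
        payload = cmd + "|" + "|".join(sorted(inputs))
        self.uid = hashlib.sha1(payload.encode()).hexdigest()[:12]
        # Properties are frozen at creation: stored privately, exposed read-only.
        self._cmd, self._inputs = cmd, tuple(inputs)
        self._job = None
        self.output = None

    @property
    def cmd(self):  # readable and copyable, but not changeable
        return self._cmd

    def start(self):
        # Sending the task schedules it; a second start() is a no-op.
        if self._job is None:
            self._job = asyncio.ensure_future(self._run())
        return self

    async def _run(self):
        await asyncio.sleep(0)  # stand-in for running the real script
        return f"output-of-{self.uid}"

    def __await__(self):
        # Awaiting sends the task if it has not been sent before,
        # then blocks until it finishes and attaches the output.
        self.start()
        self.output = yield from self._job.__await__()
        return self

async def main():
    t1 = Task("sleep 1")
    t2 = Task("sleep 1")   # same command and inputs -> same unique ID
    await t1
    print(t1.uid == t2.uid, t1.output)

asyncio.run(main())
```

The read-only properties mirror the "frozen at creation" rule: because the ID is derived from the command and inputs, mutating either after creation would invalidate the ID, so the toy model simply forbids it.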
Scriptflow lets you describe scripts with their inputs and outputs as code, and describe dependencies as Python code (using await/async).
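As a rough illustration of what "dependencies as Python code" means, here is how a small dependency graph reads in plain asyncio. The function names are invented for the example, and plain coroutines stand in for scriptflow's task objects; asyncio.gather plays the role of awaiting several tasks in parallel, analogous to bagging them with sf.bag(.):

```python
import asyncio

async def build_dataset():
    await asyncio.sleep(0)  # pretend to run a data-preparation script
    return "data.csv"

async def fit_model(dataset, spec):
    await asyncio.sleep(0)  # pretend to run an estimation script
    return f"estimates-{spec}-from-{dataset}"

async def flow():
    # Sequential dependency: fit_model needs build_dataset's output,
    # so we await it first and pass the result along.
    data = await build_dataset()
    # Parallel fan-out: several independent tasks awaited together
    # (the analogue of bagging tasks with sf.bag(.)).
    return await asyncio.gather(
        fit_model(data, "ols"),
        fit_model(data, "iv"),
    )

print(asyncio.run(flow()))
# prints ['estimates-ols-from-data.csv', 'estimates-iv-from-data.csv']
```

The dependency structure lives entirely in ordinary control flow: what is awaited before what, and which results feed into which calls, with no file-name patterns needed to trigger the next step.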