
Clustered running of dataflows #33

@behrad

Description


I am running multiple dataflows processes, e.g.:

dataflows daemon core
dataflows daemon redis-events
dataflows daemon listeners2

But I cannot define how many instances of each initiator to run in each command, since daemons reference initiators by type. To make this possible:

  1. Can we add support for multiple etc/project files in dataflows? (e.g. dataflows project2 daemon test)

OR

  1. Is it easier to change the project JSON definition so that the same initiators can be reused across multiple daemons? Something like this:
"daemon": {
    "test": {
        "initiator": [ "cluster1" ]
    },
    "test2": {
        "initiator": [ "cluster2" ]
    }
}

"initiator": {
    "cluster1": {
        "type": "http",
        "workflows": [ {1}, {2}, {3} ]
    },
    "cluster2": {
        "type": "http",
        "workflows": [ {4}, {5}, {6} ]
    }
}
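Put together, a project file under this proposal might look like the sketch below. This is only an illustration of the shape being requested, not an existing dataflows schema: the "wf1"…"wf6" workflow names are hypothetical placeholders standing in for the {1}…{6} entries above.

```json
{
    "daemon": {
        "test":  { "initiator": [ "cluster1" ] },
        "test2": { "initiator": [ "cluster2" ] }
    },
    "initiator": {
        "cluster1": {
            "type": "http",
            "workflows": [ "wf1", "wf2", "wf3" ]
        },
        "cluster2": {
            "type": "http",
            "workflows": [ "wf4", "wf5", "wf6" ]
        }
    }
}
```

With a layout like this, `dataflows daemon test` would start only the workflows attached to cluster1, and `dataflows daemon test2` only those attached to cluster2, so each process runs its own slice of the initiators.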
