Execution DB and API

Goal

Provide the data and interfaces to focus on executions in the Engine.

Right now, events and task executions are two quite separate things. With the introduction of the workflow, and with the goal of decentralizing these executions, we need an execution that links events and task executions together in a way that any node can access the same data, and that data should be sufficient to reach consensus on, process, and verify the execution.

Executions should not be possible without a workflow. An execution will be triggered by an event or by the result of a previous execution.

We need to make sure that we can trace any execution back to its beginning: an event and a list of results. More details here.

Execution Database

We need to remove/rename/add a bunch of attributes:

  • Rename ID to hash (we calculate the hash, so let’s call it hash)
  • Remove ExecutionDuration (it is calculated, so it doesn’t need to be stored; it can be exposed as a function if we need it)
  • Remove Service and only keep the service definition hash (we don’t need the full service definition)
  • Simplify the task’s output into OutputData and OutputKey
  • Add the previous execution’s hash, which we can call parentHash (this will be helpful for data resolution of the workflow)
  • Calculate the hash of the execution from the previous execution hash, the inputs, the service hash, and the task key (see the sketch after this list)
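As a rough illustration, the resulting Execution definition could look like the sketch below. The field names, types, numbering, and file path are only assumptions to make the list above concrete, not the final definition.

// Sketch of the Execution definition (e.g. /protobuf/definition/execution.proto)
syntax = "proto3";
package definition;

import "google/protobuf/struct.proto";

message Execution {
  // hash calculated from parentHash, inputs, serviceHash and taskKey (replaces ID)
  bytes hash = 1;
  // hash of the previous execution, empty for the first execution of a workflow
  bytes parentHash = 2;
  // hash of the service definition instead of the full service definition
  bytes serviceHash = 3;
  string taskKey = 4;
  google.protobuf.Struct inputs = 5;
  // simplified task output
  string outputKey = 6;
  google.protobuf.Struct outputData = 7;
}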

When we start the network, we will add a few attributes for consensus on the event (emitters), the execution (executor), and the validation (validators).

API

With this focus on execution, we need a proper API for it. The goal, in future versions, is to only use this API and to deprecate, and eventually delete, ExecuteTask, ListenEvent, ListenResult and SubmitResult.

  • Create a new proto api package under /protobuf/api/
  • Create an executions.proto in this package
  • Create a proto service Execution
  • Create the APIs (see the proto sketch after this list)
    • Get(hash) -> Execution
    • List() -> Execution[]
    • Updates(filter) -> Stream<Execution>
  • Create the Execution definition in /protobuf/definition/execution.proto
  • Create the api server in the /api/ package (previously interface)
  • Create a manager subpackage in the sdk package (previously api) called execution and create the functions
    • Get -> Execution
    • List -> Execution[]
    • Updates -> Channel<Execution>
  • Link the api to the sdk functions
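A minimal sketch of what /protobuf/api/executions.proto could contain is shown below, assuming gRPC-style request and response wrapper messages; the wrapper names, the import path and the filter shape are assumptions, not decisions.

syntax = "proto3";
package api;

import "protobuf/definition/execution.proto";

service Execution {
  // Get returns a single execution by its hash
  rpc Get (GetRequest) returns (definition.Execution);
  // List returns the known executions (pagination still to be decided)
  rpc List (ListRequest) returns (ListResponse);
  // Updates streams executions matching the given filter as they are created or updated
  rpc Updates (UpdatesRequest) returns (stream definition.Execution);
}

message GetRequest {
  bytes hash = 1;
}

message ListRequest {}

message ListResponse {
  repeated definition.Execution executions = 1;
}

message UpdatesRequest {
  // filter fields still to be defined, see the discussion about "filter" below
}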

In another step, we will remove ExecuteTask, ListenEvent, ListenResult and SubmitResult in order to use only the Execution’s APIs.

We must:

move the check of the output data from the execution to the api

We should also:

change ID type to []byte

We don’t want to:

refactor and rename the api package to sdk in this PR (we can do it first in a separate one)

Questions:

  • api.List: what is it supposed to do? Return all the executions in the database? There will be millions of them, so it can’t be done without, for example, paging
  • how do you want to manage parentHash (aka the previous execution hash) if we don’t have any link between executions in this step?

This was supposed to be done already with the renaming of the project. cc @Nicolas

Pagination would be great. I’m not sure it’s necessary for a first version, but if it doesn’t make the feature really complicated, I’m fine with having it directly.
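If we do decide to add it, one possible shape (purely a sketch; the field names are assumptions, and the empty ListRequest sketched above would simply gain these fields) is offset-based paging on the List request:

message ListRequest {
  uint32 limit = 1;  // maximum number of executions to return
  uint32 offset = 2; // number of executions to skip
}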

Just have the structure for now; it will be populated by the Workflow implementation.

Question: what exactly is, or what will be, filter? Because this is the one thing that is missing here.

Will this be a custom string format (or a language like tcpdump’s?), or will it be a protobuf struct like:

message Request {
  bytes hash = 1;
  google.protobuf.Timestamp from = 2;
  bytes serviceHash = 3;
  bytes instanceHash = 4;
  Status status = 5;
  // etc...
}

I don’t actually have any preferences for that. We also don’t need to implement this right now for everything. It can be just for the status (the same logic as what is already in the api).

I don’t want to have something specific to us that we then have to parse, or anything like that.

I would go with the request you propose (a protobuf struct), renamed to Filter, and only have the status for now (see the sketch below). We will be able to add more filters later on.
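For reference, a status-only Filter could be as small as the following (a minimal sketch; it assumes a Status enum already exists alongside the Execution definition):

message Filter {
  // the only supported filter for now; more fields can be added later
  Status status = 1;
}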