Rule Pipeline

We can form rule pipelines by feeding the results of one rule into a following rule. This is usually done through intermediate storage or a message queue such as an MQTT broker. By pairing the memory sink with the memory source, we can create rule pipelines without any external dependencies.
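The only contract between the two sides is the topic: the producing rule's memory sink publishes to an in-memory topic, and the consuming stream's memory source subscribes to the same topic. A minimal sketch of the pairing, with the topic name pipeline/stage1 and stream name stage1 chosen only for illustration:

```
# Producing rule: one of its actions publishes results to an in-memory topic
"actions": [{
  "memory": {
    "topic": "pipeline/stage1"
  }
}]

# Consuming stream: subscribes to that topic through the memory source
create stream stage1 () WITH (DATASOURCE="pipeline/stage1", FORMAT="JSON", TYPE="memory")
```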

Usage

Rule pipelines are implicit. Each rule can simply use a memory sink and/or a memory source, so each step of the pipeline is created separately with the existing stream and rule APIs (example below).

```
#1 Create the source stream
{"sql": "create stream demo () WITH (DATASOURCE=\"demo\", FORMAT=\"JSON\")"}

#2 Create a rule and sink to memory
{
  "id": "rule1",
  "sql": "SELECT * FROM demo WHERE isNull(temperature)=false",
  "actions": [{
    "log": {},
    "memory": {
      "topic": "home/ch1/sensor1"
    }
  }]
}

#3 Create a stream from the memory topic
{"sql": "create stream sensor1 () WITH (DATASOURCE=\"home/+/sensor1\", FORMAT=\"JSON\", TYPE=\"memory\")"}

#4 Create more rules to consume from the memory topic
{
  "id": "rule2-1",
  "sql": "SELECT avg(temperature) FROM sensor1 GROUP BY CountWindow(10)",
  "actions": [{
    "log": {},
    "memory": {
      "topic": "analytic/sensors"
    }
  }]
}
{
  "id": "rule2-2",
  "sql": "SELECT temperature + 273.15 as k FROM sensor1",
  "actions": [{
    "log": {}
  }]
}
```

By using the memory topic as the bridge, we have now formed a rule pipeline: rule1 -> {rule2-1, rule2-2}. Pipelines can be many to many and are very flexible.
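For example, fan-in needs no extra wiring because the sensor1 stream above subscribes to the wildcard topic home/+/sensor1. A further producing rule, sketched below with the illustrative id rule1b and an assumed humidity field, could publish a second channel to home/ch2/sensor1; rule2-1 and rule2-2 would then consume the results of both rule1 and rule1b, giving a many to many pipeline {rule1, rule1b} -> {rule2-1, rule2-2}:

```
{
  "id": "rule1b",
  "sql": "SELECT * FROM demo WHERE isNull(humidity)=false",
  "actions": [{
    "memory": {
      "topic": "home/ch2/sensor1"
    }
  }]
}
```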

Notice that the memory sink can be used together with other sinks to create multiple actions for a rule, and the memory source topic can use wildcards to subscribe to a filtered list of topics.
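As a sketch of combining sinks, the actions of rule1 could forward results to an external MQTT broker while still feeding the in-memory pipeline; the broker address and external topic below are assumptions for illustration only:

```
"actions": [{
  "mqtt": {
    "server": "tcp://127.0.0.1:1883",
    "topic": "external/sensors"
  },
  "memory": {
    "topic": "home/ch1/sensor1"
  }
}]
```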