Once the ingested data stream has been split into windows, you can process those windows in your child transformation. Use the child transformation to adjust the data and handle event alerts as needed. After processing, you can either load the windowed data into various outputs or publish it back into the message stream by using one of the following producer steps, depending on your target:
- AMQP Producer: Advanced Message Queuing Protocol brokers
- JMS Producer: Apache ActiveMQ Java Message Service (JMS) server or IBM MQ middleware
- Kafka Producer: Kafka server
- Kinesis Producer: A specific region and stream within the Amazon Kinesis Data Streams service
- MQTT Producer: Message Queuing Telemetry Transport broker or clients
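Conceptually, the windowing step groups the continuous stream into micro-batches, and each batch is handed to the child transformation before being published or loaded. The following is a minimal sketch of size-based windowing, not PDI code; the record structure and function names are illustrative assumptions:

```python
from typing import Dict, Iterable, Iterator, List

# Hypothetical record type: each streamed message is modeled as a plain dict.
Record = Dict[str, object]

def window_by_size(stream: Iterable[Record], size: int) -> Iterator[List[Record]]:
    """Group a continuous stream into fixed-size windows (micro-batches).

    Each emitted window corresponds to one pass of the child
    transformation, which processes the batch and then publishes the
    results back to the message stream or loads them into an output.
    """
    window: List[Record] = []
    for record in stream:
        window.append(record)
        if len(window) == size:
            yield window
            window = []
    if window:  # flush any partial final window
        yield window

# Usage: five records with a window size of two yield three windows.
records = [{"id": i} for i in range(5)]
windows = list(window_by_size(records, size=2))
```

A time-based window works the same way, except the flush condition is a clock interval rather than a record count.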
You can also use the data streaming window to capture data for analysis. You can create streaming Pentaho data services from output steps in the child transformation, then use CTools to build dashboards that consume these services as data sources. See Pentaho CTools for more information.
Once started, streaming data transformations run continuously. You can stop these transformations using the following tools:
- The stop option in the PDI client.
- The Abort step in either the parent or child transformation.
- A restart of the Pentaho or Spark execution engine.
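A streaming transformation is essentially a loop that keeps consuming windows until it receives an explicit stop signal. The sketch below illustrates that lifecycle with a stop flag; the names and queue-based source are illustrative assumptions, not the PDI API:

```python
import queue
import threading
import time

def run_streaming(source: queue.Queue, stop: threading.Event, results: list) -> None:
    """Consume records continuously until a stop signal arrives,
    mirroring how a streaming transformation runs until stopped
    from the PDI client or aborted."""
    while not stop.is_set():
        try:
            # Poll with a timeout so the stop flag is re-checked regularly.
            record = source.get(timeout=0.1)
        except queue.Empty:
            continue
        results.append(record * 2)  # stand-in for the child transformation's work

stop = threading.Event()
source: queue.Queue = queue.Queue()
results: list = []
worker = threading.Thread(target=run_streaming, args=(source, stop, results))
worker.start()

for i in range(3):
    source.put(i)

# Wait until the three records are processed, then request a stop.
deadline = time.time() + 5
while len(results) < 3 and time.time() < deadline:
    time.sleep(0.01)
stop.set()
worker.join()
```

Without the stop signal, the loop never exits on its own, which is why a running streaming transformation must be stopped from the client, aborted, or terminated with the engine.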