Pipelining
With the PetClinic App View created, one thing becomes immediately apparent: the PetClinic app is very chatty, outputting DEBUG statements about its pool stats every 30 seconds. While it’s simple enough to edit the app’s own logging configuration, that would require changes to its config file, and there may be other DEBUG information we do want to see, just not the HikariPool output.
For the purposes of this exercise, we will assume the PetClinic app is a third-party app that we don’t have control over, but we still want to remove the DEBUG output that is cluttering up our Log Analysis. For that, we can use a Pipeline that will look for those entries and drop them before passing the non-DEBUG entries on to Log Analysis.
Build a Pipeline
To get started, click the Pipeline icon in the top-left corner of the dashboard:
Then click on New Pipeline:
Name the new pipeline PetClinic preprocess:
Click Save.
Next we’ll add a Source that will receive log data from the OpenTelemetry Collector we configured earlier. Click Sources → Add.
NOTE
The Mezmo Exporter in the OpenTelemetry Collector is responsible for sending this data. We haven’t yet configured it to send the data, but we will in a few short steps.
We will receive the OpenTelemetry logs via HTTP. In the list of available sources, type http in the filter, which should highlight the HTTP Source. Click on the HTTP source.
Configure the following:
- Title as OTEL Ingest
- add a meaningful Description such as Receive logs from OTEL
- leave the Decoding Method as json
Click Save.
With the OTEL Ingest endpoint created, we need to edit its configuration to create a new Access Key. Click on the OTEL Ingest endpoint and then click the Create new key button.
For the Title, enter Ingest Key and click Create.
Be sure to copy and save the value of the new key as well as the URL of this specific endpoint, as we will need them in a later step:
In this example:
- the value of Ingest Key is +19opdnwjWmDUD302J2jsT9xCF87Ibu0rk2t95jC/ps=
- the URL is https://pipeline.mezmo.com/v1/b745ce28-546e-11ed-a64b-d233826e7531
Click Update.
Our pipeline is starting to take shape as:
Next, the log data received from OTEL follows the format specified in the Send Log Lines API. For example, a JSON payload containing two log entries is sent as:
{
"lines": [
{
"timestamp": 1666275427712,
"line": "\u003c135\u003eOct 20 09:17:02 mbp1 HikariPool-1 - Pool stats (total=10, active=0, idle=10, waiting=0)",
"app": "",
"level": "info",
"meta": {}
},
{
"timestamp": 1666275427712,
"line": "\u003c135\u003eOct 20 09:17:02 mbp1 HikariPool-1 - Fill pool skipped, pool is at sufficient level.",
"app": "",
"level": "info",
"meta": {}
}
]
}
We want to process each log entry individually, so we will use the Unroll processor. This processor converts a JSON array into individual JSON objects, each of which appears as its own entry at the processor’s output.
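As a rough sketch of the effect (the exact event layout may differ slightly), unrolling the lines array in the payload above produces two separate entries, each carrying a single array element; the unrolled element remains addressable under .lines, which is why the Filter processor configured later in this section references the field .lines.level. The first of the two entries would look roughly like:
{
  "lines": {
    "timestamp": 1666275427712,
    "line": "\u003c135\u003eOct 20 09:17:02 mbp1 HikariPool-1 - Pool stats (total=10, active=0, idle=10, waiting=0)",
    "app": "",
    "level": "info",
    "meta": {}
  }
}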
Click Processors → Add. In the Processor list, filter on unroll and select the Unroll Processor:

Click on the Unroll processor.
Configure the following:
- Title as Unroll Logs
- add a meaningful Description such as Convert logs array to individual logs for processing
- set the Field value to .lines
Click Save.
Our pipeline now appears as:
NOTE
Hover your mouse over the right edge of the HTTP source we configured (OTEL Ingest). An attach anchor will appear:

Click and drag from the HTTP source to the Unroll processor. The Unroll processor will be highlighted:

With the Unroll processor highlighted, release the mouse button; the HTTP source will now send its output to the Unroll processor.
NOTE
The pipeline should now appear as:

With the logs converted to individual entries, we can accomplish what we set out to do: remove the DEBUG entries. To do this, we’ll add a Filter processor.
Click Processors → Add. In the Processor list, filter on filter and select the Filter Processor:
Click on the Filter processor.
Configure the following:
- Title as Discard DEBUG Msgs
- add a meaningful Description such as Allow non-DEBUG messages to pass
- set the Field value to .lines.level
- set the Operator to not_equal
- set the Value to DEBUG (the value is case-sensitive)
Click Save.
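As a minimal sketch of how this filter behaves (the entries below are hypothetical examples, not actual PetClinic output): an entry whose .lines.level is exactly DEBUG is dropped, while anything else passes through to the next stage.
Dropped (level matches DEBUG):
{ "lines": { "line": "HikariPool-1 - Pool stats (total=10, active=0, idle=10, waiting=0)", "level": "DEBUG" } }
Passed through (level is anything other than DEBUG):
{ "lines": { "line": "Started PetClinicApplication", "level": "INFO" } }
Because the comparison is case-sensitive, only entries whose level is exactly DEBUG are removed.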
Connect the output of the Unroll processor to the input of the Filter processor, as we did in Step 8. Our pipeline now appears as:
To prepare to send our log entries from our pipeline to Log Analysis, we must first convert the JSON format to a string format. For this, we use the Stringify processor.
Click Processors → Add. In the Processor list, filter on stringify and select the Stringify Processor:
Click on the Stringify processor.
Configure the following:
- Title as Stringify
- add a meaningful Description such as Convert JSON to text
Click Save.
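As a rough illustration only (the exact output formatting produced by the Stringify processor may differ from what is shown here), the idea is that a structured JSON entry such as
{ "lines": { "line": "Fill pool skipped, pool is at sufficient level.", "level": "info" } }
is forwarded as a single text string, for example its serialized JSON form:
{"lines":{"line":"Fill pool skipped, pool is at sufficient level.","level":"info"}}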
Connect the output of the Filter processor to the input of the Stringify processor, as we did in Step 8. Our pipeline now appears as:
Finally, we are ready to take the output of the Stringify processor and send it out of our pipeline and over to Log Analysis. To do so, we’ll add a Destination. The destination requires an ingestion key as part of its configuration. To obtain your ingestion key, click Settings → Organization → API Keys.
We can use the existing Ingestion Key by simply clicking the clipboard icon to copy it:
We will paste this value into the Destination that will be configured next.
Click Destinations → Add. In the Destinations list, filter on log analysis and select the Mezmo Log Analysis destination:
Click on the Mezmo Log Analysis destination.
Configure the following:
- Title as Send to LA
- add a meaningful Description such as Send logs to Log Analysis
- leave End-to-end Acknowledgement checked
- set the Mezmo Host value to logs.mezmo.com
- set the Hostname to petclinic-pipeline
- paste the ingestion key we looked up in the previous step into the Ingestion Key field
Click Save.
Lastly, connect the output of the Stringify processor to the Send to LA destination. The final pipeline should appear as:
With the pipeline finalized, the last step is to deploy the pipeline so it is active. Click the Deploy pipeline button:
When the deployment completes, a blue checkmark will appear next to the pipeline name indicating the pipeline is now active:
Reconfigure OTEL
You may recall that when we installed and configured the OpenTelemetry Collector, we set it up to send logs to the Log Analysis endpoint. We now want to reconfigure the collector to send logs to our pipeline endpoint instead. This will start the flow of logs through the pipeline and the processors we’ve configured above.
Stop the OTEL Collector if it’s running.
Edit the $HOME/otelcol/config.yaml file:
- Change the value of ingest_url to the URL we saved from Step 8 in the previous section.
- Change the value of ingest_key to the Ingest Key value we saved from Step 8 in the previous section.
The mezmo section will look similar to this:
#######################################
exporters:
  mezmo:
    ingest_url: "https://pipeline.mezmo.com/v1/b745ce28-546e-11ed-a64b-d233826e7531"
    ingest_key: "+19opdnwjWmDUD302J2jsT9xCF87Ibu0rk2t95jC/ps="
Save the changes and exit.
Restart the collector with:
./otelcol-contrib --config config.yaml
Testing
At this point, any new log messages output by the PetClinic app will be picked up by the collector and sent to the pipeline. The pipeline should filter out any messages with level=DEBUG and forward the rest on to Log Analysis.
Head over to the PetClinic App View we created earlier. No new DEBUG messages should be showing up now.
NOTE
Be sure to check the timestamp on any existing messages in the View. They should be older than the point at which we changed the flow of messages to the pipeline.
One way to generate INFO messages in the PetClinic app is to restart it. Both shutting down and starting up the app generate quite a few INFO and DEBUG entries. Shut down the PetClinic app and you should see only the INFO logs show up in the View:
Likewise, starting the PetClinic app produces similar INFO-only logs:
java -jar target/spring-petclinic-*.jar --spring.profiles.active=mysql