Chronicle log processing pipelines allow you to transform, filter, and enrich log data before it is stored in Chronicle. Common use cases include removing empty key-value pairs, redacting sensitive data, adding ingestion labels, filtering logs by field values, and extracting host information. Pipelines can be associated with log types (with optional collector IDs) and feeds, providing flexible control over your data ingestion workflow.
The CLI provides comprehensive commands for managing pipelines, associating streams, testing configurations, and fetching sample logs.

#### List pipelines

```bash
# List all log processing pipelines
secops log-processing list

# List with pagination
secops log-processing list --page-size 50

# List with filter expression
secops log-processing list --filter "displayName:production*"

# List with pagination token
secops log-processing list --page-size 50 --page-token "next_page_token"
```
Test a pipeline configuration against sample logs before deployment:
```bash
# Test with inline JSON
secops log-processing test --pipeline '{"displayName":"Test","processors":[{"filterProcessor":{"include":{"logMatchType":"REGEXP","logBodies":[".*"]},"errorMode":"IGNORE"}}]}' --input-logs input_logs.json

# Test with files
secops log-processing test --pipeline pipeline_config.json --input-logs test_logs.json
```
Example `input_logs.json` (logs must have base64-encoded data):
```json
[
  {
    "data": "U2FtcGxlIGxvZyBlbnRyeQ==",
    "logEntryTime": "2024-01-01T00:00:00Z",
    "collectionTime": "2024-01-01T00:00:00Z"
  },
  {
    "data": "QW5vdGhlciBsb2cgZW50cnk=",
    "logEntryTime": "2024-01-01T00:01:00Z",
    "collectionTime": "2024-01-01T00:01:00Z"
  }
]
```
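The `data` payloads above are base64-encoded raw log text. Assuming a POSIX shell with the `base64` utility available, entries can be encoded and verified like this:

```shell
# Encode a raw log line for the "data" field
# (printf avoids adding a trailing newline to the payload)
printf '%s' 'Sample log entry' | base64
# → U2FtcGxlIGxvZyBlbnRyeQ==

# Decode an existing payload to confirm it round-trips
printf '%s' 'U2FtcGxlIGxvZyBlbnRyeQ==' | base64 --decode
# → Sample log entry
```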
### Parser Management
Parsers in Chronicle are used to process and normalize raw log data into UDM (Unified Data Model) format. The CLI provides comprehensive parser management capabilities.