
Commit 52cf86f

Merge pull request #161 from google/feature/log-processing-pipeline-methods
feature: log processing pipeline methods
2 parents 2feed86 + 9f8f9ab commit 52cf86f

15 files changed (+3761, -8 lines)

CHANGELOG.md

Lines changed: 14 additions & 0 deletions
@@ -5,6 +5,20 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.29.0] - 2025-12-17
### Added
- Support for the following log/data processing pipeline methods:
  - List pipelines
  - Create pipeline
  - Get pipeline details
  - Update pipeline
  - Delete pipeline
  - Associate stream to pipeline
  - Dissociate stream from pipeline
  - Fetch associated pipeline using stream
  - Fetch sample logs by stream
  - Test pipeline

## [0.28.1] - 2025-12-11
### Updated
- CLI to show help when a required sub-command/argument is not provided.

CLI.md

Lines changed: 181 additions & 0 deletions
@@ -325,6 +325,187 @@ secops log generate-udm-mapping \
  --compress-array-fields "false"
```

### Log Processing Pipelines

Chronicle log processing pipelines allow you to transform, filter, and enrich log data before it is stored in Chronicle. Common use cases include removing empty key-value pairs, redacting sensitive data, adding ingestion labels, filtering logs by field values, and extracting host information. Pipelines can be associated with log types (with optional collector IDs) and with feeds, providing flexible control over your data ingestion workflow.

The CLI provides comprehensive commands for managing pipelines, associating streams, testing configurations, and fetching sample logs.
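
As an orientation, a typical end-to-end flow chains the commands documented below (the config file and pipeline ID are the illustrative values used throughout this section):

```bash
# Create a pipeline from a config file (see the example config later in this section)
secops log-processing create --pipeline pipeline_config.json

# Associate a log type stream with the pipeline (illustrative ID)
secops log-processing associate-streams --id "1234567890" --streams '[{"logType":"WINEVTLOG"}]'

# Confirm which pipeline now handles that stream
secops log-processing fetch-associated --stream '{"logType":"WINEVTLOG"}'
```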

#### List pipelines

```bash
# List all log processing pipelines
secops log-processing list

# List with pagination
secops log-processing list --page-size 50

# List with filter expression
secops log-processing list --filter "displayName:production*"

# List with pagination token
secops log-processing list --page-size 50 --page-token "next_page_token"
```
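
To walk a large result set page by page, you can loop on the page token. This is a minimal sketch assuming the command prints JSON that includes a `nextPageToken` field (the field name follows the usual Google API list convention and is an assumption, not confirmed by this document):

```bash
# Hypothetical pagination loop; assumes JSON output with a nextPageToken field.
token=""
while :; do
  if [ -n "$token" ]; then
    page=$(secops log-processing list --page-size 50 --page-token "$token")
  else
    page=$(secops log-processing list --page-size 50)
  fi
  echo "$page"
  token=$(printf '%s' "$page" | jq -r '.nextPageToken // empty')
  [ -z "$token" ] && break
done
```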

#### Get pipeline details

```bash
# Get a specific pipeline by ID
secops log-processing get --id "1234567890"
```

#### Create a pipeline

```bash
# Create from inline JSON
secops log-processing create --pipeline '{"displayName":"My Pipeline","description":"Filters error logs","processors":[{"filterProcessor":{"include":{"logMatchType":"REGEXP","logBodies":[".*error.*"]},"errorMode":"IGNORE"}}]}'

# Create from JSON file
secops log-processing create --pipeline pipeline_config.json
```

Example `pipeline_config.json`:
```json
{
  "displayName": "Production Pipeline",
  "description": "Filters and transforms production logs",
  "processors": [
    {
      "filterProcessor": {
        "include": {
          "logMatchType": "REGEXP",
          "logBodies": [".*error.*", ".*warning.*"]
        },
        "errorMode": "IGNORE"
      }
    }
  ],
  "customMetadata": [
    {"key": "environment", "value": "production"},
    {"key": "team", "value": "security"}
  ]
}
```

#### Update a pipeline

```bash
# Update from JSON file with update mask
secops log-processing update --id "1234567890" --pipeline updated_config.json --update-mask "description"

# Update from inline JSON
secops log-processing update --id "1234567890" --pipeline '{"description":"Updated description"}' --update-mask "description"
```

#### Delete a pipeline

```bash
# Delete a pipeline by ID
secops log-processing delete --id "1234567890"

# Delete with etag for concurrency control
secops log-processing delete --id "1234567890" --etag "etag_value"
```
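
An etag typically comes from a prior read. Below is a minimal sketch of a read-then-delete guard, assuming `get` prints JSON with an `etag` field (an assumption based on standard Google API resources, not confirmed by this document):

```bash
# Hypothetical guarded delete: fetch the current etag, then delete with it.
etag=$(secops log-processing get --id "1234567890" | jq -r '.etag')
secops log-processing delete --id "1234567890" --etag "$etag"
```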

#### Associate streams with a pipeline

Associate log streams (by log type or feed) with a pipeline:

```bash
# Associate by log type (inline)
secops log-processing associate-streams --id "1234567890" --streams '[{"logType":"WINEVTLOG"},{"logType":"LINUX"}]'

# Associate by feed ID
secops log-processing associate-streams --id "1234567890" --streams '[{"feed":"feed-uuid-1"},{"feed":"feed-uuid-2"}]'

# Associate by log type (from file)
secops log-processing associate-streams --id "1234567890" --streams streams.json
```

Example `streams.json`:
```json
[
  {"logType": "WINEVTLOG"},
  {"logType": "LINUX"},
  {"logType": "OKTA"}
]
```

#### Dissociate streams from a pipeline

```bash
# Dissociate streams (from file)
secops log-processing dissociate-streams --id "1234567890" --streams streams.json

# Dissociate streams (inline)
secops log-processing dissociate-streams --id "1234567890" --streams '[{"logType":"WINEVTLOG"}]'
```

#### Fetch associated pipeline

Find which pipeline is associated with a specific stream:

```bash
# Find pipeline for a log type (inline)
secops log-processing fetch-associated --stream '{"logType":"WINEVTLOG"}'

# Find pipeline for a feed
secops log-processing fetch-associated --stream '{"feed":"feed-uuid"}'

# Find pipeline for a log type (from file)
secops log-processing fetch-associated --stream stream_query.json
```

Example `stream_query.json`:
```json
{
  "logType": "WINEVTLOG"
}
```
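
To script against the result, you can extract the pipeline ID and feed it back into `get`. This sketch assumes JSON output with a resource `name` field ending in the pipeline ID (both the field and its format are assumptions):

```bash
# Hypothetical chain: resolve the stream's pipeline, then fetch its details.
pipeline_id=$(secops log-processing fetch-associated --stream '{"logType":"WINEVTLOG"}' | jq -r '.name | split("/") | last')
secops log-processing get --id "$pipeline_id"
```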

#### Fetch sample logs

Retrieve sample logs for specific streams:

```bash
# Fetch sample logs for log types (from file)
secops log-processing fetch-sample-logs --streams streams.json --count 10

# Fetch sample logs (inline)
secops log-processing fetch-sample-logs --streams '[{"logType":"WINEVTLOG"},{"logType":"LINUX"}]' --count 5

# Fetch sample logs for feeds
secops log-processing fetch-sample-logs --streams '[{"feed":"feed-uuid"}]' --count 10
```
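
Sample logs pair naturally with the `test` command in the next section: capture them to a file, then replay them through a candidate pipeline. This assumes the fetched output is (or can be trimmed to) the JSON array format that `--input-logs` expects, which this document does not confirm:

```bash
# Hypothetical round trip: capture sample logs, then replay them through a
# pipeline under test (assumes compatible JSON shapes).
secops log-processing fetch-sample-logs --streams '[{"logType":"WINEVTLOG"}]' --count 10 > sample_logs.json
secops log-processing test --pipeline pipeline_config.json --input-logs sample_logs.json
```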

#### Test a pipeline

Test a pipeline configuration against sample logs before deployment:

```bash
# Test with inline JSON
secops log-processing test --pipeline '{"displayName":"Test","processors":[{"filterProcessor":{"include":{"logMatchType":"REGEXP","logBodies":[".*"]},"errorMode":"IGNORE"}}]}' --input-logs input_logs.json

# Test with files
secops log-processing test --pipeline pipeline_config.json --input-logs test_logs.json
```

Example `input_logs.json` (logs must have base64-encoded data):
```json
[
  {
    "data": "U2FtcGxlIGxvZyBlbnRyeQ==",
    "logEntryTime": "2024-01-01T00:00:00Z",
    "collectionTime": "2024-01-01T00:00:00Z"
  },
  {
    "data": "QW5vdGhlciBsb2cgZW50cnk=",
    "logEntryTime": "2024-01-01T00:01:00Z",
    "collectionTime": "2024-01-01T00:01:00Z"
  }
]
```
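
To build your own test input, base64-encode each raw log line for the `data` field. For example, this reproduces the first entry above:

```bash
# "Sample log entry" encodes to the value used in input_logs.json
echo -n "Sample log entry" | base64
# U2FtcGxlIGxvZyBlbnRyeQ==
```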

### Parser Management

Parsers in Chronicle are used to process and normalize raw log data into UDM (Unified Data Model) format. The CLI provides comprehensive parser management capabilities.
