CHANGELOG.md (14 additions, 0 deletions)

All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.29.0] - 2025-12-17
### Added
- Support for the following log/data processing pipeline methods:
- List pipelines
- Create pipeline
- Get pipeline details
- Update pipeline
- Delete pipeline
- Associate stream to pipeline
- Dissociate stream from pipeline
- Fetch associated pipeline using stream
- Fetch sample logs by stream
- Test pipeline

## [0.28.1] - 2025-12-11
### Updated
- CLI to show help when a required sub-command/argument is not provided.
CLI.md (181 additions, 0 deletions)

```bash
secops log generate-udm-mapping \
  --compress-array-fields "false"
```

### Log Processing Pipelines

Chronicle log processing pipelines allow you to transform, filter, and enrich log data before it is stored in Chronicle. Common use cases include removing empty key-value pairs, redacting sensitive data, adding ingestion labels, filtering logs by field values, and extracting host information. Pipelines can be associated with log types (with optional collector IDs) and feeds, providing flexible control over your data ingestion workflow.

The CLI provides comprehensive commands for managing pipelines, associating streams, testing configurations, and fetching sample logs.

#### List pipelines

```bash
# List all log processing pipelines
secops log-processing list

# List with pagination
secops log-processing list --page-size 50

# List with filter expression
secops log-processing list --filter "displayName:production*"

# List with pagination token
secops log-processing list --page-size 50 --page-token "next_page_token"
```
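When results span multiple pages, pass the token returned by the previous call via `--page-token`. Below is a minimal pagination loop, assuming the command emits JSON containing `pipelines` and `nextPageToken` fields (verify the field names against your CLI's actual output before relying on this):

```bash
# Page through all pipelines, printing each display name.
# Assumes JSON output with `pipelines` and `nextPageToken` fields.
token=""
while :; do
  if [ -n "$token" ]; then
    page="$(secops log-processing list --page-size 50 --page-token "$token")"
  else
    page="$(secops log-processing list --page-size 50)"
  fi
  echo "$page" | jq -r '.pipelines[]?.displayName // empty'
  token="$(echo "$page" | jq -r '.nextPageToken // empty')"
  [ -z "$token" ] && break
done
```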

#### Get pipeline details

```bash
# Get a specific pipeline by ID
secops log-processing get --id "1234567890"
```

#### Create a pipeline

```bash
# Create from inline JSON
secops log-processing create --pipeline '{"displayName":"My Pipeline","description":"Filters error logs","processors":[{"filterProcessor":{"include":{"logMatchType":"REGEXP","logBodies":[".*error.*"]},"errorMode":"IGNORE"}}]}'

# Create from JSON file
secops log-processing create --pipeline pipeline_config.json
```

Example `pipeline_config.json`:
```json
{
"displayName": "Production Pipeline",
"description": "Filters and transforms production logs",
"processors": [
{
"filterProcessor": {
"include": {
"logMatchType": "REGEXP",
"logBodies": [".*error.*", ".*warning.*"]
},
"errorMode": "IGNORE"
}
}
],
"customMetadata": [
{"key": "environment", "value": "production"},
{"key": "team", "value": "security"}
]
}
```
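
If the create command returns the new pipeline resource as JSON (an assumption; Chronicle resources typically carry a `name` field ending in the resource ID), you can capture the ID for later commands:

```bash
# Create a pipeline and keep its ID for subsequent get/update/delete calls.
# The `name` field and its ".../pipelines/{id}" shape are assumptions.
pipeline_id="$(secops log-processing create --pipeline pipeline_config.json \
  | jq -r '.name | split("/") | last')"
secops log-processing get --id "$pipeline_id"
```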

#### Update a pipeline

```bash
# Update from JSON file with update mask
secops log-processing update --id "1234567890" --pipeline updated_config.json --update-mask "description"

# Update from inline JSON
secops log-processing update --id "1234567890" --pipeline '{"description":"Updated description"}' --update-mask "description"
```
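
If the CLI follows the usual Google API `FieldMask` convention (an assumption worth verifying), several fields can likely be updated in one call by comma-separating them in the mask:

```bash
# Hedged sketch: update displayName and description together.
secops log-processing update --id "1234567890" \
  --pipeline '{"displayName":"Renamed Pipeline","description":"Updated description"}' \
  --update-mask "displayName,description"
```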

#### Delete a pipeline

```bash
# Delete a pipeline by ID
secops log-processing delete --id "1234567890"

# Delete with etag for concurrency control
secops log-processing delete --id "1234567890" --etag "etag_value"
```
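
The etag guards against lost updates: the delete succeeds only if the pipeline has not changed since you read it. A sketch, assuming the get command returns JSON with an `etag` field:

```bash
# Read-then-delete with optimistic concurrency control.
# The `etag` field name in the get response is an assumption.
etag="$(secops log-processing get --id "1234567890" | jq -r '.etag')"
secops log-processing delete --id "1234567890" --etag "$etag"
```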

#### Associate streams with a pipeline

Associate log streams (by log type or feed) with a pipeline:

```bash
# Associate by log type (inline)
secops log-processing associate-streams --id "1234567890" --streams '[{"logType":"WINEVTLOG"},{"logType":"LINUX"}]'

# Associate by feed ID
secops log-processing associate-streams --id "1234567890" --streams '[{"feed":"feed-uuid-1"},{"feed":"feed-uuid-2"}]'

# Associate by log type (from file)
secops log-processing associate-streams --id "1234567890" --streams streams.json
```

Example `streams.json`:
```json
[
{"logType": "WINEVTLOG"},
{"logType": "LINUX"},
{"logType": "OKTA"}
]
```
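
The introduction above notes that log-type streams can optionally be scoped to a collector. Assuming the stream object takes a `collectorId` field alongside `logType` (the field name is an assumption; check the API reference), that might look like:

```bash
# Hedged sketch: associate a log type restricted to one collector.
secops log-processing associate-streams --id "1234567890" \
  --streams '[{"logType":"WINEVTLOG","collectorId":"collector-uuid"}]'
```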

#### Dissociate streams from a pipeline

```bash
# Dissociate streams (from file)
secops log-processing dissociate-streams --id "1234567890" --streams streams.json

# Dissociate streams (inline)
secops log-processing dissociate-streams --id "1234567890" --streams '[{"logType":"WINEVTLOG"}]'
```

#### Fetch associated pipeline

Find which pipeline is associated with a specific stream:

```bash
# Find pipeline for a log type (inline)
secops log-processing fetch-associated --stream '{"logType":"WINEVTLOG"}'

# Find pipeline for a feed
secops log-processing fetch-associated --stream '{"feed":"feed-uuid"}'

# Find pipeline for a log type (from file)
secops log-processing fetch-associated --stream stream_query.json
```

Example `stream_query.json`:
```json
{
"logType": "WINEVTLOG"
}
```

#### Fetch sample logs

Retrieve sample logs for specific streams:

```bash
# Fetch sample logs for log types (from file)
secops log-processing fetch-sample-logs --streams streams.json --count 10

# Fetch sample logs (inline)
secops log-processing fetch-sample-logs --streams '[{"logType":"WINEVTLOG"},{"logType":"LINUX"}]' --count 5

# Fetch sample logs for feeds
secops log-processing fetch-sample-logs --streams '[{"feed":"feed-uuid"}]' --count 10
```
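
Sample logs pair naturally with pipeline testing: pull real logs for a stream, then replay them through a candidate pipeline (see the next section). This sketch assumes the fetch-sample-logs output can be fed to `--input-logs` unchanged; verify that the shapes match before relying on it:

```bash
# Fetch real samples, then dry-run a pipeline config against them.
secops log-processing fetch-sample-logs \
  --streams '[{"logType":"WINEVTLOG"}]' --count 10 > sample_logs.json
secops log-processing test --pipeline pipeline_config.json --input-logs sample_logs.json
```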

#### Test a pipeline

Test a pipeline configuration against sample logs before deployment:

```bash
# Test with inline JSON
secops log-processing test --pipeline '{"displayName":"Test","processors":[{"filterProcessor":{"include":{"logMatchType":"REGEXP","logBodies":[".*"]},"errorMode":"IGNORE"}}]}' --input-logs input_logs.json

# Test with files
secops log-processing test --pipeline pipeline_config.json --input-logs test_logs.json
```

Example `input_logs.json` (logs must have base64-encoded data):
```json
[
{
"data": "U2FtcGxlIGxvZyBlbnRyeQ==",
"logEntryTime": "2024-01-01T00:00:00Z",
"collectionTime": "2024-01-01T00:00:00Z"
},
{
"data": "QW5vdGhlciBsb2cgZW50cnk=",
"logEntryTime": "2024-01-01T00:01:00Z",
"collectionTime": "2024-01-01T00:01:00Z"
}
]
```
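
Hand-encoding test logs is tedious; a small helper can wrap raw log lines into the shape above. A sketch using `jq`'s built-in `@base64` filter (the file names are placeholders):

```bash
# Build input_logs.json from one raw log per line in raw_logs.txt.
now="$(date -u +%Y-%m-%dT%H:%M:%SZ)"
jq -n --raw-input --arg t "$now" \
  '[inputs | {data: (. | @base64), logEntryTime: $t, collectionTime: $t}]' \
  raw_logs.txt > input_logs.json
```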

### Parser Management

Parsers in Chronicle are used to process and normalize raw log data into UDM (Unified Data Model) format. The CLI provides comprehensive parser management capabilities.