content/docs/features/cli.mdx
```bash
llms --chat request.json "Override prompt"
```
### Standard Input

`llms` now accepts [OpenAI-compatible Chat Completion requests](https://platform.openai.com/docs/api-reference/chat/create) via standard input, making it easy to integrate into shell pipelines and scripts.

When JSON is piped in, `llms` detects it automatically; no extra flags are needed:

```bash
cat request.json | llms
```

Build requests inline with a heredoc:

```bash
llms <<EOF
{
  "model": "Minimax M2.5",
  "messages": [
    { "role": "user", "content": "Capital of France?" }
  ]
}
EOF
```

Combine with other CLI tools to generate requests dynamically:
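For example, `jq` can assemble the request JSON from a shell variable before piping it in (a sketch; the model name and prompt are illustrative):

```bash
# Build a Chat Completion request with jq and pipe it to llms
jq -n --arg q "Capital of France?" \
  '{model: "Minimax M2.5", messages: [{role: "user", content: $q}]}' \
  | llms
```

Because the request is ordinary JSON on stdin, any tool that emits JSON can drive `llms` the same way.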
content/docs/latest.mdx
---
title: Latest Features
description: Latest features and updates in llms.py
---
## Feb 15, 2026

### Standard Input

`llms` now accepts [OpenAI-compatible Chat Completion requests](https://platform.openai.com/docs/api-reference/chat/create) via standard input, making it easy to integrate into shell pipelines and scripts.

When JSON is piped in, `llms` detects it automatically; no extra flags are needed:

```bash
cat request.json | llms
```

Build requests inline with a heredoc:

```bash
llms <<EOF
{
  "model": "Minimax M2.5",
  "messages": [
    { "role": "user", "content": "Capital of France?" }
  ]
}
EOF
```

Combine with other CLI tools to generate requests dynamically:
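For example, `jq` can assemble the request JSON from a shell variable before piping it in (a sketch; the model name and prompt are illustrative):

```bash
# Build a Chat Completion request with jq and pipe it to llms
jq -n --arg q "Capital of France?" \
  '{model: "Minimax M2.5", messages: [{role: "user", content: $q}]}' \
  | llms
```

Because the request is ordinary JSON on stdin, any tool that emits JSON can drive `llms` the same way.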