
Commit d1588c3

updated docs (#70)
1 parent 9715037 commit d1588c3

File tree

5 files changed (+15, -13 lines)


README.md

Lines changed: 3 additions & 3 deletions
@@ -15,9 +15,9 @@ _Note: Guardrails is an alpha release, so expect sharp edges and bugs._

 Guardrails is a Python package that lets a user add structure, type and quality guarantees to the outputs of large language models (LLMs). Guardrails:

-does pydantic-style validation of LLM outputs,
-takes corrective actions (e.g. reasking LLM) when validation fails,
-enforces structure and type guarantees (e.g. JSON).
+- does pydantic-style validation of LLM outputs (including semantic validation such as checking for bias in generated text, checking for bugs in generated code, etc.)
+- takes corrective actions (e.g. reasking LLM) when validation fails,
+- enforces structure and type guarantees (e.g. JSON).

 ## 🚒 Under the hood
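For readers skimming the diff, the three bullets above describe Guardrails' core loop: compile a spec, call the LLM, validate the output, and reask on failure. Below is a minimal sketch of that flow, not part of this commit; the `person.rail` file name, prompt parameters, and model settings are illustrative assumptions, with `Guard.from_rail` being the entry point documented in docs/guard.md further down.

```python
import guardrails as gd
import openai

# Compile a RAIL spec (hypothetical file name) into a Guard object that
# knows the expected output structure, types, and validators.
guard = gd.Guard.from_rail("person.rail")

# Wrap the LLM call: Guardrails builds the prompt from the spec, parses the
# LLM's JSON output, runs validators, and reasks the LLM if validation fails.
raw_llm_output, validated_output = guard(
    openai.Completion.create,
    prompt_params={"question": "Describe a person who lives in California."},
    engine="text-davinci-003",
    max_tokens=256,
)

print(validated_output)  # a dict satisfying the structure/type guarantees
```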

docs/guard.md

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 <!-- ::: my_library.my_module.my_class -->

-::: guardrails.guardrails.Guard
+::: guardrails.guard.Guard
     options:
       members:
         - "from_rail"

docs/index.md

Lines changed: 6 additions & 6 deletions
@@ -6,9 +6,9 @@ _Note: Guardrails is an alpha release, so expect sharp edges and bugs._

 Guardrails is a Python package that lets a user add structure, type and quality guarantees to the outputs of large language models (LLMs). Guardrails:

-does pydantic-style validation of LLM outputs,
-takes corrective actions (e.g. reasking LLM) when validation fails,
-enforces structure and type guarantees (e.g. JSON).
+- does pydantic-style validation of LLM outputs. This includes semantic validation such as checking for bias in generated text, checking for bugs in generated code, etc.
+- takes corrective actions (e.g. reasking LLM) when validation fails,
+- enforces structure and type guarantees (e.g. JSON).

 ## 🚒 Under the hood

@@ -40,12 +40,12 @@ To learn more about the `rail` spec and the design decisions behind it, check ou
 ## 📍 Roadmap

 - [ ] Adding more examples, new use cases and domains
-- [ ] Adding integrations with langchain, gpt-index, minichain, manifest
+- [x] Adding integrations with langchain, gpt-index, minichain, manifest
 - [ ] Expanding validators offering
 - [ ] More compilers from `.rail` -> LLM prompt (e.g. `.rail` -> TypeScript)
 - [ ] Informative logging
-- [ ] Improving reasking logic
+- [x] Improving reasking logic
 - [ ] A guardrails.js implementation
 - [ ] VSCode extension for `.rail` files
 - [ ] Next version of `.rail` format
-- [ ] Add more LLM providers
+- [x] Add more LLM providers

docs/integrations/pydantic_validation.ipynb

Lines changed: 4 additions & 3 deletions
@@ -8,7 +8,7 @@
 "# Validating LLM Outputs with Pydantic\n",
 "\n",
 "!!! note\n",
-"    To download this example as a Jupyter notebook, click [here](https://github.com/ShreyaR/guardrails/blob/main/docs/examples/pydantic_validation.ipynb).\n",
+"    To download this example as a Jupyter notebook, click [here](https://github.com/ShreyaR/guardrails/blob/main/docs/integrations/pydantic_validation.ipynb).\n",
 "\n",
 "In this example, we will use Guardrails with Pydantic.\n",
 "\n",

@@ -38,8 +38,9 @@
 "Ordinarily, we would create an RAIL spec in a separate file. For the purposes of this example, we will create the spec in this notebook as a string following the RAIL syntax. For more information on RAIL, see the [RAIL documentation](../rail/output.md).\n",
 "\n",
 "Here, we define a Pydantic model for a `Person` with the following fields:\n",
-"- `name`: a string\n",
-"- `age`: an integer\n",
+"\n",
+"- `name`: a string \n",
+"- `age`: an integer \n",
 "- `zip_code`: a string zip code\n",
 "\n",
 "and write very simple validators for the fields as an example. As a way to show how LLM reasking can be used to generate data that is consistent with the Pydantic model, we can define a validator that asks for a zip code in California (including being perversely opposed to the \"90210\" zip code). If this validator fails, the LLM will be sent the error message and will reask the question.\n",

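The second hunk above describes the notebook's `Person` model. A rough reconstruction follows for context only; it is not the notebook's exact code, and the field checks and error messages are assumptions (pydantic v1 validator syntax).

```python
from pydantic import BaseModel, validator


class Person(BaseModel):
    name: str
    age: int
    zip_code: str

    @validator("zip_code")
    def zip_code_must_be_in_california(cls, v: str) -> str:
        # Illustrative check: California zip codes start with "9", and the
        # notebook text singles out "90210" as disallowed to force a reask.
        if not v.startswith("9"):
            raise ValueError("zip_code must be a California zip code")
        if v == "90210":
            raise ValueError("zip_code must not be 90210")
        return v
```

When a validator like this fails, the error message is sent back to the LLM and the question is reasked, as the notebook text describes.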
mkdocs.yml

Lines changed: 1 addition & 0 deletions
@@ -43,6 +43,7 @@ nav:
 #   - 'SFW tutoring system for kids': examples/sfw_tutoring.md
   - 'Integrations':
     - 'LangChain': integrations/langchain.ipynb
+    - 'Pydantic': integrations/pydantic_validation.ipynb

 theme:
