
Commit 1b90f03

wpbarco, Copilot, lnhsingh, and mdrxy authored
integration: langchain-timbr provider documentation (#1161)
## Overview

This PR adds Timbr as a new integration provider to the LangChain documentation. Timbr is a semantic SQL knowledge graph platform that enables natural language to SQL query generation using ontology-driven semantic layers. The documentation covers:

* Timbr provider overview page with installation instructions
* Integration guide showing usage of Timbr's chains and agents
* Complete setup and configuration examples
* Environment-based configuration support
* Multiple usage patterns: pre-built agent, individual chains, and manual connector

## Type of change

**Type:** Add new integration documentation

## Related issues/PRs

* Related integration files in the langchain-timbr package
* [Timbr documentation](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/)

## Checklist

- [X] I have read the contributing guidelines
- [X] I have tested my changes locally using make lint_md
- [X] All code examples follow LangChain documentation standards
- [X] I have used root relative paths for internal links
- [ ] I have updated navigation in `src/docs.json` if needed (not applicable - added sections to existing pages)
- [ ] I have gotten approval from the relevant reviewers
- [ ] (Internal team members only / optional) I have created a preview deployment using the [Create Preview Branch workflow](https://github.com/langchain-ai/docs/actions/workflows/create-preview-branch.yml)

## Additional notes

**Changes made:**

1. `src/oss/python/integrations/providers/timbr.mdx` (new file):
   * Provider overview describing Timbr's semantic SQL capabilities
   * Installation instructions with optional LLM provider selection
   * Import examples for main chains and agent
   * Cross-reference to detailed usage guide
2. `src/oss/python/integrations/graphs/timbr.mdx` (new file):
   * Comprehensive integration guide with setup instructions
   * Environment-based configuration documentation
   * Examples for main tools:
     * ExecuteTimbrQueryChain for query execution
     * GenerateAnswerChain for response generation
     * TimbrLlmConnector for manual integration
   * LLM configuration examples using both standard LangChain models and Timbr's LlmWrapper
   * Links to additional resources (PyPI, GitHub, official docs)
3. `src/oss/python/integrations/providers/all_providers.mdx` (updated):
   * Added Timbr card in alphabetical order
   * Description: "Semantic layer for data integration and querying"
   * Link to Timbr provider page

## Key features documented

* **Semantic SQL generation:** Natural language to SQL using ontology-driven semantic layers
* **Environment configuration:** Optional environment variables for simplified setup
* **Multiple LLM support:** Compatible with OpenAI, Anthropic, Google, Azure, Snowflake, Databricks, and Vertex AI
* **Flexible usage patterns:** Pre-built agent, individual chains, or manual connector
* **Usage metadata:** Token counting and tracking for LLM interactions

---------

Co-authored-by: Copilot <[email protected]>
Co-authored-by: Lauren Hirata Singh <[email protected]>
Co-authored-by: Mason Daugherty <[email protected]>
Co-authored-by: Mason Daugherty <[email protected]>
1 parent ee64a2f commit 1b90f03

File tree

3 files changed: +255 -0 lines changed
src/oss/python/integrations/graphs/timbr.mdx

Lines changed: 213 additions & 0 deletions

@@ -0,0 +1,213 @@
---
title: Timbr
---

>[Timbr](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/) integrates natural language inputs with Timbr's ontology-driven semantic layer. Leveraging Timbr's robust ontology capabilities, the SDK integrates with Timbr data models and leverages semantic relationships and annotations, enabling users to query data using business-friendly language.

>Timbr provides a pre-built SQL agent, `TimbrSqlAgent`, which handles the flow end to end: from user prompt, through semantic SQL query generation and validation, to query execution and result analysis.

>For customizations and partial usage, you can use LangChain chains and LangGraph nodes with our 5 main tools:

>- `IdentifyTimbrConceptChain` & `IdentifyConceptNode` - Identify relevant concepts from user prompts
>- `GenerateTimbrSqlChain` & `GenerateTimbrSqlNode` - Generate SQL queries from natural language prompts
>- `ValidateTimbrSqlChain` & `ValidateSemanticSqlNode` - Validate SQL queries against Timbr knowledge graph schemas
>- `ExecuteTimbrQueryChain` & `ExecuteSemanticQueryNode` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
>- `GenerateAnswerChain` & `GenerateResponseNode` - Generate human-readable answers based on a given prompt and data rows

>Additionally, `langchain-timbr` provides `TimbrLlmConnector` for manual integration with Timbr's semantic layer using LLM providers.

For a comprehensive example of the `langchain-timbr` integration, see the [demo notebook](https://github.com/WPSemantix/Timbr-GenAI/tree/main/LangChain/LangChain_Timbr_Demo.ipynb).

## Setting up

### Installation

#### Install the package

```bash
pip install langchain-timbr
```

#### Optional: Install with selected LLM provider

Choose one or more of: `openai`, `anthropic`, `google`, `azure_openai`, `snowflake`, `databricks`, `vertex_ai` (or `all`).

```bash
pip install 'langchain-timbr[<your selected providers, separated by comma without spaces>]'
```
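
For example, to install with just the OpenAI extra:

```bash
pip install 'langchain-timbr[openai]'
```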

We default to OpenAI models in this guide.

```python
import getpass
import os

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API Key:")
```

## Configuration

Starting from `langchain-timbr` v2.0.0, all chains, agents, and nodes support optional environment-based configuration. You can set the following environment variables to provide default values and simplify setup for the provided tools:

### Timbr connection parameters

- `TIMBR_URL`: Default Timbr server URL
- `TIMBR_TOKEN`: Default Timbr authentication token
- `TIMBR_ONTOLOGY`: Default ontology/knowledge graph name

When these environment variables are set, the corresponding parameters (`url`, `token`, `ontology`) become optional in all chain and agent constructors and will use the environment values as defaults.

### LLM configuration parameters

- `LLM_TYPE`: The LLM provider type (one of the `langchain_timbr` `LlmTypes` enum values: 'openai-chat', 'anthropic-chat', 'chat-google-generative-ai', 'azure-openai-chat', 'snowflake-cortex', 'chat-databricks')
- `LLM_API_KEY`: The API key for authenticating with the LLM provider
- `LLM_MODEL`: The model name or deployment to use
- `LLM_TEMPERATURE`: Temperature setting for the LLM
- `LLM_ADDITIONAL_PARAMS`: Additional parameters as a dict or JSON string

When the LLM environment variables are set, the `llm` parameter becomes optional and will use the `LlmWrapper` with environment configuration.

Example environment setup:

```bash
# Timbr connection
export TIMBR_URL="https://your-timbr-app.com/"
export TIMBR_TOKEN="tk_XXXXXXXXXXXXXXXXXXXXXXXX"
export TIMBR_ONTOLOGY="timbr_knowledge_graph"

# LLM configuration
export LLM_TYPE="openai-chat"
export LLM_API_KEY="your-openai-api-key"
export LLM_MODEL="gpt-4o"
export LLM_TEMPERATURE="0.1"
export LLM_ADDITIONAL_PARAMS='{"max_tokens": 1000}'
```
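
With the variables above exported, chains can be constructed without explicit connection or LLM arguments. The following minimal sketch assumes the environment shown above is in place and relies on the behavior described in the configuration notes:

```python
from langchain_timbr import ExecuteTimbrQueryChain

# TIMBR_URL, TIMBR_TOKEN, and TIMBR_ONTOLOGY supply the connection defaults,
# and the LLM_* variables configure an LlmWrapper automatically, so no
# url/token/ontology/llm arguments are passed here (see notes above).
execute_timbr_query_chain = ExecuteTimbrQueryChain()

result = execute_timbr_query_chain.invoke({"prompt": "What are the total sales for last month?"})
print(result["sql"])
```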

## Querying the semantic layer

We can now use Timbr's chains to query the semantic layer. Import and use your intended chain/node, or use `TimbrLlmConnector` to manually integrate with Timbr's semantic layer.

```python
from langchain_timbr import ExecuteTimbrQueryChain
from langchain_openai import ChatOpenAI
```

```python
# You can use the standard LangChain ChatOpenAI/ChatAnthropic models
# or any other chat model based on langchain_core.language_models.chat_models.BaseChatModel
llm = ChatOpenAI(model="gpt-4o", temperature=0, api_key="open-ai-api-key")

# Optional alternative: Use Timbr's LlmWrapper, which provides generic connections to different LLM providers
from langchain_timbr import LlmWrapper, LlmTypes

llm = LlmWrapper(llm_type=LlmTypes.OpenAI, api_key="open-ai-api-key", model="gpt-4o")
```
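
The comment above also mentions `ChatAnthropic`; as an illustrative alternative (the model name is an example and `ANTHROPIC_API_KEY` is assumed to be set), it can be swapped in the same way:

```python
from langchain_anthropic import ChatAnthropic

# Assumes ANTHROPIC_API_KEY is set in the environment; the model name is illustrative.
llm = ChatAnthropic(model="claude-3-5-sonnet-latest", temperature=0)
```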

### ExecuteTimbrQueryChain example

```python
execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
    schema="dtimbr",  # optional
    concept="Sales",  # optional
    concepts_list=["Sales", "Orders"],  # optional
    views_list=["sales_view"],  # optional
    note="We only need sums",  # optional
    retries=3,  # optional
    should_validate_sql=True,  # optional
)

result = execute_timbr_query_chain.invoke({"prompt": "What are the total sales for last month?"})
rows = result["rows"]
sql = result["sql"]
concept = result["concept"]
schema = result["schema"]
error = result.get("error", None)

usage_metadata = result.get("execute_timbr_usage_metadata", {})
determine_concept_usage = usage_metadata.get("determine_concept", {})
generate_sql_usage = usage_metadata.get("generate_sql", {})
# Each usage_metadata item contains:
# * 'approximate': Estimated token count calculated before invoking the LLM
# * 'input_tokens'/'output_tokens'/'total_tokens'/etc.: Actual token usage metrics returned by the LLM
```

```output
{'rows': [{'total_sales': 150000}], 'sql': 'SELECT SUM(amount) as total_sales FROM sales WHERE date >= DATEADD(month, -1, GETDATE())', 'concept': 'Sales', 'schema': 'dtimbr'}
```

### Using multiple chains with SequentialChain

You can combine multiple Timbr chains to create more complex workflows.

```python
from langchain.chains import SequentialChain
from langchain_timbr import ExecuteTimbrQueryChain, GenerateAnswerChain

execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url='https://your-timbr-app.com/',
    token='tk_XXXXXXXXXXXXXXXXXXXXXXXX',
    ontology='timbr_knowledge_graph',
)

generate_answer_chain = GenerateAnswerChain(
    llm=llm,
    url='https://your-timbr-app.com/',
    token='tk_XXXXXXXXXXXXXXXXXXXXXXXX',
)

pipeline = SequentialChain(
    chains=[execute_timbr_query_chain, generate_answer_chain],
    input_variables=["prompt"],
    output_variables=["answer", "sql"],
)

result = pipeline.invoke({"prompt": "What are the total sales for last month?"})
```

```output
{'prompt': 'What are the total sales for last month?', 'answer': 'Based on the query results, the total sales for last month amount to $150,000.', 'sql': 'SELECT SUM(amount) as total_sales FROM sales WHERE date >= DATEADD(month, -1, GETDATE())'}
```
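
`GenerateAnswerChain` can also be invoked on its own once you have rows from a previous step. The sketch below reuses `generate_answer_chain` and the `rows` variable from the examples above; the input keys (`prompt`, `rows`) are inferred from the SequentialChain wiring and are an assumption, so check the Timbr docs for the exact interface.

```python
# Assumed input keys ("prompt", "rows"), inferred from the SequentialChain wiring above.
answer_result = generate_answer_chain.invoke({
    "prompt": "What are the total sales for last month?",
    "rows": rows,  # rows returned by ExecuteTimbrQueryChain in the earlier example
})
print(answer_result["answer"])
```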

## Using the TimbrLlmConnector

For manual integration with Timbr's semantic layer, you can use the `TimbrLlmConnector`, which provides the following methods:

- `get_ontologies` - List Timbr's semantic knowledge graphs
- `get_concepts` - List the concepts of the selected knowledge graph ontology
- `get_views` - List the views of the selected knowledge graph ontology
- `determine_concept` - Identify relevant concepts from user prompts
- `generate_sql` - Generate SQL queries from natural language prompts
- `validate_sql` - Validate SQL queries against Timbr knowledge graph schemas
- `run_timbr_query` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `run_llm_query` - Execute the agent pipeline to determine the concept, generate SQL, and run the query from a natural language prompt

```python
from langchain_timbr import TimbrLlmConnector

connector = TimbrLlmConnector(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

# Get available concepts
concepts = connector.get_concepts()
print("Available concepts:", concepts)

# Run a complete query pipeline
result = connector.run_llm_query("What are the top 5 customers by revenue?")
print("Query result:", result)
```
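
The individual connector methods can also be composed manually. The sketch below assumes each method takes the argument suggested by its description (a prompt string for `determine_concept` and `generate_sql`, a SQL string for `validate_sql` and `run_timbr_query`); these signatures are assumptions, so consult the Timbr docs for the exact interface.

```python
# Hypothetical step-by-step pipeline built from the connector methods listed above.
# Argument shapes are assumed from the method descriptions and may differ.
prompt = "What are the top 5 customers by revenue?"

concept = connector.determine_concept(prompt)  # identify the relevant concept
sql = connector.generate_sql(prompt)  # generate a semantic SQL query
is_valid = connector.validate_sql(sql)  # validate against the knowledge graph schema
rows = connector.run_timbr_query(sql)  # execute the query

print(concept, sql, is_valid, rows)
```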

## Additional resources

- [PyPI](https://pypi.org/project/langchain-timbr)
- [GitHub](https://github.com/WPSemantix/langchain-timbr)
- [LangChain Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/)
- [LangGraph Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langgraph-sdk)

src/oss/python/integrations/providers/all_providers.mdx

Lines changed: 8 additions & 0 deletions
@@ -2672,6 +2672,14 @@ Browse the complete collection of integrations available for Python. LangChain P
  Entity resolution and data matching.
</Card>

<Card
  title="Timbr"
  href="/oss/integrations/providers/timbr"
  icon="link"
>
  Semantic layer for data integration and querying.
</Card>

<Card
  title="Together"
  href="/oss/integrations/providers/together"
src/oss/python/integrations/providers/timbr.mdx

Lines changed: 34 additions & 0 deletions

@@ -0,0 +1,34 @@
---
title: Timbr
---

>What is Timbr?

>- Timbr is a semantic SQL knowledge graph platform that specializes in connecting data through ontology-driven semantic layers.
>- Timbr allows you to represent and query data using business-friendly language, making it ideal for handling complex data relationships and business logic.
>- Timbr provides natural language to SQL capabilities, making it easy to interact with your data using plain English queries.
>- With Timbr, you can achieve high-performance semantic data querying, suitable for production-level analytics and business intelligence.

>Get started with Timbr by visiting [their website](https://docs.timbr.ai/doc/docs/getting-started/intro-timbr/).

## Installation and Setup

- Install the Python SDK with `pip install langchain-timbr`

### Optional: Install with selected LLM provider

Choose one or more of: `openai`, `anthropic`, `google`, `azure_openai`, `snowflake`, `databricks`, `vertex_ai` (or `all`).

```bash
pip install 'langchain-timbr[<your selected providers, separated by comma without spaces>]'
```

## Semantic SQL Queries

Timbr provides a wrapper around its semantic layer that generates SQL statements based on natural language input and retrieves relevant information from your knowledge graph.

```python
from langchain_timbr import create_timbr_sql_agent, ExecuteTimbrQueryChain, GenerateTimbrSqlChain
```
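
As a rough sketch, the pre-built agent can be constructed from the same connection details used by the chains in the usage guide; the parameter names below mirror those chain constructors and are assumptions rather than the definitive interface, so see the linked guide and the Timbr docs for exact details.

```python
from langchain_openai import ChatOpenAI

# Hypothetical sketch: parameter names mirror the chain constructors in the usage guide.
llm = ChatOpenAI(model="gpt-4o", temperature=0)

agent = create_timbr_sql_agent(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

result = agent.invoke({"prompt": "What are the total sales for last month?"})
```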

See a [usage example](/oss/integrations/graphs/timbr) for more details.
