213 changes: 213 additions & 0 deletions src/oss/python/integrations/graphs/timbr.mdx
@@ -0,0 +1,213 @@
---
title: Timbr
---

>[Timbr](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/) connects natural language inputs to Timbr's ontology-driven semantic layer. The SDK works directly with Timbr data models, leveraging their semantic relationships and annotations so that users can query data in business-friendly language.

>Timbr provides a pre-built SQL agent, `TimbrSqlAgent`, which handles the end-to-end flow from user prompt, through semantic SQL query generation and validation, to query execution and result analysis.

>For customization or partial workflows, you can use LangChain chains and LangGraph nodes built around five main tools:

>- `IdentifyTimbrConceptChain` & `IdentifyConceptNode` - Identify relevant concepts from user prompts
>- `GenerateTimbrSqlChain` & `GenerateTimbrSqlNode` - Generate SQL queries from natural language prompts
>- `ValidateTimbrSqlChain` & `ValidateSemanticSqlNode` - Validate SQL queries against Timbr knowledge graph schemas
>- `ExecuteTimbrQueryChain` & `ExecuteSemanticQueryNode` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
>- `GenerateAnswerChain` & `GenerateResponseNode` - Generate human-readable answers based on a given prompt and data rows

>Additionally, `langchain-timbr` provides `TimbrLlmConnector` for manual integration with Timbr's semantic layer using LLM providers.

For a comprehensive example of the `langchain-timbr` integration, see the [demo notebook](https://github.com/WPSemantix/Timbr-GenAI/tree/main/LangChain/LangChain_Timbr_Demo.ipynb).

## Setting up

### Installation

#### Install the package

```bash
pip install langchain-timbr
```

#### Optional: Install with selected LLM provider

Choose one or more of: openai, anthropic, google, azure_openai, snowflake, databricks, vertex_ai (or 'all')

```bash
pip install 'langchain-timbr[<your selected providers, separated by comma without spaces>]'
```

We default to OpenAI models in this guide.

```python
import getpass
import os

if "OPENAI_API_KEY" not in os.environ:
os.environ["OPENAI_API_KEY"] = getpass.getpass("OpenAI API Key:")
```

## Configuration

Starting from `langchain-timbr` v2.0.0, all chains, agents, and nodes support optional environment-based configuration. You can set the following environment variables to provide default values and simplify setup for the provided tools:

### Timbr Connection Parameters

- **TIMBR_URL**: Default Timbr server URL
- **TIMBR_TOKEN**: Default Timbr authentication token
- **TIMBR_ONTOLOGY**: Default ontology/knowledge graph name

When these environment variables are set, the corresponding parameters (`url`, `token`, `ontology`) become optional in all chain and agent constructors and will use the environment values as defaults.

### LLM Configuration Parameters

- **LLM_TYPE**: The type of LLM provider (one of the `langchain_timbr` `LlmTypes` enum values: 'openai-chat', 'anthropic-chat', 'chat-google-generative-ai', 'azure-openai-chat', 'snowflake-cortex', 'chat-databricks')
- **LLM_API_KEY**: The API key for authenticating with the LLM provider
- **LLM_MODEL**: The model name or deployment to use
- **LLM_TEMPERATURE**: Temperature setting for the LLM
- **LLM_ADDITIONAL_PARAMS**: Additional parameters as a dict or JSON string

When LLM environment variables are set, the `llm` parameter becomes optional and will use the `LlmWrapper` with environment configuration.

Example environment setup:

```bash
# Timbr connection
export TIMBR_URL="https://your-timbr-app.com/"
export TIMBR_TOKEN="tk_XXXXXXXXXXXXXXXXXXXXXXXX"
export TIMBR_ONTOLOGY="timbr_knowledge_graph"

# LLM configuration
export LLM_TYPE="openai-chat"
export LLM_API_KEY="your-openai-api-key"
export LLM_MODEL="gpt-4o"
export LLM_TEMPERATURE="0.1"
export LLM_ADDITIONAL_PARAMS='{"max_tokens": 1000}'
```
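With these variables exported, the chain and agent constructors need few or no arguments. A minimal sketch, assuming the environment above is set (parameter-free construction follows from the defaults described in this section):

```python
from langchain_timbr import ExecuteTimbrQueryChain

# url, token, ontology, and llm are all read from the environment variables above,
# so no explicit connection or LLM arguments are needed here.
chain = ExecuteTimbrQueryChain()

result = chain.invoke({"prompt": "What are the total sales for last month?"})
print(result["sql"])
print(result["rows"])
```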

## Querying the semantic layer

We can now use Timbr's chains to query the semantic layer. Import the chain or node you need, or use `TimbrLlmConnector` to integrate with Timbr's semantic layer manually.

```python
from langchain_timbr import ExecuteTimbrQueryChain
from langchain_openai import ChatOpenAI
```

```python
# You can use the standard LangChain ChatOpenAI/ChatAnthropic models
# or any other chat model based on langchain_core.language_models.chat_models.BaseChatModel
llm = ChatOpenAI(model="gpt-4o", temperature=0, openai_api_key='open-ai-api-key')

# Optional alternative: Use Timbr's LlmWrapper, which provides generic connections to different LLM providers
from langchain_timbr import LlmWrapper, LlmTypes
llm = LlmWrapper(llm_type=LlmTypes.OpenAI, api_key="open-ai-api-key", model="gpt-4o")
```

### ExecuteTimbrQueryChain example

```python
execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
    schema="dtimbr",                    # optional
    concept="Sales",                    # optional
    concepts_list=["Sales", "Orders"],  # optional
    views_list=["sales_view"],          # optional
    note="We only need sums",           # optional
    retries=3,                          # optional
    should_validate_sql=True,           # optional
)

result = execute_timbr_query_chain.invoke({"prompt": "What are the total sales for last month?"})
rows = result["rows"]
sql = result["sql"]
concept = result["concept"]
schema = result["schema"]
error = result.get("error", None)

usage_metadata = result.get("execute_timbr_usage_metadata", {})
determine_concept_usage = usage_metadata.get('determine_concept', {})
generate_sql_usage = usage_metadata.get('generate_sql', {})
# Each usage_metadata item contains:
# * 'approximate': Estimated token count calculated before invoking the LLM
# * 'input_tokens'/'output_tokens'/'total_tokens'/etc.: Actual token usage metrics returned by the LLM
```

```output
{'rows': [{'total_sales': 150000}], 'sql': 'SELECT SUM(amount) as total_sales FROM sales WHERE date >= DATEADD(month, -1, GETDATE())', 'concept': 'Sales', 'schema': 'dtimbr'}
```
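To track token spend across the two LLM steps, you can aggregate the usage metadata shown above. A minimal sketch, assuming the `result` from the previous block and that each step reports a `total_tokens` metric:

```python
usage_metadata = result.get("execute_timbr_usage_metadata", {})

# Sum the actual token usage reported by each LLM step
# (e.g. 'determine_concept' and 'generate_sql').
total_tokens = sum(step.get("total_tokens", 0) for step in usage_metadata.values())
print(f"Total tokens used: {total_tokens}")
```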

### Using multiple chains with SequentialChain

You can combine multiple Timbr chains to create more complex workflows.

```python
from langchain.chains import SequentialChain
from langchain_timbr import ExecuteTimbrQueryChain, GenerateAnswerChain

execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url='https://your-timbr-app.com/',
    token='tk_XXXXXXXXXXXXXXXXXXXXXXXX',
    ontology='timbr_knowledge_graph',
)

generate_answer_chain = GenerateAnswerChain(
    llm=llm,
    url='https://your-timbr-app.com/',
    token='tk_XXXXXXXXXXXXXXXXXXXXXXXX',
)

pipeline = SequentialChain(
    chains=[execute_timbr_query_chain, generate_answer_chain],
    input_variables=["prompt"],
    output_variables=["answer", "sql"],
)

result = pipeline.invoke({"prompt": "What are the total sales for last month?"})
```

```output
{'prompt': 'What are the total sales for last month?', 'answer': 'Based on the query results, the total sales for last month amount to $150,000.', 'sql': 'SELECT SUM(amount) as total_sales FROM sales WHERE date >= DATEADD(month, -1, GETDATE())'}
```
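
### Using the pre-built TimbrSqlAgent

As noted in the introduction, the package also ships a pre-built SQL agent, `TimbrSqlAgent`, exposed through the `create_timbr_sql_agent` factory. The sketch below is illustrative only: the parameter names mirror the chain constructors above and the input format is assumed to match them, so check the `langchain-timbr` docs for the authoritative signature.

```python
from langchain_timbr import create_timbr_sql_agent

# Illustrative sketch: parameter names mirror the chain constructors above
# and are assumptions; see the langchain-timbr docs for the exact signature.
agent = create_timbr_sql_agent(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

# The input format is assumed to match the chains above.
result = agent.invoke({"prompt": "What are the total sales for last month?"})
```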

## Using the TimbrLlmConnector

For manual integration with Timbr's semantic layer, you can use the `TimbrLlmConnector` which includes the following methods:

- `get_ontologies` - List Timbr's semantic knowledge graphs
- `get_concepts` - List the concepts of the selected knowledge graph ontology
- `get_views` - List the views of the selected knowledge graph ontology
- `determine_concept` - Identify relevant concepts from user prompts
- `generate_sql` - Generate SQL queries from natural language prompts
- `validate_sql` - Validate SQL queries against Timbr knowledge graph schemas
- `run_timbr_query` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `run_llm_query` - Run the full agent pipeline (determine the concept, generate SQL, and execute the query) from a natural language prompt

```python
from langchain_timbr import TimbrLlmConnector

connector = TimbrLlmConnector(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

# Get available concepts
concepts = connector.get_concepts()
print("Available concepts:", concepts)

# Run a complete query pipeline
result = connector.run_llm_query("What are the top 5 customers by revenue?")
print("Query result:", result)
```
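The lower-level methods can also be composed step by step instead of calling `run_llm_query`. The sketch below is a rough illustration: the method names come from the list above, but their exact arguments and return values are assumptions, so consult the SDK reference before relying on them.

```python
prompt = "What are the top 5 customers by revenue?"

# Illustrative only: argument and return shapes are assumptions based on the
# method descriptions above, not a verified API reference.
concept = connector.determine_concept(prompt)   # identify the relevant concept
sql = connector.generate_sql(prompt)            # generate a semantic SQL query
rows = connector.run_timbr_query(sql)           # execute it against the knowledge graph

print("Concept:", concept)
print("SQL:", sql)
print("Rows:", rows)
```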

## Additional Resources

- [PyPI](https://pypi.org/project/langchain-timbr)
- [GitHub](https://github.com/WPSemantix/langchain-timbr)
- [LangChain Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/)
- [LangGraph Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langgraph-sdk)
8 changes: 8 additions & 0 deletions src/oss/python/integrations/providers/all_providers.mdx
@@ -2664,6 +2664,14 @@ Browse the complete collection of integrations available for Python. LangChain P
Entity resolution and data matching.
</Card>

<Card
title="Timbr"
href="/oss/integrations/providers/timbr"
icon="link"
>
Semantic layer for data integration and querying.
</Card>

<Card
title="Together"
href="/oss/integrations/providers/together"
34 changes: 34 additions & 0 deletions src/oss/python/integrations/providers/timbr.mdx
@@ -0,0 +1,34 @@
---
title: Timbr
---

>What is `Timbr`?

>- Timbr is a `semantic SQL knowledge graph platform` that specializes in connecting data through ontology-driven semantic layers.
>- Timbr allows you to represent and query data using business-friendly language, making it ideal for handling complex data relationships and business logic.
>- Timbr provides `natural language to SQL` capabilities, making it easy to interact with your data using plain English queries.
>- With Timbr, you can achieve high-performance `semantic data querying`, suitable for production-level analytics and business intelligence.

>Get started with Timbr by visiting [their website](https://docs.timbr.ai/doc/docs/getting-started/intro-timbr/).

## Installation and Setup

- Install the Python SDK with `pip install langchain-timbr`

### Optional: Install with selected LLM provider

Choose one or more of: openai, anthropic, google, azure_openai, snowflake, databricks, vertex_ai (or 'all')

```bash
pip install 'langchain-timbr[<your selected providers, separated by comma without spaces>]'
```

## Semantic SQL Queries

There exists a wrapper around Timbr's semantic layer that allows you to generate SQL statements based on natural language input and use them to retrieve relevant information from your knowledge graph.

```python
from langchain_timbr import create_timbr_sql_agent, ExecuteTimbrQueryChain, GenerateTimbrSqlChain
```

See a [usage example](/oss/integrations/graphs/timbr).