3 changes: 0 additions & 3 deletions docs/docs/communityhub/release_notes.md
@@ -5,9 +5,6 @@ This document provides a summary of new features, improvements, and bug fixes in

## jaclang 0.8.10 / jac-cloud 0.2.10 / byllm 0.4.5 (Unreleased)

- **byLLM Lazy Loading**: Refactored byLLM to support lazy loading by moving all exports to `byllm.lib` module. Users should now import from `byllm.lib` in Python (e.g., `from byllm.lib import Model, by`) and use `import from byllm.lib { Model }` in Jac code. This improves startup performance and reduces unnecessary module loading.
- **NonGPT Fallback for byLLM**: Implemented automatic fallback when byLLM is not installed. When code attempts to import `byllm`, the system will provide mock implementations that return random values using the `NonGPT.random_value_for_type()` utility.

## jaclang 0.8.9 / jac-cloud 0.2.9 / byllm 0.4.4 (Latest Release)

- **Typed Context Blocks (OSP)**: Fully implemented typed context blocks (`-> NodeType { }` and `-> WalkerType { }`) for Object-Spatial Programming, enabling conditional code execution based on runtime types.
18 changes: 9 additions & 9 deletions docs/docs/jac_book/chapter_5.md
@@ -82,7 +82,7 @@ pip install byllm
Next, we replace the OpenAI import with the one from the byLLM package:

```jac
import from byllm.lib { Model }
import from byllm { Model }
glob llm = Model(model_name="gpt-4.1-mini");
```
<br />
@@ -94,7 +94,7 @@ def write_poetry(topic: str) -> str by llm();
Finally, let's put it all together and run the Jac code:
```jac
# mt_poem.jac - Simple AI integration
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4.1-mini");

@@ -137,7 +137,7 @@ Next we'll make use of byLLM's `Image` function to handle image inputs. This fu

```jac
# image_captioning.jac - Simple Image Captioning Tool
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

glob llm = Model(model_name="gpt-4o-mini");

@@ -169,7 +169,7 @@ byLLM supports various AI models through the unified `Model` interface. For exam

```jac
# basic_setup.jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

# Configure different models
glob text_model = Model(model_name="gpt-4o");
@@ -182,7 +182,7 @@ glob gemini_model = Model(model_name="gemini-2.0-flash");
The `Model` class allows you to configure various parameters for your AI model, such as temperature, max tokens, and more. Here's an example of how to set up a model with custom parameters:

```jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

# Configure model with custom parameters
glob creative_model = Model(
@@ -210,7 +210,7 @@ Below is a breakdown of the parameters you can configure when creating a `Model`
Here we have a simple example of how to use the `Model` class to create a model instance with custom parameters:
```jac
# model_config.jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

# Configure model with custom parameters
glob creative_model = Model(
@@ -252,7 +252,7 @@ Let's progressively build an image captioning tool that demonstrates byLLM's cap

```jac
# image_captioner.jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

glob vision_llm = Model(model_name="gpt-4o-mini");

@@ -307,7 +307,7 @@ the stylish outfit of the dog contribute to a fun and lighthearted atmosphere.

```jac
# enhanced_captioner.jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

glob vision_llm = Model(model_name="gpt-4.1-mini");

@@ -363,7 +363,7 @@ AI applications require robust error handling and testing strategies.

```jac
# robust_ai.jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

glob reliable_llm = Model(model_name="gpt-4o", max_tries=3);

4 changes: 2 additions & 2 deletions docs/docs/jac_book/chapter_9.md
@@ -197,7 +197,7 @@ Using the byLLM plugin, we can define a function that sends the agent's state to

Here’s the code for our mood function:
```jac
import from byllm.lib { Model }
import from byllm { Model }

# Configure the LLM
glob npc_model = Model(model_name="gpt-4.1-mini");
@@ -242,7 +242,7 @@ The `NPCWalker` first inherits the behavior of `StateAgent` (which collects cont

Finally, we can compose everything in a single entry point:
```jac
import from byllm.lib { Model }
import from byllm { Model }

# Configure different models
glob npc_model = Model(model_name="gpt-4.1-mini");
2 changes: 1 addition & 1 deletion docs/docs/learn/beginners_guide_to_jac.md
@@ -2678,7 +2678,7 @@ Now that you have the foundation, here are advanced Jac features to explore:
<div class="code-block">

```jac
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4");

@@ -1,5 +1,5 @@

import from byllm.lib { Model }
import from byllm { Model }
import from pathlib { Path }

glob llm = Model(model_name="gpt-4o-mini");
2 changes: 1 addition & 1 deletion docs/docs/learn/examples/littleX/src/byllm_example.jac
@@ -1,4 +1,4 @@
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(host="http://127.0.0.1:11434", model_name="ollama/llama3.2:1b");

2 changes: 1 addition & 1 deletion docs/docs/learn/examples/littleX/src/example_game.jac
@@ -1,4 +1,4 @@
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name="gpt-4o");

@@ -59,7 +59,7 @@ obj Chat {
Configure the LLM for AI operations:

```jac
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name="gpt-4o");
```
2 changes: 1 addition & 1 deletion docs/docs/learn/examples/mtp_examples/rpg_game.md
@@ -114,7 +114,7 @@ We’ll connect to an LLM (GPT-4o here) and define AI-powered methods for genera
At the top of `level_manager.jac`, import the model:

```jac
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4o", verbose=True);
```
2 changes: 1 addition & 1 deletion docs/docs/learn/jac-byllm/litellm_proxy.md
@@ -11,7 +11,7 @@ Reference: [https://docs.litellm.ai/docs/proxy/deploy](https://docs.litellm.ai/d
Once the proxy server is set up and running, you can connect to it by passing the proxy server's URL to the byLLM model via the `proxy_url` parameter:

```python
from byllm.lib import Model
from byllm import Model

llm = Model(
model_name="gpt-4o", # The model name to be used
4 changes: 2 additions & 2 deletions docs/docs/learn/jac-byllm/multimodality.md
@@ -13,7 +13,7 @@ pip install byllm[video]
byLLM supports image inputs through the `Image` format. Images can be provided as input to byLLM functions or methods:

```jac
import from byllm.lib { Model, Image }
import from byllm { Model, Image }

glob llm = Model(model_name="gpt-4o");

@@ -57,7 +57,7 @@ In this example, an image of a person is provided as input to the `get_person_in
byLLM supports video inputs through the `Video` format. Videos can be provided as input to byLLM functions or methods:

```jac
import from byllm.lib { Model, Video }
import from byllm { Model, Video }

glob llm = Model(model_name="gpt-4o");

9 changes: 4 additions & 5 deletions docs/docs/learn/jac-byllm/python_integration.md
@@ -15,7 +15,7 @@ byLLM functionality is accessed by importing the `byllm` module and using the `b
```python linenums="1"
import jaclang
from dataclasses import dataclass
from byllm.lib import Model, Image, by
from byllm import Model, Image, by

llm = Model(model_name="gpt-4o")

@@ -43,7 +43,7 @@ print(f"Name: {person.full_name}, Description: {person.description}, Year of Bir
In Jaclang, hyper-parameters are set by passing them to the LLM model:

```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4o")

@@ -56,7 +56,7 @@ In Python, hyper-parameters are passed as follows:

```python linenums="1"
import jaclang
from byllm.lib import Model, by
from byllm import Model, by

llm = Model(model_name="gpt-4o")

@@ -70,7 +70,7 @@ Python functions can be used as tools in byLLM. Functions defined in Python are

```python linenums="1"
import jaclang
from byllm.lib import Model
from byllm import Model
llm = Model(model_name="gpt-4o")


@@ -98,7 +98,6 @@ Using `sem` functionality in Python is a bit different, as the attachment is done

```python
from jaclang import JacMachineInterface as Jac
from byllm.lib import Model, by

@Jac.sem('<Person Semstring>', {
'name' : '<name semstring>',
4 changes: 2 additions & 2 deletions docs/docs/learn/jac-byllm/quickstart.md
@@ -44,7 +44,7 @@ The `by` keyword abstraction enables functions to process inputs of any type and
#### Step 1: Configure LLM Model

```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name="gemini/gemini-2.0-flash");
```
@@ -77,7 +77,7 @@ As byLLM is a Python package, it can be natively used in Jac. The following code

```python linenums="1"
import jaclang
from byllm.lib import Model, by
from byllm import Model, by
from enum import Enum

llm = Model(model_name="gemini/gemini-2.0-flash")
14 changes: 7 additions & 7 deletions docs/docs/learn/jac-byllm/usage.md
@@ -8,31 +8,31 @@ byLLM uses [LiteLLM](https://docs.litellm.ai/docs) to provide integration with a

=== "OpenAI"
```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name = "gpt-4o")
```
=== "Gemini"
```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name = "gemini/gemini-2.0-flash")
```
=== "Anthropic"
```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name = "claude-3-5-sonnet-20240620")
```
=== "Ollama"
```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name = "ollama/llama3:70b")
```
=== "HuggingFace Models"
```jac linenums="1"
import from byllm.lib {Model}
import from byllm {Model}

glob llm = Model(model_name = "huggingface/meta-llama/Llama-3.3-70B-Instruct")
```
@@ -270,7 +270,7 @@ In this example:
The ReAct (Reasoning and Acting) method enables agentic behavior by allowing functions to reason about problems and use external tools. Functions can be made agentic by adding the `by llm(tools=[...])` declaration.

```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }
import from datetime { datetime }

glob llm = Model(model_name="gpt-4o");
@@ -304,7 +304,7 @@ The streaming feature enables real-time token reception from LLM functions, usef
Set `stream=True` in the invoke parameters to enable streaming:

```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4o-mini");

12 changes: 6 additions & 6 deletions docs/docs/learn/jac-byllm/with_llm.md
@@ -43,7 +43,7 @@ pip install byllm
Consider building an application that translates English to other languages using an LLM. This can be built simply as follows:
=== "Jac"
```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4o");

@@ -56,7 +56,7 @@ Consider building an application that translates English to other languages usin
```
=== "python"
```python linenums="1"
from byllm.lib import Model, by
from byllm import Model, by

llm = Model(model_name="gpt-4o")

@@ -75,7 +75,7 @@ Consider a program that detects the personality type of a historical figure from

=== "Jac"
```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }
glob llm = Model(model_name="gemini/gemini-2.0-flash");

enum Personality {
@@ -94,7 +94,7 @@ Consider a program that detects the personality type of a historical figure from
```
=== "Python"
```python linenums="1"
from byllm.lib import Model, by
from byllm import Model, by
from enum import Enum
llm = Model(model_name="gemini/gemini-2.0-flash")

@@ -119,7 +119,7 @@ Even if we are eliminating prompt engineering entirely, we allow specific ways t

=== "Jac"
```jac linenums="1"
import from byllm.lib { Model }
import from byllm { Model }
glob llm = Model(model_name="gemini/gemini-2.0-flash");

"""Represents the personal record of a person"""
@@ -140,7 +140,7 @@ Even if we are eliminating prompt engineering entirely, we allow specific ways t
```python linenums="1"
from jaclang import JacMachineInterface as Jac
from dataclasses import dataclass
from byllm.lib import Model, by
from byllm import Model, by
llm = Model(model_name="gemini/gemini-2.0-flash")

@Jac.sem('', { 'name': 'Full name of the person',
2 changes: 1 addition & 1 deletion docs/docs/learn/tour.md
@@ -35,7 +35,7 @@ This snippet natively imports Python packages `math` and `random` and runs ident
Jac provides novel constructs for integrating LLMs into code. A function body can simply be replaced with a call to an LLM, removing the need for prompt engineering or extensive use of new libraries.

```jac
import from byllm.lib { Model }
import from byllm { Model }
glob llm = Model(model_name="gpt-4o");

enum Personality {
4 changes: 2 additions & 2 deletions jac-byllm/README.md
@@ -26,7 +26,7 @@ pip install byllm
Consider building an application that translates English to other languages using an LLM. This can be built simply as follows:

```python
import from byllm.lib { Model }
import from byllm { Model }

glob llm = Model(model_name="gpt-4o");

@@ -45,7 +45,7 @@ This simple piece of code replaces traditional prompt engineering without introd
Consider a program that detects the personality type of a historical figure from their name. This can be built in a way that the LLM picks from an enum and the output strictly adheres to this type.

```python
import from byllm.lib { Model }
import from byllm { Model }
glob llm = Model(model_name="gemini/gemini-2.0-flash");

enum Personality {
11 changes: 10 additions & 1 deletion jac-byllm/byllm/__init__.py
@@ -1 +1,10 @@
"""byLLM Package - Lazy Loading."""
"""byLLM Package."""

from byllm.llm import Model
from byllm.mtir import MTIR
from byllm.plugin import JacMachine
from byllm.types import Image, MockToolCall, Video

by = JacMachine.by

__all__ = ["by", "Image", "MockToolCall", "Model", "MTIR", "Video"]
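
For reference, a minimal sketch of how the restored top-level exports can be used from Python, mirroring the Python integration examples earlier in this diff. The model name and the `summarize` function are illustrative only, and the `@by(llm)` decorator pattern is assumed from those examples rather than defined by this change:

```python
# Minimal usage sketch of the eager top-level exports restored in __init__.py.
# Assumes an API key is configured for the chosen provider; model name is illustrative.
import jaclang  # activates the Jac machinery, as in the docs examples above
from byllm import Model, by

llm = Model(model_name="gpt-4o-mini")


@by(llm)
def summarize(text: str) -> str:
    """Summarize the given text in one short sentence."""
    ...


print(summarize("byLLM exposes Model, by, Image, Video, MockToolCall, and MTIR at the package top level."))
```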