# docs: add note about new architecture in readme and docs #127

Merged (2 commits) on Mar 7, 2025.
## README.md (26 additions, 15 deletions)

**ExecuTorch** is a novel framework created by Meta that enables running AI models on devices such as mobile phones or microcontrollers. React Native ExecuTorch bridges the gap between React Native and native platform capabilities, allowing developers to run AI models locally on mobile devices with state-of-the-art performance, without requiring deep knowledge of native code or machine learning internals.

## Compatibility

React Native ExecuTorch supports only the [New React Native architecture](https://reactnative.dev/architecture/landing-page).

If your app still runs on the old architecture, please consider upgrading to the New Architecture.
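
If you need to switch an existing app over, the New Architecture is toggled through standard React Native settings rather than anything specific to this library; a minimal sketch (paths and flags are React Native's own conventions):

```
# android/gradle.properties — enable the New Architecture on Android
newArchEnabled=true

# iOS — reinstall pods with the New Architecture flag set:
#   RCT_NEW_ARCH_ENABLED=1 pod install
```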

## Readymade models 🤖

To run any AI model in ExecuTorch, you need to export it to a `.pte` format. If you're interested in experimenting with your own models, we highly encourage you to check out the [Python API](https://pypi.org/project/executorch/). If you prefer focusing on developing your React Native app, we will cover several common use cases. For more details, please refer to the documentation.
Take a look at how our library can help you build your React Native AI features in our docs:
https://docs.swmansion.com/react-native-executorch

# 🦙 **Quickstart - Running Llama**

**Get started with AI-powered text generation in 3 easy steps!**

### 1️⃣ **Installation**
```bash
# Install the package
yarn add react-native-executorch

# Install iOS native dependencies
cd ios && pod install && cd ..
```

---

### 2️⃣ **Setup & Initialization**

Add this to your component file:

```tsx
import {
  LLAMA3_2_3B_QLORA,
  LLAMA3_2_3B_TOKENIZER,
  useLLM,
} from 'react-native-executorch';

function MyComponent() {
  // Initialize the model 🚀
  const llama = useLLM({
    modelSource: LLAMA3_2_3B_QLORA,
    tokenizerSource: LLAMA3_2_3B_TOKENIZER,
  });
  // ... rest of your component
}
```

---

### 3️⃣ **Run the model!**

```tsx
const handleGenerate = async () => {
  const prompt = 'The meaning of life is';

  // Generate text based on your desired prompt
  const response = await llama.generate(prompt);
  console.log('Llama says:', response);
};
```
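
As a usage illustration, the call above can be wrapped so the app fails gracefully if generation is attempted before the model has loaded. The `LLM` interface and `safeGenerate` helper below are hypothetical names for this sketch, not part of the library's API, and a mock stands in for the real hook:

```typescript
// Hypothetical minimal shape of the object returned by useLLM
// (an assumption for this sketch, not the library's actual type).
interface LLM {
  isReady: boolean;
  generate: (prompt: string) => Promise<string>;
}

// Reject early with a clear message instead of calling into a
// model that has not finished loading.
async function safeGenerate(llama: LLM, prompt: string): Promise<string> {
  if (!llama.isReady) {
    throw new Error('Model is still loading; try again shortly.');
  }
  return llama.generate(prompt);
}

// A mock stand-in so the helper can be exercised without a device.
const mockLlama: LLM = {
  isReady: true,
  generate: async (prompt) => `${prompt} ...`,
};

safeGenerate(mockLlama, 'The meaning of life is').then((response) =>
  console.log(response)
);
```

In the real component, the `isReady` check would come from whatever loading state the hook exposes, so a button tap during model download surfaces an error message instead of a crash.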

## Minimum supported versions

The minimum supported versions are iOS 17.0 and Android 13.

## Examples 📲

https://github.com/user-attachments/assets/27ab3406-c7f1-4618-a981-6c86b53547ee

We currently host two example apps demonstrating use cases of our library:

- `examples/speech-to-text` - Whisper and Moonshine models ready for transcription tasks
- `examples/computer-vision` - computer vision related tasks
- `examples/llama` - chat applications showcasing the use of LLMs
```bash
yarn expo run:ios
```

### Warning

Running LLMs requires a significant amount of RAM. If you are encountering unexpected app crashes, try to increase the amount of RAM allocated to the emulator.
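
On the Android emulator, allocated RAM can be raised by editing the AVD's hardware config; the path and value below are illustrative defaults and may differ per setup:

```
# ~/.android/avd/<your_avd>.avd/config.ini — illustrative path
hw.ramSize=8192   # RAM in MB; raise this if the app crashes while loading a model
```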

## License

This library is licensed under [The MIT License](./LICENSE).

## docs/docs/fundamentals/getting-started.md (6 additions, 0 deletions)

ExecuTorch is a novel AI framework developed by Meta, designed to streamline dep…

React Native ExecuTorch is our way of bringing ExecuTorch into the React Native world. Our API is built to be simple, declarative, and efficient. Plus, we’ll provide a set of pre-exported models for common use cases, so you won’t have to worry about handling exports yourself. With just a few lines of JavaScript, you’ll be able to run AI models (even LLMs 👀) right on your device—keeping user data private and saving on cloud costs.

## Compatibility

React Native ExecuTorch supports only the [New React Native architecture](https://reactnative.dev/architecture/landing-page).

If your app still runs on the old architecture, please consider upgrading to the New Architecture.

## Installation

Installation is pretty straightforward, just use your favorite package manager.