An open-source platform that enables students and teachers to interact with classroom materials through Retrieval-Augmented Generation (RAG). Upload documents, chat with course content, and streamline academic Q&A using powerful language models—all in one collaborative interface.
ClassroomLM provides each classroom within an organization with a specialized LLM assistant that is specific and accurate to that classroom's subject matter and resources.

> Makes sense for classrooms in primary education or universities, and even for other adjacent organizations, like clubs and student associations, that want an easy way to give their members an AI assistant specific to their documents.

The core feature and main value of ClassroomLM is its framework of siloed, per-classroom knowledge bases on which to conduct Retrieval-Augmented Generation (RAG), with additional features like collaborative chat layered on top.

[video walkthrough of everything]

## Features
### **Classroom-Style Structure**
Teachers can create classrooms, upload documents (PDFs, slides, handwritten notes), and invite students.

Diagram of docs to classes (maybe add it side by side to the main classroom page screenshot?)

### **Classroom-Specific AI Assistants**
Each classroom has access to an LLM assistant enabled by RAG, making it more specific and accurate, and better at retrieving information from the class's resources.
#### **Advantages over current user-facing AI assistants, with use case examples**
- **More accurate and specific**: can operate in specific, or even specially created, contexts just for classrooms.

> **Use case example**: An NYU professor created a variation of assembly specifically for the classroom, called e20. Putting the e20 manual into the shared classroom dataset gave every student in the classroom access to an assistant that is specialized, knowledgeable, and has full context on this niche, never-before-seen language personally created by the professor. \
> By comparison, existing systems gave vague, non-specific, and inaccurate answers relevant to other assembly variants.
- **Logistical and practical benefits**: knowledge bases are shared across the entire classroom
  - Rather than every student in a classroom having to upload documents individually for each of their classes, keep them up to date with new resources, and separate them from other classes, a shared (but siloed) knowledge base per classroom reduces overhead and friction and makes better use of resources.
- **Diverse ability and**
  - Humanities, etc.

### **Collaborative AI Chats**

<!-- Building conversation context | Continues and LLM triggered with /ask | LLM responds -->

Group chat support with other students, where the AI can participate with full chat context.

- Students can create multiple chatrooms per classroom and choose who to invite within each chatroom.
- Within the chatroom, students can pull the LLM into the conversation in the manner of a group chat with the **`/ask [message]`** command.
- The assistant in this case retains all the benefits described above for the personal chat, as it is also RAG-enabled.
> To be clear, this isn't the common implementation of a "group chat with an assistant" often found in company Slacks, etc., where the LLM just responds to the message that triggered it. In that case, it's only more useful than asking the LLM personally because the answer is visible to everyone. Here, when triggered with the `/ask` command, the LLM has knowledge of the entire previous conversation and responds accordingly, as if it's just part of the conversation.
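The `/ask` flow described above can be sketched roughly like this (a minimal illustration; the type and function names are hypothetical, not ClassroomLM's actual API):

```typescript
// Hypothetical sketch of the /ask trigger: when a chatroom message starts
// with "/ask", the assistant is prompted with the entire room history,
// not just the triggering message.

type Message = { sender: string; text: string };

function buildAssistantPrompt(
  history: Message[],
  trigger: string
): string | null {
  // Only messages beginning with /ask pull the LLM into the conversation.
  if (!trigger.startsWith("/ask")) return null;
  const question = trigger.slice("/ask".length).trim();
  // Include the full prior conversation so the LLM answers in context.
  const context = history.map((m) => `${m.sender}: ${m.text}`).join("\n");
  return `${context}\n\nQuestion: ${question}`;
}
```

In the real app, the resulting prompt goes to the classroom's RAG-enabled assistant, which also retrieves from the classroom's shared knowledge base.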
## Tech Stack

Supabase stores user information, classroom/chatroom metadata, and collaborative chat history, acting as a realtime database.

The app itself is built with Node.js and uses shadcn/ui for its UI components.

Then on the side we have the dev stack:

- **Framework**: Next.js 15 with App Router
- **Language**: TypeScript
- **Database**: Supabase
- **Deployment**: Docker and Kubernetes support
## Usage
For both development and deployment, the **instructions below need to be followed** to ensure you have RagFlow and Supabase instances running. The only difference is that for development you can run local versions of both.
### 1. Set up [RagFlow](https://github.com/infiniflow/ragflow)
Follow [the instructions on the RagFlow docs](https://ragflow.io/docs/dev/) to **deploy and configure** it. This includes choosing the LLM to use, with many supported options to choose from.\
Note that the deployment method detailed in the docs uses Docker Compose. Alternatively, they also provide a [helm chart](https://github.com/infiniflow/ragflow/tree/main/helm) to deploy RagFlow onto a Kubernetes cluster.

> Note: since we're deploying our web app onto port 8080 as per our [Dockerfile](https://github.com/TechAtNYU/dev-team-spring-25/blob/main/Dockerfile), depending on whether or not your RagFlow engine is deployed on the same machine/network as the ClassroomLM application, you should consider changing the port for RagFlow's interface.
> Follow the instructions [here to update the HTTP and HTTPS port](<https://ragflow.io/docs/dev/configurations#:~:text=To%20update%20the%20default%20HTTP%20serving%20port%20(80)%2C>) away from 80 and 443 if you don't want RagFlow's web interface to occupy them.
#### Create a RagFlow API Key
Follow the [instructions on the RagFlow docs](https://ragflow.io/docs/dev/acquire_ragflow_api_key) to create an API key.
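As a quick sanity check, you can hit RagFlow's HTTP API with the new key (a sketch; `<ragflow-host>` is a placeholder for wherever your RagFlow instance is reachable):

```shell
# List datasets to verify the key works; expect a JSON response, not a 401
curl -s -H "Authorization: Bearer $RAGFLOW_API_KEY" \
  "http://<ragflow-host>/api/v1/datasets"
```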
[Supabase](https://supabase.com/) can be self-hosted. This is most likely the better option, since you'll need to host RagFlow somewhere anyway. Follow the [instructions here](https://supabase.com/docs/guides/self-hosting) to see how to self-host a Supabase instance using Docker Compose, Kubernetes, etc.
Otherwise, you can choose to use Supabase's hosted service, which also [has a free tier](https://supabase.com/pricing).
If you're only developing locally, you can take a look at [this section on the Supabase docs.](https://supabase.com/docs/guides/local-development/cli/getting-started?queryGroups=platform&platform=npm#running-supabase-locally)
#### Provision Supabase instance
First, [install the Supabase CLI](https://supabase.com/docs/guides/local-development/cli/getting-started). If you already have the `npm` dependencies installed from the development setup, then you should already have it.
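With the CLI installed, provisioning typically looks something like the following (a sketch using standard Supabase CLI commands; the project ref is a placeholder, and whether the repo ships migrations to push is an assumption):

```shell
# Authenticate, point the CLI at your Supabase project, then apply migrations
supabase login
supabase link --project-ref <your-project-ref>
supabase db push
```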
| Variable | Description |
| --- | --- |
| NEXT_PUBLIC_SUPABASE_URL | Use either the URL given by the hosted version or a custom URL from your self-hosted solution |
| NEXT_PUBLIC_SUPABASE_ANON_KEY | Should be available in Supabase's settings |
| NEXT_PUBLIC_SITE_URL | The root URL for the site, used for callbacks after authentication |
| NEXT_PUBLIC_ALLOWED_EMAIL_DOMAINS | When users log in with Google, these are the only email domains allowed to sign up. **Note that this also needs to be configured within Supabase** |
| NEXT_PUBLIC_ORGANIZATION_NAME | The name of the organization |
| SUPABASE_SERVICE_ROLE_KEY | Should be available in Supabase's settings |
| RAGFLOW_API_KEY | Go back to the RagFlow section above to create this key |
| RAGFLOW_API_URL | Publicly available hostname to access RagFlow's API |
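Put together, a filled-in `.env` might look like this (all values are placeholders):

```
NEXT_PUBLIC_SUPABASE_URL=https://<project-ref>.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=<anon-key>
NEXT_PUBLIC_SITE_URL=http://localhost:8080
NEXT_PUBLIC_ALLOWED_EMAIL_DOMAINS=nyu.edu
NEXT_PUBLIC_ORGANIZATION_NAME=Tech@NYU
SUPABASE_SERVICE_ROLE_KEY=<service-role-key>
RAGFLOW_API_KEY=<ragflow-api-key>
RAGFLOW_API_URL=https://<ragflow-host>
```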
### Deployment

#### Add configuration info to Kubernetes files

Put that same information from `.env` into `k8s/config.yaml` and `k8s/secret.yaml` (the info is split between those two files).\
Note: the same info is duplicated because Next.js also requires the environment variables at build time.
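For illustration, the split might look like this (a sketch only; the actual key names and `metadata.name` values are whatever the repo's manifests define, so check those files rather than copying this verbatim):

```yaml
# k8s/secret.yaml (sketch) -- sensitive values go in a Secret
apiVersion: v1
kind: Secret
metadata:
  name: classroomlm-secrets # hypothetical name
type: Opaque
stringData:
  SUPABASE_SERVICE_ROLE_KEY: "<service-role-key>"
  RAGFLOW_API_KEY: "<ragflow-api-key>"
---
# k8s/config.yaml (sketch) -- non-sensitive values go in a ConfigMap
apiVersion: v1
kind: ConfigMap
metadata:
  name: classroomlm-config # hypothetical name
data:
  NEXT_PUBLIC_SITE_URL: "https://classroomlm.example.org"
  RAGFLOW_API_URL: "https://<ragflow-host>"
```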

#### Build Docker image

Next, we build the image with the following command, with your registry information filled in (or omitted). What's important is that it matches the deployment file later.
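For example (the registry and tag below are placeholders; use whatever your `k8s/deployment.yaml` will reference):

```shell
# Build from the repo's Dockerfile and tag for your registry, then push
docker build -t <your-registry>/classroomlm:latest .
docker push <your-registry>/classroomlm:latest
```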
Change the **container image** within `k8s/deployment.yaml` to match the image tag in the previous step.

Then deploy:

```bash
kubectl apply -f k8s
```

### Development
1. Install dependencies:\
Assuming npm is installed, we [recommend installing `pnpm`](https://pnpm.io/installation).\
Then, run the following in the root directory:
```bash
pnpm install
```
2. Start the development server:
```bash
pnpm dev
```
The application will be available at [http://localhost:8080](http://localhost:8080)

#### Available Scripts

- `pnpm dev` - Start development server
- `pnpm build` - Build the application for production
- `pnpm start` - Start the production server
- `pnpm test` - Run tests in watch mode
- Git hooks (via Husky) ensure code quality before commits
- Prettier and ESLint maintain consistent code style
## Deployment
The application can be deployed using Docker and Kubernetes. The project includes:
- Dockerfile for containerization
- Kubernetes manifests in the `k8s` directory
- Tekton pipelines for CI/CD
## Credits
ClassroomLM was initially created by the first cohort of [Tech@NYU's](https://techatnyu.org) Dev Team Program in the Spring 2025 semester.