Description
Goal
Transfer knowledge from a large, complex teacher model to a smaller, efficient student model.
Explanation of tasks
1. Propose a distillation methodology
2. Design a good prompt
3. Define the output format of the teacher (see the prompt/schema sketch after this list)
4. Prepare the dataset for student training
5. Train the student model
6. Evaluate the student model
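As a rough illustration of tasks 2 and 3, the sketch below shows one possible prompt and a structured JSON answer the teacher could be asked to produce. The wording, field names, and damage categories are assumptions for illustration, not the final design.

```python
# Hypothetical prompt and teacher output schema (illustrative only; the
# actual prompt wording and fields are still to be decided).
TEACHER_PROMPT = (
    "You are an expert vehicle damage inspector. Look at the image and "
    "report every visible damage. Respond ONLY with JSON matching this "
    "schema: {\"damages\": [{\"type\": <string>, \"severity\": "
    "<minor|moderate|severe>, \"location\": <string>}], "
    "\"summary\": <one-sentence description>}"
)

# Example of the structured answer the student will later be trained to
# reproduce (values are made up).
EXAMPLE_TEACHER_OUTPUT = {
    "damages": [
        {"type": "dent", "severity": "moderate", "location": "front left door"},
        {"type": "scratch", "severity": "minor", "location": "rear bumper"},
    ],
    "summary": "Moderate dent on the front left door and a minor scratch on the rear bumper.",
}
```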
Propose a distillation methodology
1. Teacher model
2. Student model: Qwen2.5 VL 2B
3. Type of knowledge
4. Training method: offline (see the data-generation sketch after this list)
5. Type of knowledge transfer
6. Dataset: CarDD dataset
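A minimal sketch of the offline setup, assuming response-based distillation: the teacher is queried once over the CarDD images and its answers are cached as a JSONL file that later serves as supervised targets for fine-tuning the student. The `query_teacher` helper, file paths, and record fields are placeholders, not an actual API.

```python
import json
from pathlib import Path


def query_teacher(image_path: str, prompt: str) -> str:
    """Placeholder for a call to the teacher VLM (API or local model);
    returns the teacher's raw text/JSON answer for one image."""
    raise NotImplementedError


def build_distillation_set(image_dir: str, prompt: str, out_file: str) -> None:
    """Run the teacher over every image once (offline) and cache
    prompt/response pairs for student fine-tuning."""
    with open(out_file, "w", encoding="utf-8") as f:
        for image_path in sorted(Path(image_dir).glob("*.jpg")):
            response = query_teacher(str(image_path), prompt)
            record = {
                "image": str(image_path),
                "prompt": prompt,
                "response": response,  # the target the student learns to imitate
            }
            f.write(json.dumps(record, ensure_ascii=False) + "\n")


# Example usage (paths are assumptions; TEACHER_PROMPT is the prompt from the sketch above):
# build_distillation_set("CarDD/train/images", TEACHER_PROMPT, "teacher_outputs.jsonl")
```

Because the teacher outputs are generated once and stored, the student can be trained any number of times on the cached file without re-querying the teacher, which is what makes the method offline.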
Metrics
- Exact match
- F1 score
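A small sketch of the two metrics, assuming both are computed on the generated text answers: exact match after light normalization, and a token-overlap F1 in the style of SQuAD-like evaluation. The normalization rules are an assumption.

```python
from collections import Counter


def normalize(text: str) -> str:
    """Lowercase and collapse whitespace (assumed, minimal normalization)."""
    return " ".join(text.lower().split())


def exact_match(prediction: str, reference: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(reference))


def token_f1(prediction: str, reference: str) -> float:
    """Token-overlap F1 between the predicted and reference answers."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    if not pred_tokens or not ref_tokens:
        return float(pred_tokens == ref_tokens)
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```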