Implement Knowledge Distillation #9

@Thamirawaran

Description

Goal

Transfer knowledge from a large, complex teacher model to a smaller, efficient student model.

Explanation of tasks

  1. Propose a distillation methodology
  2. Design a good prompt for the teacher
  3. Define the output format of the teacher (a sketch of the prompt and output format follows this list)
  4. Prepare a dataset for student training
  5. Train the student model
  6. Evaluate the student model
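
A minimal sketch of what tasks 2 and 3 could look like, assuming the teacher is an instruction-following VLM asked to return structured JSON. The prompt wording, the JSON schema, and the `parse_teacher_output` helper are illustrative assumptions, not a final design; the damage types listed follow the CarDD categories.

```python
import json

# Illustrative prompt (task 2); damage types follow the CarDD categories.
TEACHER_PROMPT = (
    "You are an expert vehicle damage inspector. List every visible damage "
    "instance in the image. Answer only with JSON of the form "
    '{"damages": [{"type": "...", "location": "...", "severity": "..."}]} '
    "where type is one of: dent, scratch, crack, glass shatter, "
    "lamp broken, tire flat."
)

def parse_teacher_output(raw_text: str) -> dict:
    """Check that a teacher reply follows the agreed output format (task 3)."""
    parsed = json.loads(raw_text)  # raises ValueError on malformed JSON
    if not isinstance(parsed.get("damages"), list):
        raise ValueError("missing 'damages' list")
    return parsed

# A well-formed reply the parser accepts:
reply = '{"damages": [{"type": "dent", "location": "front left door", "severity": "moderate"}]}'
print(parse_teacher_output(reply))
```

Keeping the teacher output machine-parseable makes it straightforward to turn its replies into student training labels and to score the student with the metrics listed below.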

Propose a distillation methodology

  1. Teacher model
  2. Student model: Qwen2.5 VL 2B
  3. Type of knowledge
  4. Training method: offline
  5. Type of knowledge transfer
  6. Dataset: CarDD dataset (see the data-generation sketch after this list)
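
Since the training method is offline, the teacher can be run once over the CarDD images and its answers frozen into a training file for the student. The sketch below assumes response-based distillation (the student learns from the teacher's final answers rather than its logits); `query_teacher`, `build_distillation_set`, the file paths, and the JSONL layout are hypothetical placeholders for whatever is finally chosen.

```python
import json
from pathlib import Path

def query_teacher(image_path: str, prompt: str) -> str:
    """Placeholder: run the chosen teacher VLM on one image, return its raw text answer."""
    raise NotImplementedError

def build_distillation_set(image_dir: str, prompt: str, out_file: str) -> None:
    """Freeze teacher answers for every CarDD image into a JSONL file for student training."""
    with open(out_file, "w", encoding="utf-8") as out:
        for image_path in sorted(Path(image_dir).glob("*.jpg")):
            record = {
                "image": str(image_path),                             # CarDD image path
                "prompt": prompt,                                     # same prompt the student will see
                "response": query_teacher(str(image_path), prompt),   # teacher output used as the label
            }
            out.write(json.dumps(record) + "\n")

# Hypothetical usage:
# build_distillation_set("CarDD/train/images", TEACHER_PROMPT, "distill_train.jsonl")
```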

Metrics

  1. Exact match
  2. F1 score
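
A small sketch of how the two metrics could be computed, assuming both the student's output and the reference are parsed into per-image lists of damage-type strings; whether F1 should instead be token-level is still an open choice, and the function names here are illustrative.

```python
def exact_match(pred: list[str], ref: list[str]) -> float:
    """1.0 if the predicted damage labels match the reference exactly (order-insensitive)."""
    return float(sorted(pred) == sorted(ref))

def f1_score(pred: list[str], ref: list[str]) -> float:
    """Set-level F1 over damage-type labels for a single image."""
    if not pred and not ref:
        return 1.0
    tp = len(set(pred) & set(ref))
    if tp == 0:
        return 0.0
    precision = tp / len(set(pred))
    recall = tp / len(set(ref))
    return 2 * precision * recall / (precision + recall)

print(exact_match(["dent", "scratch"], ["scratch", "dent"]))  # 1.0
print(f1_score(["dent"], ["dent", "scratch"]))                # ~0.667
```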
