Hi, my name is Amir Hosseini, and I'm glad to have had the opportunity to work on such an interesting task.
When dealing with large file imports into a database, there are several important considerations. The first challenge, in my view, is handling file uploads efficiently. In a real-world scenario, it would be beneficial to separate the upload process from the import service: once the file is uploaded, the API returns a file ID, and the frontend can then trigger the import. The file can also be compressed with Gzip before being sent to the API to reduce the upload size.
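For illustration, below is a minimal sketch of what such an upload endpoint could look like, assuming Symfony attribute routing and a Doctrine entity that records the stored file. The `FileUploadController` and `ImportFile` names are hypothetical, not necessarily the ones used in this project.

```php
<?php

namespace App\Controller;

use App\Entity\ImportFile;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Routing\Attribute\Route;

class FileUploadController extends AbstractController
{
    #[Route('/api/employee', methods: ['POST'])]
    public function upload(Request $request, EntityManagerInterface $em): JsonResponse
    {
        $uploaded = $request->files->get('file');
        if ($uploaded === null) {
            return new JsonResponse(['error' => 'No file provided.'], 400);
        }

        // Store the raw file; a Gzip-compressed upload could be decompressed
        // here (e.g. with gzopen/gzread) before it is written to disk.
        $storedName = uniqid('import_', true) . '.csv';
        $uploaded->move('/var/data/imports', $storedName);

        // Persist a record of the upload so the import can be triggered later.
        $file = new ImportFile($storedName);
        $em->persist($file);
        $em->flush();

        // The frontend uses this ID to call POST /api/employee/import/{id}.
        return new JsonResponse(['fileId' => $file->getId()], 201);
    }
}
```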
To keep memory usage low and reduce network load, I used batch processing and queues. In a real-world scenario, I would also implement authentication and authorization for security, as well as encrypt data exchanged between services.
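The batching idea can be sketched as follows, assuming Symfony Messenger on the RabbitMQ (AMQP) transport; the `ImportEmployeeBatch` message class and the batch size of 1000 are illustrative assumptions rather than the project's exact values.

```php
<?php

namespace App\Service;

use App\Message\ImportEmployeeBatch;
use Symfony\Component\Messenger\MessageBusInterface;

final class CsvBatchDispatcher
{
    private const BATCH_SIZE = 1000;

    public function __construct(private readonly MessageBusInterface $bus)
    {
    }

    public function dispatchBatches(string $csvPath): void
    {
        $handle = fopen($csvPath, 'rb');
        fgetcsv($handle); // skip the header row

        $batch = [];
        while (($row = fgetcsv($handle)) !== false) {
            $batch[] = $row;

            // Hand each batch to the async queue; the consumer persists it,
            // so memory usage stays bounded regardless of the file size.
            if (count($batch) === self::BATCH_SIZE) {
                $this->bus->dispatch(new ImportEmployeeBatch($batch));
                $batch = [];
            }
        }

        if ($batch !== []) {
            $this->bus->dispatch(new ImportEmployeeBatch($batch));
        }

        fclose($handle);
    }
}
```

The dispatched batches would then be processed by the `messenger:consume async` worker shown in the run instructions below.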
Invalid rows in the CSV are detected during validation, written to a separate CSV file, and sent back to the user for correction.
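As a rough sketch of that idea, a collector like the one below could append rejected rows to an error CSV; the class name, validation rules, and file path are assumptions made for illustration only.

```php
<?php

namespace App\Service;

final class InvalidRowCollector
{
    /** @var resource|null */
    private $errorHandle = null;

    public function __construct(private readonly string $errorCsvPath)
    {
    }

    /**
     * Returns true when the row is valid; otherwise the row is appended to
     * the error CSV that is later returned to the user for correction.
     */
    public function validateOrCollect(array $row): bool
    {
        [$employeeId, $email] = $row + [null, null];

        $valid = ctype_digit((string) $employeeId)
            && filter_var($email, FILTER_VALIDATE_EMAIL) !== false;

        if (!$valid) {
            // Lazily open the error file the first time an invalid row appears.
            $this->errorHandle ??= fopen($this->errorCsvPath, 'wb');
            fputcsv($this->errorHandle, $row);
        }

        return $valid;
    }
}
```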
- Language: PHP
- Framework: Symfony 7
- Database: MySQL
- Queue: RabbitMQ
- Other Tools: Docker
Start the stack (rebuilding the app image if needed) and run the queue consumer:

```bash
docker compose -p localbrandx up -d
docker compose -p localbrandx up -d --build app
docker exec -it app php bin/console messenger:consume async --limit=10 --memory-limit=512M
```
Upload the CSV, then trigger the import for the returned file ID:

```bash
curl -X POST -F "file=@/home/amir/Downloads/import.csv" http://localhost:8002/api/employee
curl -X POST http://localhost:8002/api/employee/import/1
```
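As a hedged sketch, the import trigger endpoint only needs to look up the uploaded file and dispatch an asynchronous message, so the HTTP request returns immediately; `StartImport` and `ImportFileRepository` are illustrative names, not necessarily the project's actual classes.

```php
<?php

namespace App\Controller;

use App\Message\StartImport;
use App\Repository\ImportFileRepository;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\Messenger\MessageBusInterface;
use Symfony\Component\Routing\Attribute\Route;

class ImportController extends AbstractController
{
    #[Route('/api/employee/import/{fileId}', methods: ['POST'])]
    public function import(int $fileId, ImportFileRepository $files, MessageBusInterface $bus): JsonResponse
    {
        if ($files->find($fileId) === null) {
            return new JsonResponse(['error' => 'File not found.'], 404);
        }

        // The messenger:consume worker started above picks this message up
        // and performs the actual batched import.
        $bus->dispatch(new StartImport($fileId));

        return new JsonResponse(['message' => 'Import queued.'], 202);
    }
}
```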
Endpoints:

- GET /api/employee/{employeeId}
- DELETE /api/employee/{employeeId}
Fetch an employee:

```bash
curl -X GET http://localhost:8002/api/employee/470143
```

Response:

```json
{
    "id": 12,
    "employeeId": "470143",
    "email": "[email protected]"
}
```
Delete an employee:

```bash
curl -X DELETE http://localhost:8002/api/employee/470143
```

Response:

```json
{
    "message": "Employee with EmployeeID 470143 has been deleted."
}
```
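For completeness, here is a minimal sketch of how these two endpoints could be implemented; the repository, entity getters, and response shapes are assumptions based on the examples above rather than the project's exact code.

```php
<?php

namespace App\Controller;

use App\Repository\EmployeeRepository;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\Routing\Attribute\Route;

class EmployeeController extends AbstractController
{
    #[Route('/api/employee/{employeeId}', methods: ['GET'])]
    public function show(string $employeeId, EmployeeRepository $employees): JsonResponse
    {
        $employee = $employees->findOneBy(['employeeId' => $employeeId]);
        if ($employee === null) {
            return new JsonResponse(['error' => 'Employee not found.'], 404);
        }

        return new JsonResponse([
            'id' => $employee->getId(),
            'employeeId' => $employee->getEmployeeId(),
            'email' => $employee->getEmail(),
        ]);
    }

    #[Route('/api/employee/{employeeId}', methods: ['DELETE'])]
    public function delete(string $employeeId, EmployeeRepository $employees, EntityManagerInterface $em): JsonResponse
    {
        $employee = $employees->findOneBy(['employeeId' => $employeeId]);
        if ($employee === null) {
            return new JsonResponse(['error' => 'Employee not found.'], 404);
        }

        // Remove the row and flush so the deletion is committed immediately.
        $em->remove($employee);
        $em->flush();

        return new JsonResponse([
            'message' => sprintf('Employee with EmployeeID %s has been deleted.', $employeeId),
        ]);
    }
}
```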