This is a full-stack web application developed as part of a university project for the course Fundamentals of Cloud Technologies. The application parses data from a given GitHub profile (with planned support for LinkedIn in the future) and uses an AI-powered system to generate a personalized PDF resume based on the provided information and the user's custom query.
As part of the assignment, our team was required to:
- Build a web application with both frontend and backend components;
- Deploy the application to the cloud using different providers for hosting and additional services;
- Integrate at least one external cloud service provided by a different vendor than the one hosting the app;
- Include AI or IoT functionality within the application;
- Use a cloud-based database for data storage;
- Document the architecture and technical choices made during the development;
- Collaborate effectively via GitHub, with clear task distribution and version control.
In line with these requirements, the frontend and backend are hosted in different cloud environments, and the AI module runs as a separate microservice deployed on yet another provider, so the application genuinely spans multiple cloud platforms.
🖥️ Frontend
A user-friendly interface where users can:
- Input their GitHub profile link
- Write a custom query
- Generate a personalized resume based on the provided data
- View the resume in real-time
- Download the final version as a PDF
🔗 GitHub Integration
A mechanism that allows users to input a GitHub link. The application then fetches relevant profile data — such as repositories, technologies used, and programming languages — via the GitHub API.
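For illustration, here is a minimal Python sketch of this kind of fetch, assuming unauthenticated calls to the public GitHub REST API. The `/users/{username}` and `/users/{username}/repos` endpoints are GitHub's real API; the function and the choice of fields are illustrative, not the project's actual code:

```python
import requests

def fetch_github_profile(username: str) -> dict:
    """Fetch basic profile and repository data from the public GitHub REST API."""
    base = "https://api.github.com"
    user = requests.get(f"{base}/users/{username}", timeout=10).json()
    repos = requests.get(f"{base}/users/{username}/repos", timeout=10).json()
    return {
        "name": user.get("name"),
        "bio": user.get("bio"),
        "repos": [
            # "language" is the repository's dominant language as reported by GitHub
            {"name": r["name"], "language": r["language"], "stars": r["stargazers_count"]}
            for r in repos
        ],
    }
```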
🧾 Parsing Service
Responsible for processing and transforming raw GitHub data into a structured format. This includes extracting and organizing details about the user's profile, projects, and skills for further use in resume generation.
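A rough sketch of what this structuring step could look like, reusing the dictionary shape produced by the fetch sketch above (the schema and the aggregation choices are assumptions, not the service's real output format):

```python
from collections import Counter

def structure_profile(raw: dict) -> dict:
    """Aggregate raw GitHub data into a resume-ready structure (illustrative schema)."""
    # Count how often each language appears across repositories to rank skills.
    languages = Counter(r["language"] for r in raw["repos"] if r["language"])
    # Pick the five most-starred repositories as showcase projects.
    top_projects = sorted(raw["repos"], key=lambda r: r["stars"], reverse=True)[:5]
    return {
        "name": raw["name"],
        "summary": raw["bio"],
        "skills": [lang for lang, _ in languages.most_common()],
        "projects": top_projects,
    }
```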
💬 AI Chat Service
Leverages AI models to process both GitHub data and the user's custom text input. The service generates a LaTeX-formatted resume based on this information, ensuring the final document is tailored and professional.
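Since the AI service is backed by Ollama (see the tech stack below), such a call might look roughly like this sketch. `POST /api/generate` is Ollama's real endpoint; the model name, prompt wording, and function are placeholders:

```python
import requests

def generate_latex_resume(profile: dict, user_query: str) -> str:
    """Ask a locally running LLM (via Ollama) for a LaTeX resume (illustrative)."""
    prompt = (
        "Generate a professional one-page resume in LaTeX.\n"
        f"Profile data: {profile}\n"
        f"Extra instructions from the user: {user_query}"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",  # default Ollama port
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    return resp.json()["response"]  # the raw LaTeX source
```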
🔙 Backend API
The server-side engine of the application. It handles:
- Routing and logic
- Data processing
- LaTeX-to-PDF conversion (a sketch of this step follows the list)
- Communication between microservices (e.g., between the parsing service and the AI chat service)
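The production backend is written in C# with ASP.NET Core (see the tech stack below), but the LaTeX-to-PDF step essentially amounts to invoking `pdflatex` on the generated source. A minimal Python sketch of that idea, with illustrative file names and options:

```python
import subprocess
import tempfile
from pathlib import Path

def latex_to_pdf(latex_source: str) -> bytes:
    """Compile LaTeX source to PDF by shelling out to pdflatex (illustrative)."""
    with tempfile.TemporaryDirectory() as tmp:
        tex = Path(tmp) / "resume.tex"
        tex.write_text(latex_source, encoding="utf-8")
        # A single pass is enough for a simple resume; documents with
        # cross-references may need a second run.
        subprocess.run(
            ["pdflatex", "-interaction=nonstopmode", "-output-directory", tmp, str(tex)],
            check=True,
            capture_output=True,
        )
        return (Path(tmp) / "resume.pdf").read_bytes()
```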
🗂️ Database
Stores user-related data such as name, email, and password — with passwords securely stored using hashing to ensure privacy and security.
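As an illustration of the principle only (the real backend implements this in C#; the function names and parameters below are typical choices, not the project's actual values), salted password hashing can look like this:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); store both, never the plaintext password."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, expected)
```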
🎥 Demo video: demo-video.mp4
- Head to the home page — you will see the login/register menu.
- Create an account, log in, or click "Don't Enter" to use the web application without registration.
- On the left side of the app, you'll find input fields:
- Paste your GitHub or LinkedIn profile URL.
- Optionally, write additional brief information about yourself.
  Recommended format:

  ```
  Name: John
  Surname: Smith
  Job: Backend Developer
  University: TUKE
  ```
- Click the "Generate Resume" button to generate your resume. It will be displayed on the right side of the screen in an A4-style viewer. You can navigate through the pages using arrows.
- To download your resume as a PDF, click "Download PDF". You can also click "..." to choose other available formats.
- To clear all input fields, click the "Reset Parameters" button.
- To log out, click the user icon at the top-left corner.
- If you're logged in, select "Log out" from the dropdown — you'll be redirected to the login menu.
- If you're using guest access, you'll see "Login" and "Register" buttons instead.
- Clone the repository to your local machine:

  ```bash
  git clone https://github.com/dmytro-varich/AI-Resume-Creator.git
  ```
- Navigate to the project's root directory:

  ```bash
  cd AI-Resume-Creator
  ```
- Start the backend, parser service, and frontend containers:

  ```bash
  ./start-app.sh
  ```
- Verify that the containers are running:

  ```bash
  docker container ls
  ```

  You should see output similar to this:

  ```
  CONTAINER ID   IMAGE                        COMMAND                  CREATED         STATUS         PORTS                              NAMES
  ef46f6e3ba13   ai-resume-creator-frontend   "/docker-entrypoint.…"   2 minutes ago   Up 2 minutes   0.0.0.0:3000->80/tcp               appfrontend
  cf4f4fe59df3   ai-resume-creator-backend    "dotnet ResumeCreato…"   2 minutes ago   Up 2 minutes   0.0.0.0:8080-8081->8080-8081/tcp   appbackend
  c4ab5cf8a2b8   ai-resume-creator-parser     "python ParsingServi…"   2 minutes ago   Up 2 minutes   0.0.0.0:5000->5000/tcp             appparser
  fc43bdb66f6e   postgres:15.11-alpine3.21    "docker-entrypoint.s…"   2 minutes ago   Up 2 minutes   5432/tcp                           postgres
  ```
- Access the application in your web browser:
  - Frontend: http://localhost:3000
  - Backend: http://localhost:8080
  - Parser Service: http://localhost:5000
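If you want to verify that all three services respond without opening a browser, a small illustrative check (ports as configured above; the script itself is not part of the repository):

```python
import requests

# Ports as exposed by the containers listed above.
services = {
    "frontend": "http://localhost:3000",
    "backend": "http://localhost:8080",
    "parser": "http://localhost:5000",
}

for name, url in services.items():
    try:
        status = requests.get(url, timeout=5).status_code
        print(f"{name}: HTTP {status}")
    except requests.ConnectionError:
        print(f"{name}: not reachable")
```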
- To enter a running container:

  ```bash
  docker exec -it <container-name> /bin/sh
  ```
- To stop and remove the containers:

  ```bash
  ./end-app.sh
  ```
The system is deployed using a multi-cloud and hybrid infrastructure that distributes services across Microsoft Azure, Neon.tech, and a university-hosted server (TUKE). Each component is containerized with Docker, allowing consistent deployment and seamless cross-platform integration.
| Name | Role | Description |
|---|---|---|
| Dmytro Varich | Frontend | Designed and developed the user interface and user experience of the application. |
| Ivan Tkachenko | Backend, DB | Responsible for server-side logic, inter-service communication, and database management. |
| Nikita Pohorilyi | Parser | Specializes in extracting and processing data from various sources. |
| Arsenii Milenchuk | AI API | Focuses on the development of a convenient AI Chat service. |
- 🎨 Design — the user interface was designed using Figma, while the logo and documentation visuals were created with Canva.
- ⚛️ React.js — served as the main framework for building the client-side of the application. External libraries like pdfjsLib were used to render PDF resumes directly on the website.
- 💨 Tailwind CSS — used for fast and flexible UI styling, enabling a responsive and modern minimalist design.
- 📦 Docker — used to containerize the frontend application, simplifying deployment and ensuring portability across environments.
- ⚙️ C# with ASP.NET Core — provides a robust, high-performance, object-oriented framework for building scalable and secure web APIs.
- 🧩 Entity Framework Core — used for ORM and database migrations; it integrates seamlessly with C# and supports code-first development.
- 🐘 PostgreSQL — chosen for its reliability and because it can be hosted for free on Neon.
- 📄 OpenAPI (Swagger) — enables automatic API documentation and testing tools.
- 🧾 pdflatex — used as the LaTeX compiler to generate PDF files from LaTeX templates.
- 🐍 Python — chosen for its simplicity, rapid development capabilities, and rich ecosystem of libraries.
- 🧴 Flask — lightweight web framework used to build the REST API.
- 🍾 Flask-RESTX — used for route management and automatic API documentation (a minimal endpoint sketch follows this list).
- 📄 OpenAPI — integrated for API visualization and testing.
- 📦 Docker — used to containerize the parser service for portability and ease of deployment.
- ☁️ GitHub API — utilized to fetch user and repository data.
- 🔗 Requests — Python HTTP library used for making external API calls.
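A minimal sketch of a Flask-RESTX endpoint in this style (the route name, payload, and port are assumptions based on the setup above, not the service's real contract):

```python
from flask import Flask
from flask_restx import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app, title="Parsing Service", doc="/docs")  # Swagger UI served at /docs

req_parser = reqparse.RequestParser()
req_parser.add_argument("github_url", type=str, required=True, help="GitHub profile URL")

@api.route("/parse")
class Parse(Resource):
    @api.expect(req_parser)
    def post(self):
        args = req_parser.parse_args()
        # ... fetch and structure the profile here (see the sketches above) ...
        return {"github_url": args["github_url"], "status": "parsed"}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```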
- ⚡ FastAPI — asynchronous Python framework with built-in support for automatic documentation via Swagger (an endpoint sketch in this spirit appears at the end of this section).
- 🧠 Ollama — used to interface with large language models; offers a wide range of models and flexible configuration options.
- 🌐 Nginx — acts as a reverse proxy and load balancer; selected for its speed, popularity, and seamless Docker integration.
This setup was deployed using docker-compose. Nginx was particularly necessary because the GPU server was hosted inside a university VPN: an external server was configured to accept incoming requests and forward them to the internal GPU server, enabling remote access through the VPN network.
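To make the AI-service shape concrete, here is a minimal FastAPI endpoint in this spirit (the route, request schema, and model name are illustrative assumptions, not the service's real interface):

```python
import requests
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="AI Chat Service")  # interactive docs served at /docs

class ResumeRequest(BaseModel):
    profile: dict
    user_query: str

@app.post("/generate")
def generate(req: ResumeRequest) -> dict:
    """Forward the structured profile and the user's query to a local Ollama model."""
    resp = requests.post(
        "http://localhost:11434/api/generate",  # default Ollama port
        json={
            "model": "llama3",
            "prompt": f"Write a LaTeX resume.\nData: {req.profile}\nNotes: {req.user_query}",
            "stream": False,
        },
        timeout=120,
    )
    return {"latex": resp.json()["response"]}
```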