How I Deployed the QAPIX Backend on GCP, Knowing Nothing About GCP (and Leveled Up the Project in One Day)

I spent a long time debating where to host the project. Hosting it on my own machine felt wrong. I started comparing server rentals and narrowed it down to Google and Yandex. Yandex was cheaper, but I trusted Google more.
So I sat down to deploy QAPIX on Google Cloud Platform. Until then, my exposure to GCP had been limited to screenshots online.
I spun up an Ubuntu VM and off we went: Docker, Docker Compose, all by hand, without any scripts or templates. Networking? The VPC took a minute, but it took me a while to grasp that ports 80 and 8080 stay closed until you add explicit firewall rules, and that those rules are per-protocol: opening TCP tells ping (which is ICMP) nothing at all.
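For reference, opening those two ports on GCP comes down to one firewall rule. A sketch with gcloud, assuming the VM carries a network tag `web` (the rule and tag names here are illustrative, not my actual setup):

```shell
# Allow inbound TCP on ports 80 and 8080 to VMs tagged "web".
# Note: this opens TCP only — ICMP (ping) is governed by a separate rule,
# which is exactly why pinging the box proves nothing about these ports.
gcloud compute firewall-rules create allow-web \
  --network=default \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:80,tcp:8080 \
  --target-tags=web
```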
Containers gave me freedom:
PostgreSQL (postgres:15): packed into its own service, attached a volume backed by a Persistent Disk, and initialized the database.
FastAPI on Python 3.11 with Uvicorn: connected to the DB via asyncpg, and Swagger UI kicked in “out of the box.”
Nginx: routed traffic from port 80 to 8080 so as not to break the frontend.
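The three services above fit in one Compose file. A minimal sketch, assuming placeholder credentials, a local Dockerfile for the API, and an `nginx.conf` that proxies port 80 to the app on 8080 (none of these names are the actual QAPIX configuration):

```yaml
services:
  db:
    image: postgres:15
    environment:
      POSTGRES_DB: qapix          # placeholder credentials
      POSTGRES_USER: qapix
      POSTGRES_PASSWORD: change-me
    volumes:
      - pgdata:/var/lib/postgresql/data   # named volume on the VM's Persistent Disk

  api:
    build: .                      # FastAPI app on Python 3.11
    command: uvicorn app.main:app --host 0.0.0.0 --port 8080
    depends_on:
      - db

  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro  # proxies :80 -> api:8080
    depends_on:
      - api

volumes:
  pgdata:
```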
I didn’t sweat the DB schema—I’ve got enough experience—so I modeled everything in Pydantic in half an hour. Another hour went into CRUD endpoints. One typo in a route—and everything crashed.
I fell in love with testing via Swagger immediately: two clicks and you see the JSON flying. And my trusty Postman collections, with assertions on status codes and payload structures, saved me from endless “404” loops.
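The Postman assertions boil down to two checks per request: the status code and the payload shape. A tiny Python analogue of that idea (the expected keys here are made up for illustration):

```python
def check_response(status_code: int, payload: dict, expected_keys: set[str]) -> None:
    """Fail loudly if a response doesn't match expectations —
    the same two assertions a Postman test would make."""
    assert status_code == 200, f"expected 200, got {status_code}"
    missing = expected_keys - payload.keys()
    assert not missing, f"payload missing keys: {missing}"

# e.g. check_response(200, {"id": 1, "name": "demo"}, {"id", "name"}) passes,
# while a 404 with an empty body fails fast instead of looping forever.
```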
So, knowing almost nothing about Docker, Nginx, or Python (those few levels I did in Mimo don’t really count), I managed to deploy the backend for our future system using only advice and scripts from ChatGPT.
I’ll abandon it all later, but why—that’s a tale for another time.