Where I've been, what I've built, and how I got here.
Jan 2026
No clients, no deadlines, no legacy. Just me, a few interesting ideas, and the joy of vibe coding.
After years of freelancing, startups, and navigating other people's architectural decisions, I'm finally spending time on my own projects. Exploring ideas I've been sitting on, building at my own pace, and enjoying the process without the pressure of demos or investor timelines. Also keeping my eyes open for the right opportunity — something where I can build things that matter with people who care about how they're built.
Status: Active — vibing.
Jul 2025
A new chapter that freed me from legacy code. A greenfield medical AI platform — built from scratch, no compromises on the stack, no inherited mess to untangle.
We agreed on Next.js and FastAPI — I preferred TanStack Router and Start but couldn't bet on them; they were still too early. Vue with Nuxt was also a good option, but we wanted to minimize the effort of hiring developers for the stack. The first months flew by without major challenges, and we built a complex platform on both the frontend and the backend.
The idea was an AI system that could parse medical documents, visit histories, and appointment records so users could track their entire medical life — and their family members' too. Conditions, symptoms, allergies, active supplements, scheduled visits, current medical status — all extracted and displayed in a UI where users could report how they felt, letting the system react and track their state so doctor visits would be more productive. Doctors had their own cabinet where they could see everything that had happened to a patient and assist them better.
The first challenge was the team's lack of medical knowledge. We had to learn an entirely new field to write better code, and that sometimes meant reimplementing features when we discovered the true nature of certain medical phenomena. The other challenge was the nondeterminism of LLMs — we spent more time tuning prompts than writing actual code. The project relied heavily on LLM accuracy for chat-based information extraction, and the MVP was expected to be so polished in testing that a bug with a 0.01% chance of occurring was marked as a major issue for immediate fix. Have you adapted your UI so a person's name can be 100 characters long and still look good in every case? I have. Have you written code that displays the correct description for a disease even when the LLM fails? I have. LLMs were expected to be 100% accurate in all cases — which is technically impossible — and that was treated as the developer's problem. You end up making your code so defensive that it handles every possible shape of AI response, and that's when things get ugly.
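That kind of defensive handling of LLM output looked roughly like this — a minimal Python sketch with a hypothetical fallback catalog (the condition names, response fields, and default text are illustrative, not the project's real data):

```python
# Hypothetical fallback catalog; in practice this would come from a vetted source.
KNOWN_CONDITIONS = {
    "asthma": "A chronic condition that inflames and narrows the airways.",
}

DEFAULT_DESCRIPTION = "Description unavailable — please consult your doctor."

def condition_description(llm_response) -> str:
    """Return a safe description no matter what shape the LLM returned."""
    # The model may hand back a dict, a bare string, a list, or nothing at all.
    if not isinstance(llm_response, dict):
        return DEFAULT_DESCRIPTION
    desc = llm_response.get("description")
    if isinstance(desc, str) and desc.strip():
        return desc.strip()
    # Description missing or malformed: fall back to the curated catalog.
    name = llm_response.get("condition")
    if isinstance(name, str):
        return KNOWN_CONDITIONS.get(name.lower().strip(), DEFAULT_DESCRIPTION)
    return DEFAULT_DESCRIPTION
```

Every field the model touches gets the same treatment: type-check, normalize, fall back. Multiply that across an entire medical schema and the defensiveness starts to dominate the codebase.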
Eventually the pace slowed more and more, as we struggled to meet expectations set impossibly high for AI systems of that era.
Why I moved on: The project stopped receiving financing in January 2026. The expectations for AI accuracy were ahead of what the technology could reliably deliver.
Jan 2025
A great opportunity — both financially and professionally. An interesting project in a difficult state, a tight schedule, and a field of work I genuinely cared about.
I was tired of freelancing — it took more time to negotiate projects, plan milestones, and wait for payments than to do actual productive work. I just wanted to write code and build cool things, and Codevalet offered exactly that. What drew me in was the *vision* — they didn't just blindly exploit whatever LLMs generate for code understanding. The approach was more deterministic: code should be properly read, parsed, learned, documented, maintained, and developed further. LLMs were used for chatting with humans, but analyzing code was not a big token burner.
We had a complex phased pipeline — pulling code from GitHub, GitLab, or ZIP files, running semantic analysis, and counting stats like LOC, file counts, code percentage, and language breakdown. A separate AI-side project used open-source LLMs alongside other ML models to build code relationship graphs, resolve references, index them, and understand project architecture. Claude Code, Cursor, Copilot, and similar projects were still early then. The project also focused on privacy and didn't use platforms from OpenAI, Anthropic, or Google.
On my first day, I saw the stack and couldn't agree with the approach. The frontend and backend were built with a custom framework that rendered UI components written in Python, using PyScript on the frontend — the idea being full-stack Python. The dev cycle was brutal: fix one bug, add a feature, break other features, fix those, break something else — shortcutting and patching over and over. Meanwhile, the project was heading toward investor demos for the next round of early investment, and the main focus was a smooth MVP. I didn't want to sabotage anything, so on my own time, as an initiative, I started reimplementing the UI and backend from scratch with Next.js, FastAPI, and PostgreSQL.
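The stats step of that pipeline can be sketched in a few lines — a toy, extension-based detector with an invented map (real language detection needs far more heuristics than file extensions):

```python
from collections import Counter
from pathlib import Path

# Tiny illustrative extension map — real detection covers hundreds of languages
# and uses content heuristics, not just extensions.
EXT_TO_LANG = {".py": "Python", ".ts": "TypeScript", ".tsx": "TypeScript",
               ".rb": "Ruby", ".go": "Go"}

def language_breakdown(files: dict) -> dict:
    """Given {path: source_text}, return total LOC and per-language percentages."""
    loc = Counter()
    for path, text in files.items():
        lang = EXT_TO_LANG.get(Path(path).suffix)
        if lang is None:
            continue  # unknown extensions are excluded from the breakdown
        # Count non-blank lines as LOC.
        loc[lang] += sum(1 for line in text.splitlines() if line.strip())
    total = sum(loc.values())
    percentages = {lang: round(100 * n / total, 1) for lang, n in loc.items()} if total else {}
    return {"total_loc": total, "languages": percentages}
```

The real analyzer ran this kind of pass as one phase among several, feeding the per-repo stats into the dashboards.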
I built Google and GitHub auth, an interface for pulling repos (with private-repo support) from GitHub and GitLab, and minimal code analysis — I had to translate GitHub's Linguist project from Ruby to Python so the backend could analyze code without heavy dependencies. When I showed it to the team and the C-levels, everyone liked it, and we decided to slowly migrate from PyScript to React. The pain with the frontend and backend disappeared, and we started a new dev cycle with better planning.
Then I was pulled into the AI repo — which, to my surprise, was also full of bad coding practices: a much bigger repo with complex architecture, split across several FastAPI services in a strange attempt to separate functionality. My painful task was integrating that repo as legacy into our new frontend and backend. It was a nightmare. The complexity was real but unnecessary. We eventually rebuilt the AI functionality from scratch using Celery, with messages between the backend and the AI server exchanged through Redis Streams. Celery tasks were triggered for each pipeline step, and the frontend gained a much better real-time architecture. API requests dropped dramatically in both quantity and size, and the whole system became cleaner. I also created a Helm package for deploying all three services to AWS, GCP, or self-hosted Kubernetes, which made production deployment much easier.
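The rebuilt pipeline's shape, in miniature: each step is a task, and a progress event is emitted after each one completes. This is a plain-Python stand-in — in the real system the steps were Celery tasks and the events went through Redis Streams (via `xadd`) for the frontend to consume; the step names and payloads here are made up:

```python
PIPELINE = []
EVENTS = []  # stands in for the Redis Stream the frontend would read

def task(fn):
    """Register a pipeline step and emit a progress event after it runs
    (plays the role of a Celery @app.task plus an xadd per step)."""
    def wrapper(ctx):
        result = fn(ctx)
        EVENTS.append({"step": fn.__name__, "status": "done"})
        return result
    PIPELINE.append(wrapper)
    return wrapper

@task
def pull_repo(ctx):
    ctx["files"] = ["app/main.py", "app/utils.py"]  # placeholder for a real git pull
    return ctx

@task
def analyze_code(ctx):
    ctx["stats"] = {"file_count": len(ctx["files"])}
    return ctx

def run_pipeline():
    """Run each registered step in order, threading the context through."""
    ctx = {}
    for step in PIPELINE:
        ctx = step(ctx)
    return ctx
```

Because each step only depends on the context passed to it, workers can pick steps up independently, and the frontend subscribes to the event stream instead of polling the API — which is where the drop in request volume came from.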
Why I moved on: Disagreements over some architectural decisions and tech-stack practices. I paused at the end of May, took a month off, then resumed part-time from September through December while focusing on Tusdi AI.
Aug 2024
A finance platform using LLMs to analyze company reports — quarterly, annual, internal documents — and help banks make credit approval decisions.
XLRT could parse report files, extract financial information, and calculate metrics like revenue growth, debt ratios, and profitability — all displayed on per-company dashboards with heavy use of charts and tables. The frontend challenge wasn't that chart libraries were bad — it was that we needed a super-flexible dashboard highly tailored to each company, and at a scale of thousands of companies, no human could handle that level of customization. A huge layer of abstraction was needed in the code. The API responses and JSON structures were so dynamic that we ended up rendering entire tab layouts based on what the backend returned — normally I'd say the frontend should own how data is displayed, not the backend, but the flexibility demanded it.
I also worked on the backend side — integrating chat functionality and report generation using selected financial indicators. Message passing ran through RabbitMQ, and automatic report generation used TinyMCE on the frontend, with plans to migrate to Tiptap for better customization. As complexity grew, the project slowed down, which didn't sit well with the allocated budget and team size.
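The backend-driven layout idea, reduced to its core: the client walks a layout spec from the API and dispatches each node to a renderer by type. The real code was React; this Python sketch with invented node types and fields just shows the dispatch shape:

```python
def render_chart(node):
    # Placeholder for mounting a chart component with the given metric.
    return f"<chart metric={node['metric']}>"

def render_table(node):
    # Placeholder for mounting a table component with the given columns.
    return f"<table columns={len(node['columns'])}>"

# Registry mapping widget types the backend may send to client-side renderers.
RENDERERS = {"chart": render_chart, "table": render_table}

def render_tab(spec):
    """Render a tab from a backend-provided layout spec, skipping unknown nodes."""
    out = []
    for node in spec.get("widgets", []):
        renderer = RENDERERS.get(node.get("type"))
        if renderer:  # unknown widget types degrade gracefully instead of crashing
            out.append(renderer(node))
    return out
```

The key property is that adding a new widget type to a thousand dashboards is a backend change plus one new renderer — no per-company frontend work.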
Why I moved on: Complexity outpaced the budget and team capacity. I was also just tired of freelancing — too much time spent on negotiations, milestones, and chasing payments instead of building. Codevalet came along and offered what I'd been wanting for a long time: just write code and build cool things.
Jan 2024
A visual website builder competing with Webflow, Framer, and studio.design — I joined as a freelancer and ended up reshaping its frontend architecture.
I started by fixing frontend bugs, but quickly moved into the core of the product — building a separate library of customizable components for the visual builder and rearchitecting the project to be scalable and easy to develop on. One solution I'm still proud of was automating custom domain assignment — the way Framer and Vercel do it — where users could assign their domain, get DNS configuration instructions, and the system would automatically create the Kubernetes resources: SSL certificates, ingress rules, and routing to the exact project based on the domain name. I still think Kubernetes was a brilliant choice there — infinitely scalable and highly automatable. This was also right when ChatGPT exploded. We were among the first to try generating UI with it — back on GPT-3.5 Turbo, before Vercel's AI package even existed as an idea. The challenges with instruction following and structured outputs were real, and the countless 'build a web page with a prompt' projects weren't a thing yet.
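The domain automation boiled down to generating Kubernetes resources per domain. Here is a rough sketch of the Ingress side — the naming scheme, issuer, and port are illustrative, and in production the manifest would be submitted through the Kubernetes API, with cert-manager issuing the SSL certificate from the annotation:

```python
def ingress_for_domain(domain: str, project_id: str) -> dict:
    """Build a networking.k8s.io/v1 Ingress manifest routing a custom domain
    to the service that serves one user's project."""
    slug = domain.replace(".", "-")
    return {
        "apiVersion": "networking.k8s.io/v1",
        "kind": "Ingress",
        "metadata": {
            "name": f"site-{slug}",
            "annotations": {
                # cert-manager watches this annotation and issues a TLS cert
                # for the hosts listed under spec.tls (issuer name is illustrative).
                "cert-manager.io/cluster-issuer": "letsencrypt-prod",
            },
        },
        "spec": {
            "tls": [{"hosts": [domain], "secretName": f"tls-{slug}"}],
            "rules": [{
                "host": domain,
                "http": {"paths": [{
                    "path": "/",
                    "pathType": "Prefix",
                    # Route to the per-project service (naming is hypothetical).
                    "backend": {"service": {"name": f"project-{project_id}",
                                            "port": {"number": 80}}},
                }]},
            }],
        },
    }
```

Once the user's DNS pointed at the cluster, creating this one resource gave them routing and HTTPS automatically — which is exactly why Kubernetes felt like the right substrate for the feature.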
Why I moved on: We couldn't come to an agreement on project budget with the owners.
Jun 2023
After graduating from TSUE and parting ways with YoFi, I poured months into my own project — an early attempt at unifying AI agents before the world had a word for them.
Asy AI wasn't just another LLM wrapper. At its core, it was an AI-enabled Zapier — a workflow builder with a visual editor where developers could build and host their own agents, and users could grant those agents permissions to the platforms they wanted automated. Our approach was unique — we unified multiple models, picking the best performers for specific tasks rather than relying on a single LLM for everything.
We built a Shopping Agent that could take orders via natural language and place them on integrated online shopping platforms; a Lexpert Agent — an experiment in legal AI that could research court cases and analyze contracts; and a Mail Manager Agent, integrated directly with Gmail, that could read, analyze, and respond to emails based on your instructions.
Terms like 'AI agents,' 'automated AI workflows,' and 'agentic systems' were barely concepts back then — not business realities. The capabilities of AI on such complex tasks were limited and the chances of success were minimal, but we managed to make several of these agents work with surprisingly high quality. AI has evolved massively since then and many great projects have appeared, but I still believe the approach was sound — if we could have secured real investment back then, it had legs.
I prepared it for the Prezident Tech Awards at awards.gov.uz — a competition with a $1 million prize fund and $100K for first place. I made it to the final phase but failed miserably at the presentation. I couldn't explain what it was and why anyone needed it in a way that landed. Most of the concepts were simply too early for business at that time.
Why I moved on: The presentation didn't succeed — mostly my lack of presenting skills and the difficulty of conveying a fundamentally new approach in an understandable way. Finances from YoFi were already running tight and I needed a source of income, so I moved to freelance work on ICOMS.
Dec 2021
Built backend infrastructure and ML pipelines for YoFi — an AI platform helping e-commerce merchants combat bots, fraud, and unauthorized resale.
Engineered serverless applications and API infrastructure across AWS and GCP. Developed ML models predicting customer order scores — return/cancellation likelihood, fraud detection, and promo-code abuse. Built a cutting-edge service for e-commerce merchants to combat bots and unauthorized resale of newly released products, and a secure identity platform to safeguard business profits and enable confident transaction decisions. Orchestrated 20+ specialized microservices and led data pipeline construction that improved ML workflow efficiency by 40%, using Compute Engine, Dataproc, Apache Airflow, and BigQuery. Managed SNS, SQS, DynamoDB, Neptune, and other AWS services, and handled a database migration to MongoDB for significant cost savings. Built scalable Lambda applications for Shopify API integration, ensuring seamless onboarding for store owners. Contributed to the React frontend. Directed a high-availability backend infrastructure with 99.9% uptime.
Why I moved on: Paused my contract to focus on diploma work as an undergraduate. The company later laid off part of the team for cost optimization.
Sep 2019
A chapter I remember with real warmth — helping Uzbek scholars get seen by the world. We maintained publishing systems, built a national research platform, and learned a lot along the way.
It started with maintaining over 10 OJS (Open Journal Systems) installations for universities across Uzbekistan — JSPI, ADTI, TSUE, Karakalpakstan Medical Institute, and others. The goal was simple but meaningful: increase the visibility of Uzbek researchers in global search engines and scholar systems like Google Scholar, ORCID, and Scopus. I also helped with technical decisions and maintenance of Moodle installations at some of these universities, though that was a smaller part of my work. The bigger story was scienceweb.uz — a platform we decided to build to unify all of this. Instead of scattered journal systems, we wanted one focused platform with proper SEO optimization, indexing, and deep integrations with Scopus, Web of Science, ORCID, Google Scholar, and Crossref for automatic DOI resolution. I built the entire stack singlehandedly — Nuxt.js/Vue.js on the frontend, Laravel on the backend, with Python and Node.js modules for integrations and an OCR subsystem for scanned PDFs. It grew to 14,000+ publications from 4,000+ researchers. Scienceweb was a unique project in the region, but unfortunately it never found enough of an audience to cover its own costs. We kept maintaining it and pushing it forward, but it remained a labor of love more than a business.
Why I moved on: YoFi offered me orders of magnitude better compensation and an entrance into ML — a space I had been studying on my own for over two years. I handed the project over to the team and moved on.
Mar 2018
Not an engineering job — but the one that taught me how businesses actually work. I learned insurance, banking, and corporate operations from the inside, and quietly built software that changed how our branch operated.
I didn't walk in as an engineer. I was at the very beginning of my professional years, hired for economic and legal work — verifying insurance coverage, handling claims, managing authorizations, and serving customers at the counter. But that experience shaped how I think about software to this day. I stopped seeing code as a toy and started seeing it through the lens of business value, consumer needs, and project economics. The entire company ran on a painfully old internal database — used across all branches so headquarters could track what contracts were being created and what policy numbers were in use. It was more of a monitoring and auditing system than anything useful. Contracts and insurance policies were created manually. Reports were assembled by hand in Excel spreadsheets. Everything that could go wrong with manual data entry did. As the lowest person in the hierarchy, I couldn't change the massive legacy system across the company — and honestly, I probably would have failed trying at that experience level. So I built something small: a tool for our branch that generated contracts and policies ready for printing. It cut human-factor errors by over 70% and brought contract creation time from 10 minutes down to 30 seconds. On busy days, we went from serving a customer in 10–30 minutes to under 5. When supervisors at the main branch saw what I'd built, they invited me to work at headquarters. I never took that chance.
Why I moved on: I was already tired of banks and insurance and wanted to spend more time coding. Around the same time, I started studying Corporate Governance at Tashkent State University of Economics (TSUE) — and I-Edu Group came calling.