LeetCode for System Design
43 comments · June 23, 2025 · nelsonfigueroa
koiueo
> marketing fluff
I bet this is not marketing, just an artifact of intensive vibe-coding
rbajp
Thanks for taking a look! I initially built this tool for myself and my friends to help us prep, and they're the ones who encouraged me to release it publicly. Good catch on those issues – will get those fixed soon.
The goal is absolutely the opposite of making system design interviews more miserable. Right now, preparing for these interviews is really gatekept – either you need friends in big tech willing to help you practice, or you have to pay hundreds of dollars for one-on-one sessions.
We're trying to change that, now that it's possible with AI.
godot
> Right now, preparing for these interviews is really gatekept – either you need friends in big tech willing to help you practice, or you have to pay hundreds of dollars for one-on-one sessions.
So I'm probably not your target audience (I haven't had to interview for a software eng job in at least 7-8 years), but from my past experience with these interviews, system design is typically the most sensible part of the interview process. It really does just test whether you have experience solving those problems, or, if you're less senior, whether you can at least think through the problems logically. There are even pretty good system design interview books out there. What do you think is gatekept about this?
Leetcode took off as a trend because it wasn't easy for company interviewers to come up with good coding questions, partly because the daily work of most software engineers at most companies simply doesn't involve much tricky programming. System design is the opposite: most software engineers work on system design day to day, and most companies can simply ask system design questions closely related to their own problems.
Looking at it from this lens, I'm not sure what a leetcode for system design adds in value, that a system design interview book doesn't already give.
baq
> Looking at it from this lens, I'm not sure what a leetcode for system design adds in value, that a system design interview book doesn't already give.
What does leetcode for code add if you can read a book about algorithms?
Rhetorical question, obviously. Practice makes perfect; reading about practice doesn't. Interviews are not even close to what an engineer's job looks like, so they need to be approached like any other skill you need to learn, i.e. with practice.
fermentation
How will this not simply encourage hiring managers to include more and more ridiculous system design questions? Prior to the popularity of leetcode, you weren't expected to solve a leetcode hard.
eunos
System design questions are already ridiculous. It seems you need to follow 100% of the script they already have in mind, fluently and smoothly.
jpgvm
They aren't meant to be, but this will definitely push them that way.
In the past I have generally just had a list of dimensions to help the candidate explore, like separation of concerns, scalability on different system dimensions (concurrent txns, storage, memory, etc), analogies to existing systems/patterns, etc.
I've never had a script, merely a problem with a fairly generous solution space and a list of increasingly difficult-to-satisfy requirements, in order to put even the best candidates under just a little pressure.
For the last decade, HR has tried to completely ignore that and instead tried to quantify candidates' "goodness" with scores, scripts and other bullshit. This has had the rather obvious outcome of missing really good folks who didn't fit into their box and hiring utter trash that gamed their stupid metrics. They keep telling me this is "industry standard" and "how Google does it", but only the latter is actually true; the former was forced for no reason whatsoever. They conveniently leave out that the reason Google did this for so long is that it completely over-indexed on hiring fresh graduates with no experience and little to no intuition or real-world knowledge, and as such needed to focus entirely on IQ-test-esque questions just to try to filter for the top X% of otherwise indistinguishable candidates. None of which is relevant for small teams hiring 10yr+ industry seniors with relevant domain expertise.
Interviews are meant to be about working out if someone will be successful on your team, that means determining if they have the technical chops, a decent enough communication style and enough experience/intuition to work in unfamiliar problem spaces effectively.
Really all you need is the vibe check, a good collaborative systems design exercise helps explore that vibe and quickly separates the pretenders from people with the required knowledge and intuition.
flare_blitz
Thanks for posting. I like the concept but ran into several showstopping bugs. I clicked on the "Start Interview" button and received a "No response from Gemini" error. When I exited the interview, it said that I had used all free interviews and suggested that I upgrade to Premium for $5 a month.
With all due respect, why would I pay $5 a month for something that self-implodes as soon as I hit one button?
captainzidgel
Bottom right: "By signing in, you agree to our Terms of Service and Privacy Policy", but there is no button or apparent link to click to show these TOS/PP.
eranation
How is this different from a “you are an expert system designer, please generate a system design interview question for {COMPANY}” OpenAI API wrapper?
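(For context, the kind of wrapper being described is only a few lines. This is a hypothetical sketch of the commenter's point, not how LeetSys is actually implemented; the model name and prompt wording are placeholders.)

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_question(company: str) -> str:
    """Thin prompt wrapper of the sort the comment above describes."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "You are an expert system designer. Please generate "
                       f"a system design interview question for {company}.",
        }],
    )
    return response.choices[0].message.content

print(generate_question("Acme Corp"))
```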
rbajp
Well, it's still in very early stages, but the main benefit going forward will be the problem bank and the diagramming experience.
jedberg
I wanted to try this but sadly it just keeps saying "no response from gemini".
rbajp
Made a hotfix for this - but it may still be buggy
rbajp
Working on it!
low_tech_punk
Perfect example of show, don't tell - the project just shows how AI system design can go wrong.
dsab
I don't have a Google account to log in to this site, so it's useless for me.
tayo42
This kind of thing frustrates me so much. Hiring and interviewing for software engineers is just so broken and detached from reality.
In theory, this should be the easiest discussion to pass while interviewing if you've done some kind of related work or are otherwise qualified. Instead it's just turning into another insane cargo-culted game where you need to do things according to some weird rules and hit certain buzzwords.
motorest
> In theory, this should be the easiest discussion to pass while interviewing if you've done some kind of related work or are otherwise qualified. Instead it's just turning into another insane cargo-culted game where you need to do things according to some weird rules and hit certain buzzwords.
Your comment reads as if you think system design interviews are good if you can pass them but they are bad if you fail them.
ddimitrov
I've failed a system design interview where they repeatedly asked me to design "for scale", yet all the volume and latency requirements I could get out of them could be handled by a single HA pair running a monolithic server plus an external managed data store.
They also seemed annoyed that I was asking them questions that were not part of the problem statement instead of getting down to drawing a fancy diagram.
I'm sure they hired someone who drew a lot of boxes with cute logos because webscale.
danpalmer
Latency and volume aren't the only requirements; perhaps you failed to discover the other requirements of the system?
- In a fast system most of the latency will come from geographical distance; an HA pair can't serve the whole world with low latency.
- How do you handle new releases? You can do HA with 2 machines, or you can do safe rollouts with blue/green machines, but you can't do HA and safe rollouts with just 2 machines.
- What if you want to test a new release with 1% of traffic? 100 small machines might be preferable to 1 big machine, or 10 medium machines each running 10 instances, or whatever (see the sketch after this list).
- What does the failover look like?
- You use an "external managed data store", but often the tricky bit is the data store. Externalising this may be the most practical option, but it doesn't communicate that you know how it needs to function.
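As a purely illustrative aside on the 1% canary point above: with a fleet of many small, interchangeable instances, a uniform load balancer already gives you the traffic split for free. A minimal Python sketch, where the fleet size, hostnames, and release labels are all made up:

```python
import random

# Hypothetical fleet: 99 instances on the stable release, 1 on the canary.
FLEET = [{"host": f"stable-{i}", "release": "v1"} for i in range(99)]
FLEET.append({"host": "canary-0", "release": "v2"})

def pick_backend():
    """Uniform choice over the fleet, so ~1% of requests hit the canary."""
    return random.choice(FLEET)

# Simulate 100k requests and measure the canary's share of traffic.
hits = sum(pick_backend()["release"] == "v2" for _ in range(100_000))
print(f"canary share: {hits / 100_000:.2%}")  # roughly 1.00%
```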
Alternatively this might have just been a bad interview. Many are.
motorest
> I've failed a system design interview where they repeatedly asked me to design "for scale", yet all the volume and latency requirements I could get out of them could be handled by a single HA pair running a monolithic server plus an external managed data store.
It's nice that you estimated throughputs and provided your assessment. Odds are that's not the problem you were tasked to solve?
I mean, arguing your way out of proposing a design for scale reads as if you're taking a test and claiming you shouldn't be asked some of the questions.
> They also seemed annoyed that I am asking them questions that are not part of the problem statement instead of getting down to drawing a fancy diagram.
Clarifying questions are an expected part of the process, but by its own nature the design process is iterative and needs an initial design to iterate on. If you do not show forward progress or present any tangible design, your own system design skills are immediately called into question since, at the very least, it shows you're succumbing to analysis paralysis.
Think about it: you're presented with a problem, and your contribution is to complain about the problem instead of actually offering a solution? What do you think is the output valued by interviewers?
weq
Regurgitating what amounts to AI slop these days paints a clear picture of your company to the market. Companies that use l33tcode are filtering out pools of talent in exchange for monocultures of echo chambers. It makes it easier to swing the layoff axe when they need to posture to the market differently.
The fact that this ^^^ is run by AI shows just how irrelevant technical interviews like this are these days. Why take a closed-book test for a job where you never have to work a day of your life with the book closed?
If you are an experienced dev, you can rank other devs pretty quickly with a general conversation about any technology.
clippy99
What the poster is insinuating is that system design interviews may devolve into a scripted / memorized hoop-jumping experience rather than being a creative technical problem solving discussion.
motorest
> What the poster is insinuating is that system design interviews may devolve into a scripted / memorized hoop-jumping experience rather than being a creative technical problem solving discussion.
I think you're trying to rationalize away the fact that system design involves knowing patterns and how to apply them.
In each and every single technical field, it's good to be able to improvise but it's even better to know what you are doing.
What you dismiss as a "scripted / memorized hoop-jumping experience" actually translates to theoretical and practical knowledge for solving specific problems. Improvisation is a last resort for fixing problems you've never faced before.
So yeah, this blend of criticism boils down to complaining that the only good tests are the ones you can pass, and that everyone who is more experienced, prepared, and outright competent should not be given preferential treatment.
tayo42
Thinking of it as pass or fail is really the problem. That leads to having some kind of criteria that ends up being gamed. Which is what happened to this style of interviewing.
"system design" as it's done now by everyone is a bad interview whether I can pass it or not. You just need to hit the right buzzwords or draw the right boxes. How is that effective?
motorest
> Thinking of it as pass or fail is really the problem. That leads to having some kind of criteria that ends up being gamed. Which is what happened to this style of interviewing.
What do you think a technical interview is? Everyone steps into a job interview hoping to outperform all other candidates on whatever criteria they will be evaluated on. Vibes play a role, but don't you think that being able to demonstrate technical skills matters?
supriyo-biswas
As far as system design is concerned, I feel like I have an answer as to why they're like this.
Typically, FAANG (and wannabe-FAANG) companies have overindexed on algorithm questions, which were meant to draw people from a research/mathematics-heavy background. That was well aligned with their initial needs, but it's also how they committed a sort of "original sin."
Since then they've forgotten to balance these requirements against the business applications they build, which involve CRUD-heavy work and usually require knowledge of how systems (databases, queues, performance tuning, load testing, etc.) work. Because said companies continue to overindex on algorithm skills, that knowledge is deprioritized.
When systems fail to deliver the expected performance, these software engineers seek new solutions with wild tradeoffs that may not have been required if they had been able to come up with the right system model for the software. As an example, at two FAANGs that I've closely studied (one of which I worked at), people always seem to be selecting serverless functions for scalability, and then adding queues and another set of serverless functions to work the queues and slowly write to a database, only to work around the fact that it may have been easier to just have a background thread in a server-based model that processes results and commits them. (This is just one example; I'm sure there are cases where a queue may have been necessary to facilitate an async process. This doesn't discount the fact that serverless is usually the wrong model for most stuff out there.)
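For illustration only, here is a minimal sketch of the background-thread alternative described above, assuming a simple in-process queue; `db_commit` is a hypothetical stand-in for a batched database write, not any particular system's API:

```python
import queue
import threading
import time

def db_commit(batch):
    # Hypothetical stand-in for a real batched INSERT in one transaction.
    print(f"committed {len(batch)} results")

results = queue.Queue()

def worker(batch_size=100, flush_interval=1.0):
    """Drain the in-process queue and commit results in batches."""
    batch, last_flush = [], time.monotonic()
    while True:
        try:
            batch.append(results.get(timeout=flush_interval))
        except queue.Empty:
            pass
        if batch and (len(batch) >= batch_size
                      or time.monotonic() - last_flush >= flush_interval):
            db_commit(batch)
            batch, last_flush = [], time.monotonic()

threading.Thread(target=worker, daemon=True).start()

# Request handlers just enqueue; the background thread absorbs write bursts.
for i in range(250):
    results.put({"id": i})
time.sleep(3)  # give the worker time to flush before the demo exits
```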
It also has a secondary effect on the software engineering market, where said engineers are now able to market their overly complicated solutions, selling them as THE way to build software and dismissing everything else as a toy. This also helps said FAANGs market their cloud services, with salespeople excitedly speaking about some supposed "innovation".
I remember a video from a FAANG-organized event where they talked about how their database supports PITR (point-in-time recovery) up to the second. At the time, I was young, impressionable and lacked said knowledge, and I came away mesmerized by how they could do this, thinking I could never write code to do that. Many years later, having read some introductory material about distributed systems and MySQL internals, it occurred to me: why wouldn't any database worth its salt support PITR? After all, it's just replicating the WAL entries to another host!
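As a toy illustration of that replay idea (the entry format and function below are invented for this sketch and have nothing to do with MySQL's actual binlog mechanics), point-in-time recovery amounts to restoring a snapshot and then applying log entries up to the target timestamp:

```python
from dataclasses import dataclass

@dataclass
class WalEntry:
    ts: float   # commit timestamp
    key: str
    value: str

def point_in_time_recover(base_snapshot, wal, target_ts):
    """Restore the snapshot, then replay WAL entries with ts <= target_ts."""
    state = dict(base_snapshot)
    for entry in wal:          # assumed sorted by commit timestamp
        if entry.ts > target_ts:
            break
        state[entry.key] = entry.value
    return state

wal = [WalEntry(1.0, "a", "1"), WalEntry(2.0, "b", "2"), WalEntry(3.0, "a", "3")]
print(point_in_time_recover({}, wal, target_ts=2.5))  # {'a': '1', 'b': '2'}
```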
However, said practices have already taken hold, and it has lead to a generation of engineers who would continue to seek cloud services and design overly complicated solutions.
I have a lot of feedback.
No links to the Terms of Service and Privacy Policy.
No way to preview without signing in.
Only way to sign in is with Google.
"Trusted by engineers landing jobs at" ...given how new this is, is this line marketing fluff or is there evidence that engineers actually trust LeetSys?
Finally, I'm worried that this will make system design interviews as miserable as coding interviews.