Many hard LeetCode problems are easy constraint problems
205 comments · September 12, 2025 · tracker1
samiv
The LC interviews are like testing how fast people can run 100m after practice, while the real job is a slow, arduous, never-ending jog with multiple detours and stops along the way.
But yeah that's the game you have to play now if you want the top $$$ at one of the SMEGMA companies.
I wrote (for example) my 2D game engine from scratch (3rd party libs excluded)
https://github.com/ensisoft/detonator
but would not pass a LC type interview that requires multiple LC hard solutions and a couple of backflips on top. But that's fine, I've accepted that.
MarcelOlsz
5 years ago you'd have a project like that, talk to someone at a company for like 30m-1hr about it, and then get an offer.
MarcelOlsz
100%. I just went through an interview process where I absolutely killed the assignment (had the best one they'd seen), had positive signal/feedback from multiple engineers, CEO liked me a lot etc, only to get sunk by a CTO who thought it would be cool to give me a surprise live test because of "vibe coding paranoia". 11 weeks in the process, didn't get the role. Beyond fucking stupid.
This was the demo/take-home: https://github.com/rublev/monumental
_whiteCaps_
A surprise live test is absolutely the wrong approach for validating whether someone's done the work. IMO the correct approach is to go through the existing code with the applicant and have them explain how it works. Someone who used AI to build it (or in the past had someone else build it for them) wouldn't be able to do a deep dive into the code.
MarcelOlsz
We did go into the assignment after I gently bowed out of the goofy live test. The CTO seemed uninterested & unfamiliar with it after returning from a 3 week vacation during the whole process. I waited. Was happy to run him through it all. Talked about how to extend this to a real-world scenario and all that, which I did fantastically well at.
johnfn
It's funny because this repo really does seem vibe-coded. Obviously I have no reason not to believe you, but man! All those emojis in the install shell script - I've never seen anyone other than an AI do that :) Maybe you're the coder that the AI companies trained their AI on.
Sorry about the job interview. That sucks.
MarcelOlsz
I used AI for the Docker setup which I've already done before. I'm not wasting time on that. Yeah you can vibe code basic backend and frontend and whatnot, but you're not going to vibe code your way to a full inverse kinematics solution.
I'm not a math/university educated guy so this was truly "from the ground up" for me despite the math being simple. I was quite proud of that.
tracker1
Damn... that's WAY more than I'll do for an interview process assignment... I usually time box myself to an hour or two max. I think the most I did was a tic-tac-toe engine but ran out of time before I could make a UI over it.
MarcelOlsz
I put absolutely every egg into that basket. The prospect of working in Europe (where I planned to return eventually) on cool robot stuff was enticing.
The fucking CTO thought I vibe-coded it and dismissed me. Shout-out to the hiring manager though, he was real.
_whiteCaps_
That is an insane amount of work for a job application. Were you compensated for it at all?
MarcelOlsz
No. Should I invoice them? I'm still livid about it. The kicker is the position pays a max of 60-120k euros, the maximum being what I made 5 years ago.
samiv
Wait, what... you did this as a take-home for a position? Damn, that looks excessive.
MarcelOlsz
Yes. I put a ton of work into it. I had about 60 pages' worth of notes on inverse kinematics, FABRIK, cyclic algorithms used in robotics, A*/RRT for real-world scenarios, etc. I was super prepared. Talked to the CEO for about two hours. Took notes on all the videos I could find of team members and their company on YouTube.
Luckily the hiring manager called me back and levelled with me, nobody kept him in the loop and he felt terrible about it.
Some stupid contrived dumbed down version of this crane demo was used for the live test where I had to build some telemetry crap. Nerves took over, mind blanked.
Here are the take-home assignment requirements btw: https://i.imgur.com/HGL5g8t.png.
Here are the live assignment requirements: [1] https://i.imgur.com/aaiy7QR.png & [2] https://i.imgur.com/aaiy7QR.png.
At this rate I'm probably going to starve to death before I get a job. Should I write a blog post about my last 2 years of experiences? They are comically bad.
This was for monumental.co - found them in the HN who's hiring threads.
another_twist
It's not really memorizing solutions. Yes, you can get quite far by doing so, but follow-ups will trip people up. However, if you have memorized it and can answer follow-ups, I don't see a problem with Leetcode-style problems. Problem solving is about pattern matching, and the more patterns you know and can match against, the better your ability to solve problems.
It's a learnable skill and better to pick it up now. Personally I've solved Leetcode-style problems in interviews which I hadn't seen before, and some of them were dynamic programming problems.
These days it's a highly learnable skill since GPT can solve many of the problems, while also coming up with very good explanations of the solution. Better to pick it up than not.
silisili
It is and isn't. I'd argue it's not memorizing exact solutions (think copy-paste) but memorizing the fastest algos to accomplish X.
And some people might say, well, you should know that anyways. The problem for me is, and I'm not speaking for every company of course, that you never really use a lot of this stuff in most run-of-the-mill jobs. So of course you forget it, then have to study again pre-interview.
Problem solving is the best way to think of it, but it's awkward for me (and probably others) to spend minutes thinking, feeling pressured as someone just stares at you. And that's where memorizing the hows of typical problems helps.
That said, I just stopped doing them altogether. I'd passed a few doing the 'memorizing' described above, only to start and realize it wasn't at all relevant to the work we were actually doing. In that way I guess it's a bit of a two-way filter now.
bluGill
The only part of "memorizing the fastest algorithm" that the vast majority needs is whatever name it goes by in your library. Generic reusable code works very well in almost any language for algorithms.
Even if you are an exception, either you are writing the library, meaning you write that algorithm once for the hundreds of other users, or the algorithm was written once (long ago) and you are just spending months with a profiler trying to squeeze out a few more CPU cycles of optimization.
There are more algorithms than anyone can memorize that are not in your library, but either it is good enough to use a similar one that is already in your library, or you will build it once and, once again, it works so you never go back to it.
Which is to say, memorizing how to implement an algorithm is a negative: it means you don't know how to write/use generic reusable code. This lack is costing your company hundreds of thousands of dollars.
Freedom2
"Fastest algos" very rarely solve actual business problems, which is what most of us are here to do. There's some specialized fields and industries where extreme optimization is required. Most of software engineer work is not that.
HarHarVeryFunny
I'd say that learning to solve tough LeetCode problems has very little (if not precisely zero) value in terms of you as a programmer learning to do something useful. You will extremely rarely need to solve these types of tougher select-the-most-efficient-algorithm problems in most real-world S/W dev jobs, and nowadays if you do, then just ask AI.
Of course you may need to pass an interview LeetCode test, in which case you may want to hold your nose and put in the grind to get good at them, but IMO it's really not saying anything good about the kind of company that thinks this is a good candidate filter (especially for more experienced ones), since you'd have to be stupid not to use AI if actually tasked with needing to solve something like this on the job.
tracker1
I'm fine with that in an interview... I'm not fine with that in a literally AI-graded assignment where you cannot ask clarifying questions. In those cases, if I don't have a memorized answer, a lot of the time I can't even grasp the question at hand.
I've been at this for 30+ years now; I've built systems that handle millions of users and have a pretty good grasp of a lot of problem domains. I spent about a decade in aerospace/elearning and had to pick up new stuff and reason with it all the time. My issue is specifically with automated leetcode pre-interview screening, as well as the gamified sites themselves.
wyager
Leetcode with no prep is a pretty decent coding skill test
The problem is that it is too amenable to prep
You can move your score by something like 2 standard deviations with practice, which makes the test almost useless in many cases
On good tests, your score doesn't change much with practice, so the system is less vulnerable to Goodharting and people don't waste/spend a bunch of time gaming it
GuB-42
> My biggest problem with leetcode type questions is that you can't ask clarifying questions.
What is there to clarify? Leetcode-type questions are usually clear, much clearer than in real life projects. You know the exact format of the input, the output, the range for each value, and there are often examples in addition to the question. What is expected is clear: given the provided example inputs, give the provided example outputs, but generalized to cover all cases of the problem statement. The boilerplate is usually provided.
One may argue that this is one of the reasons why leetcode-style questions are unrealistic: they are too well specified compared to real-life problems, which are often incomplete or even wrong and require you to fill in the gaps. Also, in real life, you may not always get to ask for clarification: "here, implement this", "but what about this part?", "I don't know, and the guy who knows won't be back before the deadline, do your best".
The "coin" example is a simplification; the actual problem statement is likely more complete, but the author of the article probably felt these details were not relevant to the article, though they would be for someone taking the test.
strangattractor
IMO leetcode has multiple problems.
1. People can be hired to take the test for you - surprise surprise.
2. It is akin to deciding if someone can write a novel from reading a single sentence.
another_twist
Hiring people for the test is only valid for an online assessment. For an onsite, it's very obvious if the candidates have cheated on the OA. I've been on the other side and it's transparent.
> It is akin to deciding if someone can write a novel from reading a single sentence.
For most decent companies, the hiring process involves multiple rounds of these challenges along with system design. So it's like judging writing ability by having candidates actually write and come up with sample plots. Not a bad test.
lawlessone
The ones I've gotten have all seemed more like tests of my puzzle-solving skills than coding.
The worst ones I've had came with extra problems, though:
One I was only told about when I joined the interview, and that they would be watching live.
One where they wanted me streaming my face the whole time (maybe some people are fine with that).
And one that would count it against me if I tabbed to another page. So no documentation, because they assume I'm just googling it.
Still, it's mostly on me to prepare and expect this stuff now.
another_twist
You can make up API calls which you can say you'd implement later. As long as these are not tricky blocks, you'll be fine.
garrettgarcia
> My biggest problem with leetcode type questions is that you can't ask clarifying questions.
Huh? Of course you can. If you're practicing on leetcode, there's a discussion thread for every question where you can ask questions till the cows come home. If you're in a job interview, ask the interviewer. It's supposed to be a conversation.
> I wouldn't even mind the studying on leetcode-type sites if they actually had decent explainers
If you don't find the hundreds of free explanations for each question to be good enough, you can pay for Leetcode Pro and get access to editorial answers which explain everything. Or use ChatGPT for free.
> It's not a matter of skill, it's just my ability to take in certain types of problems doesn't work well.
I don't mean to be rude, but it is 100% a matter of skill. That's good news! It means if you put in the effort, you'll learn and improve, just like I did and just like thousands and thousands of other humans have.
> Without any chance of additional info/questions it's literally a setup to fail.
Well with that attitude you're guaranteed to fail! Put in the work and don't give up, and you'll succeed.
tracker1
Last year, I saw a lot of places do effectively AI/automated pre-interview screenings with a leetcode web editor and video capture... This is what I'm talking about.
I'm fine with hard questions in an actual interview.
another_twist
> My biggest problem with leetcode type questions is that you can't ask clarifying questions.
Yeah, this one confused me. Not asking clarifying questions is one of the sure-shot ways of failing an interview. Kudos if the candidates ask something that the interviewers haven't thought of, although it's rare as most problems go through a vetting process (along with leak detection).
epolanski
Many interviews now involve automated exercises on websites that track your activity (don't think about triggering a focus change event on your browser, it gets reported).
Also, the reviewer gets an AI report telling them whether you copied the solution from somewhere (expressed as a % probability).
You have a few minutes and you're on your own.
If you pass that abomination, maybe, you get the in-person ones.
It's ridiculous what software engineers impose on their peers when hiring; ffs, lawyers, surgeons, and civil engineers get NO practical nor theoretical test, none.
SAI_Peregrinus
At least in the US, lawyers, surgeons, & civil engineers all have accredited testing to even enter the profession, in the form of the bar exam, boards, and FE & PE tests respectively. So they do have such theoretical tests, but only when they want to gain their license to practice in a given state. Software doesn't have any such centralized testing accreditation, so we end up with a mess.
dmoy
The major difference between software devs and lawyers, surgeons, and civil engineers is that the latter three have fairly rigorous standards to pass to become a professional (bar, boards, and PE).
That could exist for software too, but I'm not sure HN folks would like that alternative any better. Like if you thought memorizing leetcode questions for 2 weeks before an interview was bad, well I have some bad news.
Maybe in 50-100 years software will have that, but things will look very different.
lukan
"don't think about triggering a focus change event on your browser, it gets reported)."
So .. my approach would be to just open dev tools and deactivate that event.
Show of practical skill or cheating?
ok123456
How does asking clarifying questions work when a non-programmer is tasked with performing the assessment, because their programmers are busy doing other things, or find it degrading and pointless?
tbmbob
> Now if I actually brought these questions to an interview the interviewee could ruin my day by asking "what's the runtime complexity?"
This completely undermines the author's main point. Constraint solvers don't solve hard leetcode problems if they can't solve large instances quickly enough.
Many hard leetcode problems can be solved fairly simply with more lax runtime requirements -- coming up with an efficient solution is a large part of the challenge.
holden_nelson
I feel like if I'm being asked this in an interview, they're not asking me to use a constraint solver, they're asking me to _write_ a constraint solver. Just for a specific constraint problem, not a more general constraint solver.
gnfargbl
You're right, but that just shows how fundamentally silly this interview approach is.
In any real engineering situation I can solve 100% of these problems. That's because I can get a cup of coffee, read some papers, look in a textbook, go for a walk somewhere green and think hard about it... and yes, use tooling like a constraint solver. Or an LLM, which knows all these algorithms off by heart!
In an interview, I could solve 0% of these problems, because my brain just doesn't work that way. Or at least, that's my expectation: I've never actually considered working somewhere that does leetcode interviews.
segmondy
I was told to use ANY language in an interview. I asked them if they were sure, so I solved it with J. They were not too pleased and asked me if I could use another language, so I did prolog and we moved on to the next question. Then the idiot had the audacity to say I should not use "J and Prolog" but any common known language. I asked if assembly was fine, and they said no. Perhaps python or javascript. I did the rest in python, needless to say I didn't get the job. :-)
felixyz
You're a hero!
nice_byte
I find it hilarious when people brag about stupid shit like that. Congrats on sabotaging your own interview process I guess??
koakuma-chan
I haven't been asked leetcode questions in a while and when I was asked, it was an easy level problem. I don't know where they ask hard leetcode problems, I also never solved a hard leetcode problem on my own.
bluGill
Coding questions should be problems that you can solve in about 20 minutes; then they ask another, and then you get 20 minutes to either finish or talk about other things. If you ask questions where either someone knows the trick and they pass, or they don't and they fail, you don't learn much. You need to watch the person write code to see if they are reasonable about it.
bradlys
I'm routinely asked LC Hard questions in interviews. Sometimes more than one in one 45 minute interview.
That said, I interview in silicon valley and I'm a mixed race American. (extremely rare here) I think a lot of people just don't want me to pass the interview and will put up the highest bar they can. Mind you, I often still give optimal solutions to everything within good time constraints. But I've practiced 1000+ problems and done several hundred interviews.
lucideer
This will be true in some interviews, but not in all.
I'm generally against using leetcode in interviews, but wherever I've seen it used it's usually for one reason & one reason alone: known dysfunctional hiring processes. These are processes where the participants in the hiring process are aware of the dysfunction in their process but are either powerless or - more often - too disorganised to properly reform the process.
Sometimes this is semi-technical director-level staff leveraging HR to "standardise" interview techniques by asking the same questions across a wide range of teams within a large corp. Other times this is a small under-resourced team cobbling together interview questions from online resources in a hurry, not having the cycles to write a tailored process for themselves.
In these cases, you're very likely to be dealing with a technical interviewer who is not an advocate of leetcode interviewing & is attempting to "look around" the standardised interview scoring approach to identify innovative standout candidates. In a lot of cases I'd hazard even displaying an interest in / some knowledge of solvers would count significantly in your favour.
_alternator_
This. Literally every problem in NP can be cast as a constraint problem. The question of whether a solver is the right solution varies a lot depending on the application, and in an interview, it's almost by definition not the right solution.
They can also be dreadfully slow (and typically are) compared to just a simple dynamic program.
Der_Einzige
If someone solves a leetcode hard with a constraint solver and you don't hire them, you are an idiot.
Do you know how few people in this world even know what a constraint solver is, let alone how to correctly translate a problem into one?
I used a constraint solver to solve a homework problem once in the 3rd year of my CS degree. My god, just writing the damn constraints was a huge cognitive load!
hackingonempty
I did this: wrote an Essence-prime program to generate Minion solver code for a simple instance of the knapsack problem, as part of a startup's "solve one of these and get an interview" challenges. Because I had used those tools recently for a contract job (and wrote/presented a paper at the invitation of the solver authors), I thought it would be fun and didn't really want the job. Got an interview, but every dev was like "why did you use a cannon to swat a fly?" and was clearly concerned that without strict supervision I would create baroque towers of garbage for them to clean up.
xenocratus
> If someone solves a leetcode hard with a constraint solver and you don't hire them, you are an idiot.
I do hope you're exaggerating here, but in case you aren't: this is an extremely simplistic view of what (software) engineers have to do, and thus of what hiring managers should optimize for. I'd put "ability to work in a team" above "raw academic/reasoning ability" for the vast majority of engineering roles, any day.
Not that the latter doesn't matter, of course, but it's by no means the one and only measure.
lucianbr
> I'd put "ability to work in a team" above "raw academic/reasoning ability" for the vast majority of engineering roles, any day.
In this hypothetical, why do you do leetcode hard interviews?
bryanrasmussen
OK, but obviously this presupposes a job where the hiring process is focused on leetcode.
Der_Einzige
Hey I'm with you 100% about the idea of code-interviews/leetcode being a problem and the importance of culture-fit and ability to work on a team.
I should have said "if you deemed this a fail on the code interview, you are an idiot".
yogorenapan
I've won a couple of hackathons with just CP-SAT & linear programming, which led to my first jobs. I'm surprised more people don't know about/use it. Very inefficient compared to the "correct" answer, but the development speed is much faster.
> If someone solves a leetcode hard with a constraint solver and you don't hire them, you are an idiot
Sometimes you just don't want someone who takes these shortcuts. I think being able to solve the problem without a constraint solver is much more impressive.
Analemma_
Yes and no: I've asked questions like this in interviews, and I'd count it as a plus if the candidate reached for a constraint solver. They're criminally underused in real-world software engineering and this would show the candidate probably knows how to get the right answer faster instead of wasting a bunch of time.
Now, if they did answer with a constraint solver, I'd probably ask some followup whiteboard questions to make sure they do actually know how to code. But just giving a constraint solver as an answer definitely wouldn't be bad.
qnleigh
Yes, especially if the interviewee said something like 'this may not be asymptotically optimal, but if it's not a known bottleneck, then I might start with a constraint solver to get something working quickly and then profile later.' Especially if it's a case where even the brute-force solution is tricky.
Otherwise penalizing interviewees for suggesting quick-and-dirty solutions reinforces bad habits. "Premature optimization is the root of all evil," after all.
bluGill
Using a bad algorithm when a good algorithm is known to exist is premature pessimization and should be avoided.
There is some debate about what premature optimization is, but I consider it to be about micro-optimizations that are often doing things a modern compiler will do for you better than you can. All too often such attempts result in unreadable code that is slower because the optimizer would have done something different but now it cannot. Premature optimization is done without a profiler - if you have a profile of your code and can show a change really makes a difference, then it isn't premature.
On the other hand, job interviews imply time pressure. If someone isn't 100% sure how to implement the optimized algorithm without looking it up, brute force is faster and should be chosen. In the real world, if I'm asked to do something, I can spend days researching algorithms at times (though the vast majority of the time what I need is already in my language's standard library).
PartiallyTyped
It’d be a positive in my book if they used a constraint solver.
YetAnotherNick
A general constraint solver would be terribly inefficient for problems like these. It's a linear problem, and a constraint solver just can't handle O(10^6) variables without some beefy machine.
nextos
FWIW, the OP's problem is not linear. It's an integer programming problem.
A trick, if you can't come up with a custom algorithm and using a library is not allowed during the interview, could be to be ready to roll your own DPLL-based solver (it can be done in ~30 LOC).
Less elegant, but it's a one-size-fits-all solution.
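Something like this minimal sketch, for instance (assuming CNF input as a list of clauses, each clause a list of nonzero integer literals; it's an illustration of the idea, not a tuned solver):

def dpll(clauses, assignment=None):
    # Returns a satisfying {var: bool} assignment, or None if unsatisfiable.
    if assignment is None:
        assignment = {}
    changed = True
    while changed:  # unit propagation
        changed = False
        simplified = []
        for clause in clauses:
            lits, satisfied = [], False
            for lit in clause:
                val = assignment.get(abs(lit))
                if val is None:
                    lits.append(lit)         # still undecided
                elif (lit > 0) == val:
                    satisfied = True         # clause already true
                    break
            if satisfied:
                continue
            if not lits:
                return None                  # empty clause: conflict
            if len(lits) == 1:               # unit clause forces a value
                assignment[abs(lits[0])] = lits[0] > 0
                changed = True
            simplified.append(lits)
        clauses = simplified
    if not clauses:
        return assignment                    # every clause satisfied
    var = abs(clauses[0][0])                 # branch on an unassigned variable
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

e.g. dpll([[1, 2], [-1]]) returns {1: False, 2: True}, and dpll([[1, 2], [-1, 2], [-2]]) returns None.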
NoahZuniga
O(10^6) = O(1)
OutOfHere
Okay, but who says you need to use a simple constraint solver? There are various sophisticated constraint solvers that know how to optimize.
At this point, job interviews are so far removed from actual relevance. Experience and aptitude still matter a lot, but too much experience at one employer can ground people in rigid and limiting ways of thinking and solving problems.
kccqzy
Great insight. But this is sadly not applicable to interviews.
> It's easy to do in O(n^2) time, or if you are clever, you can do it in O(n). Or you could be not clever at all and just write it as a constraint problem
This nails it. The point of these problems is to test your cleverness. That's it. Presenting a not-clever solution of using constraint solvers shows that you have experience and your breadth of knowledge is great. It doesn't show any cleverness.
viccis
>The point of these problems is to test your cleverness.
In my experience, interviewers love going to the Leetcode "Top Interview 150" list and using problems in the "Array String" category. I'm not a fan of these problems for the kind of jobs I've interviewed for (backend Python mostly), as they are almost always a "give me a O(n) runtime O(1) memory algorithm over this array" type challenge that really doesn't resemble my day to day work at all. I do not regularly do in-place array algorithms in Python because those problems are almost always handled by other languages (C, Rust, etc.) where performance is critical.
I wish interviewers would go to the "Hashmap" section for interviews in Python, JavaScript, etc., type of languages. They are much less about cleverness and more about whether you can demonstrate using the appropriate tools in your language to solve problems that actually do resemble ones I encounter regularly.
There's also the problem of difficulty tuning on some of these. Problem 169 (Majority Element) being rated "Easy" for getting an O(n) runtime, O(1) memory solution is hilarious to me. The algorithm first described in 1981 that does it (the Boyer–Moore majority vote algorithm) has a Wikipedia page. It's not a difficult algorithm to implement or understand, but its correctness is not obvious until you think about it a bit, at which point you're at sufficient "cleverness" to get a Wikipedia page about an algorithm named after you. Seems excessive for an "Easy" problem.
bluGill
Interviews should not be about cleverness. They should test that you can code. I almost never write an algorithm because all the important algorithms are in my standard library already. Sure back in school I did implement a red-black tree - I don't remember if it worked, but I implemented it: I can do that again if you need me to, but it will take me several days to get all the details right (most of it looking up how it works again). I use red-black trees all the time, but they are in the language.
You need to make sure a candidate can program, so asking programming questions makes sense. However, the candidate should not be judged on whether they finish or get an optimal or even correct answer. You need to know if they write good code that you can understand, and whether they are on a path such that, given a reasonable amount of time on a realistic story, they would finish it and get it correct. If someone has seen the problem before they may get the correct answer, but if they have not seen it they won't know it and shouldn't be expected to get the right answer in an hour.
Anon1096
Majority Element is rated easy because it can be trivially solved with a hashmap in O(N) space and that's enough to pass the question on Leetcode. The O(1) space answer is probably more like a medium.
viccis
Yeah it just depends on whether your interviewer considers that "solved". To test this out, I wrote a one liner in Python (after imports) that solves it with a hashmap (under the hood for Counter, which uses a heap queue to find the most common one):
return Counter(nums).most_common(1)[0][0]
And that's 50th percentile for runtime and memory usage. Doing it with another one-liner that's 87th percentile for time, because it uses built-in Python sorting, but 20th percentile for memory:
return sorted(nums)[len(nums) // 2]
But the interviewer might be looking for the best approach, which beats "100%" of other solutions in runtime per Leetcode's analysis:
m, c = -1, 0
for x in nums:
    if not c:
        m = x
        c = 1
    elif m == x:
        c += 1
    else:
        c -= 1
return m
If I were interviewing, I'd be happy with any of these except maybe the sorted() one, as it's only faster because of the native code doing the sort, which doesn't change that it's O(n log n) time and O(n) space. But I've had interviews where I gave answers that were "correct" to the assumptions and constraints I outlined but they didn't like them because they weren't the one from their rubric. I still remember a Google interview, in which we're supposed to "design to scale to big data", in which they wanted some fiddly array manipulation algorithm like this. I gave one that was O(n log n) but could be done in place with O(1) memory, and the interviewer said it was "incorrect" in favor of a much simpler O(n) one using dicts in Python that was O(n) memory. Had the interviewer specified O(n) memory was fine (not great for "big data" but ok) I would have given him the one liner that did it with dicts lol

I guess my point is that interviewers should be flexible and view it as a dialogue rather than asking for the "right answer". I much prefer "identify the bug in this self contained code snippet and fix it" type problems that can be completed in <15-30 minutes personally, but Leetcode ones can be fine if you choose the right problems for the job.
3vidence
Honestly in day to day programming I find data types & associated APIs are so so much more important than algorithms.
I would rather work with a flexible data type with suboptimal performance than a brittle data type that maybe squeezes out some extra performance.
Your example of in-place array mutation feels like a good example of such a thing. I feel like there should be a category of interviewing questions for "code-safety" not just performance.
roadside_picnic
> The point of these problems is to test your cleverness.
Last round I did at Meta it was clearly to test that you grinded their specific set of problems, over and over again, until you could reproduce them without thinking. It's clear because the interviewers are always a bit surprised when you answer with whatever is not the text-book approach on both leetcode and on the interview guide they studied.
Cleverness is definitely not high on the list of things they're looking for.
btilly
Bottom up dynamic programming algorithms require some cleverness.
All of the ones listed can be solved with a top-down dynamic programming algorithm. Which just means "write recursive solution, add caching to memoize it".
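For the coin change example from the article, that looks something like this (a sketch; the function name and signature are just illustrative):

from functools import lru_cache

def coin_change(coins, amount):
    # Fewest coins summing to amount, or -1 if impossible (top-down DP).
    @lru_cache(maxsize=None)
    def fewest(remaining):
        if remaining == 0:
            return 0
        best = float('inf')
        for c in coins:
            if c <= remaining:
                best = min(best, fewest(remaining - c) + 1)
        return best
    result = fewest(amount)
    return -1 if result == float('inf') else result

The recursion is just the problem statement written down; the cache is what makes it roughly O(amount * len(coins)) instead of exponential.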
For some of these, you can get cleverer. For example the coin change problem is better solved with an A* search.
Still, very few programmers will actually need these algorithms. The top thing we need is to recognize when we accidentally wrote a quadratic algorithm. A quick scan of https://accidentallyquadratic.tumblr.com/ shows that even good people on prominent projects make that mistake on a constant basis. So apparently being able to produce an algorithm on the test, doesn't translate to catching an algorithmic mistake in the wild.
chaboud
When I interview with problem solving problems, the point is to understand how the candidate thinks, communicates, and decomposes problems. Critically, problem solving questions should have ways to progressively increase and decrease difficulty/complexity, so every candidate "gets a win" and no candidate "dunks the ball".
Interviewers learn nothing from an instant epiphany, and they learn next to nothing from someone being stumped.
Unfortunately, this is why we can't have nice things. Problem solving questions in interviews can be immensely useful tools that, sadly, are rarely usefully used.
mjr00
> the point is to understand how the candidate thinks, communicates, and decomposes problems.
100% and it's a shame that over time this has become completely lost knowledge, on both sides of the interview table, and "leetcode" is now seen as an arbitrary rote memorization hurdle/hazing ritual that software engineers have to pass to enter a lucrative FAANG career. Interviewees grind problems until they've memorized every question in the FAANG interview bank, and FAANG interviewers will watch a candidate spit out regurgitated code on a whiteboard in silence, shrug, and say "yep, they used the optimal dynamic programming solution, they pass."
bluGill
If somebody writes the optimal algorithm that should be a negative unless their resume indicates they are writing that algorithm often. The only reason you should know any algorithm well enough to get it right is if your job is implementing the optimal version for every single language. Of course nobody maintains one algorithm in many different languages/libraries (say libc++, python, rust, ada, java - each has different maintainers), so I can safely say the number of people who should be able to implement your clever algorithm is zero. Now if your clever algorithm is in the language standard library (or other library they often use) they should be able to call/use it, though even then I expect them to look up the syntax in most languages.
Peritract
> Critically, problem solving questions should have ways to progressively increase and decrease difficulty/complexity, so every candidate "gets a win" and no candidate "dunks the ball".
Absolutely agree. When I interview, I start with a simple problem and add complexity as they go. Can they write X? Can they combine it with Y? Do they understand how Z is related?
Apocryphon
> the point is to understand how the candidate thinks, communicates, and decomposes problems
Interviewers always say this, but consider: would you endorse a candidate who ultimately is unable to solve the problem you've presented them, even if they think, communicate, and decompose problems well? No interview in this industry prizes those things over getting the answer right.
bluGill
Every interview I know is severely time limited. I don't care if you can solve the problem, so long as you are clearly making progress and have proven you could solve the problem if given longer.
Now, I give you problems I expect to take 20 minutes if you have never seen them before, so you should at least solve one. I have more than once realized someone was stuck on the wrong track and redirection efforts were not getting them onto a good track, so I switched to a different problem, which they were then able to solve. I've also stopped people when they had 6 of 10 tests passing because it was clear they could get the rest passing and I wouldn't learn anything more, so it wasn't worth wasting their time.
In the real world I'm going to give people complex problems that will take days to solve.
StefanBatory
Would a good answer be "I can do it as a constraint problem, but since I guess you are not asking for this, the solution is..." and then proceed as usual?
chaboud
I'd probably stop the candidate, dig into how they'd use constraint-based solvers, and how they might expect that to fall apart. Applicability and judgment are worth way more than raw algorithmic questions.
One way to think about this is:
Is a fresh graduate more likely to provide a solid answer to this than a strategic-thinking seasoned engineer? If so, just be conscious of what your question is actually probing.
And, yes, interview candidates are often shocked when I tell them that I’m fine with them using standard libraries or tools that fit the problem. It’s clear that the valley has turned interviewing into a dominance/superiority model, when it really should be a two-way street.
We have to remember that the candidate is interviewing us, too. I’ve had a couple of interviews as the interviewee where the way the interview was conducted was why I said “no” to an offer (no interest in a counter, just a flat “no longer interested” to the recruiter, and, yes, that surprises recruiters, too).
corimaith
>The point of these problems is to test your cleverness.
No, it's just memorization of 12 or so specific patterns. The stakes are high enough that virtually nobody going in will stake passing on their own inherent problem-solving ability. LeetCode has been so thoroughly gamified that it has lost all utility as a differentiator beyond willingness to prepare.
erikerikson
Given this consider that LeetCode solving is rarely ever part of your work. So then, what are they selecting for with the habit?
cratermoon
Selecting for people like themselves.
bee_rider
Yeah, it tests if the candidate enjoys the programming-adjacent puzzle game of LeetCode, which is a perfectly decent game to play, but it is just a signal.
If somebody grinds LeetCode while hating it, it signals they are really desperate for a job and willing to jump through hoops for you.
If somebody actually enjoys this kind of stuff, that is probably a signal that they are a rare premium nerd and you should hire them. But they probably play Project Euler as well (is that still up?).
If somebody figures out a one-trick to minmax their LeetCode score… I dunno, I guess it means they are aware of the game and want to solve it efficiently. That seems clever to me…
jkubicek
In defense of questions like this, “willingness to prepare” is a significant differentiator
erikerikson
But what is it differentiating? And is it really the best evidence of willingness to prepare? My MSc and BA on the topics, my open source contributions, two decades of industry experience... Aren't those evidence of not only willingness but actual execution of preparation?
bluGill
That they would ask me to prepare for that is a signal as well.
In no case is it a useful signal of whether I can do my job better than someone else. Some people like this type of problem and are good at it anyway, which is a good signal compared to average - but there are also above-average people who don't enjoy this type of problem and so don't practice it. Note that in both cases the people I'm talking about did not memorize the problem and solution.
avgDev
It also means "I don't have money for food, and at this point I am desperate".
tjpnz
That willingness to prepare doesn't reconcile with the realities of parenthood and all of the other responsibilities someone in their thirties may have. Consistently finding that time will be a huge ask, especially if you haven't worked on those problems in a while.
another_twist
No, it's not a measure of cleverness. It's about whether you can break down problems and apply common methods. That's the entire job. It's a learnable skill, and honestly, resisting learning because of personal biases is a red flag in my book.
theflyinghorse
The point is to test whether or not you put in the time to sharpen common patterns and also to test your communication ability
RomanPushkin
- Constraint solvers? That's a nice concept, I heard about this once. However, for the purposes of the interview, let's just write some Python code, I wanna see your way of thinking...
(I think it's almost impossible to convince your interviewer to go with constraint solvers, while the concept itself is great)
aaronblohowiak
import z3
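For the article's coin example it really is about that short (a sketch using z3's Optimize; the denominations and target are made up):

from z3 import Ints, Optimize, sat

n1, n2, n5 = Ints('n1 n2 n5')            # how many of each coin
opt = Optimize()
opt.add(n1 >= 0, n2 >= 0, n5 >= 0)
opt.add(1 * n1 + 2 * n2 + 5 * n5 == 11)  # make change for 11
opt.minimize(n1 + n2 + n5)               # with the fewest coins
if opt.check() == sat:
    m = opt.model()
    print(m[n1], m[n2], m[n5])           # 1 0 2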
hermannj314
Most interviews are based on the premise that if a diabetic can't synthesize their own insulin in their basement, they are somehow cheating at the game of life.
If my wife's blood sugar is high, she takes insulin. If you need to solve a constraint problem, use a constraint solver.
If your company doesn't make and sell constraint solving software, why do you need me to presume that software doesn't exist and invent it from scratch?
mepiethree
It’s explicitly not testing if you can synthesize insulin in a crisis, it’s a general aptitude test for “if we tell you you need to cram this textbook on how to synthesize insulin by next week and then ask you how to do it on a call, can you coherently repeat that back to us?”
carabiner
What?
joelthelion
In defense of coding tests, most people who can't solve simple dynamic programming problems generally turn out to be pretty poor programmers IRL.
At least that's been my experience. I'm sure there are exceptions.
binarymax
A loonnngggg time ago when I was green, and wasn't taught about constraint solving in my State University compsci program, I encountered the problem when trying to help a friend with his idea.
He wanted to make an app to help sports club owners schedule players for the day based on a couple simple rules. I thought this was going to be easy, and failed after not realizing what I was up against. At the time I didn't even know what I didn't know.
I often look back on that as a lesson of my own hubris. And it's helped me a lot when discussing estimates and timelines and expectations.
542458
This might be a dumb question (as I'm not familiar with constraint solvers) but would a linear optimization approach be better? I've used linear optimization for scheduling in the past. The nice thing is that linear optimization handles rule conflicts well, because you just set weights on all your rules and the optimizer will find the "least bad" solution to the conflicts.
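e.g. the "weights on rules" trick is just weighted slack variables; a toy sketch of what I mean (made-up numbers, scipy's linprog as the solver):

from scipy.optimize import linprog

w_a, w_b = 1.0, 3.0  # rule B is 3x more important than rule A
# Variables: [x, slack_a, slack_b]; minimize w_a*slack_a + w_b*slack_b.
c = [0.0, w_a, w_b]
# Rule A: x >= 8, softened to x + slack_a >= 8  ->  -x - slack_a <= -8
# Rule B: x <= 6, softened to x - slack_b <= 6
A_ub = [[-1.0, -1.0, 0.0],
        [1.0, 0.0, -1.0]]
b_ub = [-8.0, 6.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x)  # x ends up at 6: the cheaper rule A absorbs the conflict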
j2kun
This is what major sports leagues use for season scheduling (source: https://mathstodon.xyz/@j2kun/108975072813565989)
atilimcetin
Whenever constraint programming languages come up, you can’t miss mentioning Håkan Kjellerstrand. He’s put together an amazing collection of problems and examples—including plenty for MiniZinc—on his site: https://www.hakank.org/minizinc/
afro88
A little off topic, but I don't know much about greedy algorithms or dynamic programming. I got curious. This conversation was very insightful and now it's very clear in my mind: https://chatgpt.com/share/68c46d0b-8858-8004-aa03-f7ce321988...
krosaen
I find this post interesting independent of the question of whether leetcode problems are a good tool for interviews. It's: here are some kinds of problems constraint solvers are useful for. I can imagine a similar post about non-linear least-squares solvers like Ceres.
smj-edison
Yeah, especially for learning how to use a solver!
> Most constraint solving examples online are puzzles, like Sudoku or "SEND + MORE = MONEY". Solving leetcode problems would be a more interesting demonstration.
He's exactly right about what tutorials are out there for constraint programming (I've messed around with it before, and it was pretty much Sudoku). Having a large body of existing problems to practice against is great.
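(For reference, the SEND + MORE = MONEY one is only a handful of lines with z3's Python bindings - a sketch:)

from z3 import Ints, Distinct, Solver, sat

s, e, n, d, m, o, r, y = digits = Ints('s e n d m o r y')
solver = Solver()
solver.add([0 <= v for v in digits], [v <= 9 for v in digits])
solver.add(Distinct(*digits))
solver.add(s >= 1, m >= 1)  # no leading zeros
solver.add((1000*s + 100*e + 10*n + d) + (1000*m + 100*o + 10*r + e)
           == (10000*m + 1000*o + 100*n + 10*e + y))
if solver.check() == sat:
    print(solver.model())   # 9567 + 1085 = 10652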
gnarlouse
Been working on a calendar scheduling app that uses a constraint solver to auto schedule events based on scheduling constraints (time of day preferences and requirements, recurrence rules), and track goal progress (are you slipping on your desired progress velocity? Get a notification). It’s also a meal planner: from a corpus of thousands of good, healthy recipes, schedule a meal plan that reuses ingredients nearing expiration, models your pantry, estimates grocery prices, meets your nutritional goals. Constraint solvers are black magic.
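The auto-scheduling core of something like this can be surprisingly small (a toy sketch with OR-Tools CP-SAT as one possible solver; a real model also needs durations, recurrence, and soft preferences):

from ortools.sat.python import cp_model

events = ["gym", "deep work", "meal prep"]
model = cp_model.CpModel()
# One-hour slots from 8:00 to 17:00, one start variable per event.
start = {e: model.NewIntVar(8, 17, f"start_{e}") for e in events}
model.Add(start["gym"] <= 10)                # morning requirement as a hard rule
model.Add(start["deep work"] >= 9)           # deep work after 9
model.AddAllDifferent(list(start.values()))  # no two events in the same slot
solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for e in events:
        print(e, "->", solver.Value(start[e]))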
drob518
SAT, SMT, and constraint solvers are criminally underutilized in the software industry. We need more education about what they are, how they work, and what sorts of problems they can solve.
LegionMammal978
At least personally, I've been very underwhelmed by their performance when I've tried using them. Usually past a few dozen variables or so is when I start hitting unacceptable exponential runtimes, especially for problem instances that are unsatisfiable or barely-satisfiable. Maybe their optimizations are well-suited for knapsack problems and other classic OR stuff, but if your problem doesn't fit the mold, then it's very hit-or-miss.
js8
I wish I knew better how to use them for these coding problems, because I agree with GP they're underutilized.
But I think if you have a constraint problem that has an efficient algorithm but chokes a general constraint solver, that should be treated as a bug in the solver. It means that the solver uses bad heuristics somewhere.
LegionMammal978
I'm pretty sure that due to Rice's theorem, etc., any finite set of heuristics will always miss some constraint problems that have an efficient solution. There's very rarely a silver bullet when it comes to generic algorithms.
j2kun
I think this hits the nail on the head: performance is the obstacle, and you can't get good performance without some modeling expertise, which most people don't have.
drob518
Hence my call for more education.
drob518
Well, they aren’t magic. You have to use them correctly and apply them to problems that match how they work. Proving something is unsat is worst case NP. These solvers don’t change that.
LegionMammal978
Of course they aren't magic, but people keep talking about them as if they're perfectly robust and ready-to-use for any problem within their domain. In reality, unless you have lots of experience in how to "use them correctly" (which is not something I think can be taught by rote), you'd be better off restricting their use to precisely the OR/verification problems they're already popular for.
loeg
In what way? They're useful for toy problems like this but they're very slow on larger problems.
drob518
SAT solvers are used daily to generate solutions for problems that have literally millions of variables. So what you said is just wrong on its face. Yes, some talented people can write custom code that solves specific problems faster than a general-purpose solver, particularly for easy special cases of the general problem, but most of the time that results in the programmer recreating the guts of a solver customized to a specific problem. There's sort of a corollary of Greenspun's Tenth Rule that every sufficiently complicated program also contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of a SAT or SMT solver.
sigbottle
I mean, right tool for the right job. We have plenty of formulations and problems (our job has plenty of arbitrarily hard graph algorithms) where 90% of the problem is just a very clever reduction with nice structure.
Then the final 10% is either NP-hard, or we want to add some DSL flexibility, which introduces halting-problem issues. Once you lower it enough, then come the SMT solvers.
tracker1
My biggest problem with leetcode type questions is that you can't ask clarifying questions. My mind just doesn't work like most do, and leetcode to some extent seems to rely on people memorizing leetcode type answers. On a few, there's enough context that I can relate real understanding of the problem to, such as the coin example in the article... for others I've seen there's not enough there for me to "get" the question/assignment.
Because of this, I've just started outright rejecting leetcode/AI interview steps... I'll do homework, shared screen, 1:1, etc, but won't do the above. I tend to fail them about half the time. It only feels worse in instances where I wouldn't even mind the studying on leetcode-type sites if they actually had decent explainers for the questions and working answers when going through them. I know this kind of defeats the challenge aspect, but learning is about 10x harder without it.
It's not a matter of skill, it's just my ability to take in certain types of problems doesn't work well. Without any chance of additional info/questions it's literally a setup to fail.
edit: I'm mostly referring to the use of AI/Automated leetcode type questions as a pre-interview screening. If you haven't seen this type of thing, good for you. I've seen too much of it. I'm fine with relatively hard questions in an actual interview with a real, live person you can talk to and ask clarifying questions.