In defense of shallow technical knowledge
74 comments
· May 25, 2025 · al_borland
wagwang
Universities in the US are a giant clusterfuck because they confuse themselves between centers of status/prestige, a place for learning/training, and a semi-professional sports team.
jaccola
Ironically, I think the biggest dissonance is between two terms you grouped together: "learning/training".
What universities were (and still are) really great at is being a place for intense learning. What 99% of their customers now want is vocational training to get a well paying job.
Exactly because of the status/prestige (+ government incentives + being very far from a free market), customers that don't really want what the university is selling are attending in droves.
Henchman21
Related to this is the death of vocational studies at the high school level. EVERYTHING is "college prep" and there is no other course. Huge numbers of people could avoid universities if we offered better vocational training.
gota
Of varying importance for different groups/cohorts: going to college because it is the rite of passage into adulthood.
echelon
> What universities were (and still are) really great at is being a place for intense learning.
Are they? A lot of professors hate and/or are bad at teaching. Especially at R1 research universities.
There are lots of problems, specifically for undergrads:
- Large class sizes
- Professors that don't like teaching
- Professors that can't teach well
- Professors that hand off teaching to TAs, RAs, doctoral students, etc.
- Poor feedback on homework, essays, projects, and exams
- Unnecessary classes
- Outdated classes and curriculum
- Sports programs that serve as recruitment and distraction that are orthogonal to learning
- Admin structures that care more about facilities, growth, and recruitment than learning and research
- Systems that are "too big to fail"
- Student loans that are disconnected from bankruptcy, which feeds a recursively growing monster in such a way that it isn't exposed to evolutionary pressures. There is no risk, so malinvestment doesn't bear consequences.
> What 99% of their customers now want is vocational training to get a well paying job.
You can want a curriculum rich in theory and the vocational training for a well-paying job. The problem is that universities are full of perverse incentives - the admin and faculty are at odds, and often even the faculty itself isn't aligned with teaching.
It's a weird org structure steeped in tradition, nostalgia from alumni, and a faculty tenure system that doesn't always reward the right things.
LarsDu88
When America was founded, universities were basically Bible colleges.
With the Enlightenment came Enlightenment values. During this time, the study of Greek and Latin was practically standard.
With the coming of industrialization, many adopted the German model of education and became glorified trade schools for the industrial age, churning out classics majors and engineers in equal measure.
Post WW2 with Vannevar Bush's influence, American universities became institutions of research and a crucial part of the military industrial complex.
Finally, with the advent of television, college football became an immensely profitable source of funding for many colleges and universities, as well as a huge draw for alumni donations.
You say Universities are a clusterfuck, but in reality they have simply evolved with the times, and hence carry a lot of cultural baggage. I don't think that's a bad thing.
wagwang
It's time to be unburdened by what has been and to sharply divide the vocational-school track from the higher learning/research track, like most other first-world countries do. It's very cruel, imo, to send mediocre students (who likely won't produce any meaningful research) down research paths while they drown in debt. The state is complicit in this too by guaranteeing the loans.
And btw, I don't deny that there is value in intermingling vocational studies and cutting edge research - it very much makes sense for some STEM disciplines, but that's the exception not the rule imo.
sydbarrett74
Universities are the epitome of institutional inertia. That is a bad thing.
senderista
Not sure why you're getting downvotes; I think this is quite accurate.
riskassessment
They did not mention research, which is like the defining characteristic of a university. They seem to be condescendingly grouping research into the category of "status/prestige".
BugheadTorpeda6
I don't see why being interested in academic subjects like that has to be driven via universities. In fact, they might even get in the way of developing a genuine interest in topics outside of your chosen major.
godelski
Ideally universities were where you could go to research whatever the fuck you wanted.
Historically there was no publish or perish paradigm and you could do this. People did get kicked out for years of no production but it wasn't uncommon for researchers to take a long time to publish anything. Usually it was "hey, just show us what you've been doing". Getting researchers to communicate to the rest of the community. The problem wasn't people sitting around doing nothing, it was them being caught up in their work and not sharing their progress.
Now, things got flipped upside-down. You get fired if you don't publish fast enough. We missed the reason we started measuring publication rates in the first place.
So now we have the inverse problem. People are trying to publish too early. It compounded though. We now changed the peer-review process. That used to be you publish and then peers... review... Responding with papers of their own and such. Journals were less picky, mostly rejecting works for major mistakes or plagiarism. Other than that... well... you can't really verify the correctness of a paper just by reading it... The purpose of journals was that we didn't have the internet and it would cost a lot of money to send every paper to every library. Before, you'd literally just take your "pre-print" and put it in a section of the library at your university where your university peers would make comments. Now we don't talk to the researcher who's next door.
And now we have this novelty criterion, which is completely subjective, compounded by how obvious something seems only after hearing it, and highly dependent on how well it is communicated. Frankly, if I communicate something really well, it should seem obvious and you should think that you could have come up with it yourself. That's because you understand it! But now that is a frequent reason to reject. We think we can predict the impact of work, but there are tons of examples where highly influential works got rejected for lack of novelty or for seeming trivial. Hell, a paper that won the Nobel prize in economics got rejected multiple times for these reasons, getting called both "too obvious" AND "obviously false"[0]. We're just really bad at this, lol. Those rejections don't make papers better; they just waste time resubmitting and rewriting to figure out how to win at some slot machine.
Academia still works and is still effective, but that doesn't mean there aren't issues. Certainly there are often fewer pressures in an academic setting to do research than at a company. The Uni only cares that you produce papers. Even if it is a misaligned metric, the incentive is to just pick easier problems. The business needs you to make something with __short term business value__. Bigger companies can give more freedom, but the goals are different.
Really, the problem can be seen through the Technology Readiness Level chart[1]. Businesses rarely want to research anything below TRL 5; really, you want to be at 7 or 8. The problem with academia is that the incentives push you toward TRL 3 or 4, which leaves TRL 1 and 2 vacant. It still happens, just less of it. Tenure can't fix this if you've still got grad students who must publish or perish.
[0] https://en.wikipedia.org/wiki/The_Market_for_Lemons#Critical...
[1] https://en.wikipedia.org/wiki/Technology_readiness_level
magicalhippo
Recalling key words and where or how to find out more has been my "superpower" when it comes to programming.
I can do this since I love reading about all sorts of random topics, a lot of which pop up here, and while I seldom recall the details, I can recall enough to know when it might be relevant and how to find it again.
Sooo many diverse topics have suddenly cropped up at work, where everyone else is fairly stumped but I can say "I'm sure I've heard of this before" and within a few minutes have found the resource which details the solution, or something to that effect.
Thus I too prefer getting blasted with info when starting a new job or new project, so I can recall the relevant key words when they pop up.
jiggawatts
Are you worried that this ability will become less valuable because the AIs are also wide but shallow?
I noticed colleagues calling me a lot less with questions that only I can answer. Several admitted to me that they now use AI for the same kind of “find me some obscure vaguely specified thing”. It is one of the few things the AIs do really well.
al_borland
While I’m sure AI will continue to improve, it still suffers from the same issue as the search engine, which is that you need to know enough to ask the right questions.
I have run into this countless times when using AI. I ask for ideas on how to solve a problem, and it seems to miss a really good solution. I bring it up, and it then says something like, "oh yeah, that is much better." On the flip side, if I lead it with some ideas, it has trouble breaking free of them and tells me I already have the best idea.
If the topics coming together are seemingly unrelated, it takes a good prompt to get the AI to link those ideas on the path toward a solution.
Just today I was asking Copilot about different ideas on how to structure a new project. I laid out some pseudocode with my initial idea, and it gave it back to me with a more complex syntax. I asked why, and whether there were any advantages to the way it did it, and then it told me no, my way was better, cleaner, and the preferred way for the language. Though after pushing it some more it did suggest another alternative, which it tried to dismiss as worse, until I explained why it would actually be better. As far as I've seen, at least with Copilot (which is all I'm currently allowed to use at work), it's no match for a person with some experience and knowledge when it comes to more abstract thinking.
magicalhippo
I haven't thought hard about it yet, but perhaps I should be a bit worried.
My other "superpower" is digging into documentation and figuring out how to actually use stuff I've never seen before. Another thing that might be under threat from AIs soon. I've certainly used AIs for my own hobby projects in this regard, sometimes with good results, so it's surely only a matter of time.
Though at least at my current job, my most valuable skill is being able to understand the customer's needs, and being able to come up with solutions that solve their problems well without breaking the bank. Part of that is finding out how to best utilize existing code, which means I like to work on varied parts of the code base. Part of it is probing the customers to understand how they operate, which limitations they have and so on.
I think part of that is thanks to the same drive that led me to all these obscure topics, which drives me to want to understand the existing code and the customer's domain, which in turn puts me in a much better position to help and guide our customers to a good solution.
Not sure if AI's will do that too soon, time will tell.
godelski
This is actually where I get the biggest benefits from AI. It's not really good at making those connections itself. But it is good at fuzzy searching. By that I mean I can describe things but not know the name or proper terms for that. So I describe to an LLM, it can figure that out, and then I go off and search for depth and nuance (don't use the LLM for that...). This is something that traditional search is really bad at. But LLMs are also really bad at traditional search.
I'm not sure why we aren't trying to make this more complementary. I really don't want my LLM to be direct search just as I don't want my direct search to be an LLM. Frankly, the context of what I'm looking for matters. Don't make it an end-to-end thing. Just give me a fucking toggle switch for direct search or fuzzy search (hell, throw in an "I don't know" and make it the default)
I'm not worried about the AI replacing me here because the "superpower" (and I assume the gp's) here isn't having broad awareness. It is the ability to abstract and connect seemingly unconnected things. Once the AI is there, it's going to have already been able to replace a lot more stuff. The "superpower" is creativity combined with broad knowledge-base. LLMs seem more complementary here than replacing.
KoolKat23
Agreed. I can see fundamental flaws in many experienced people's logic and knowledge; their suggestions are often missing key fundamentals. Often their ideas still work, but they're either reinventing the wheel or overlooking something. Many things to them are relative, whereas they should be absolute (benefiting from past human learnings).
I mean it's on full display with social media, people these days are willing to chime in on things they have no understanding of and come to the wrong conclusions.
godelski
We're biased towards simplicity. But unfortunately, when you get better at things the small details and subtleties become more important. Literally by definition complexity increases. You can no longer ignore them and improve. Low order approximations will only get you so far.
The problem is we make these low order approximations, recognize that they (ideally) help and congratulate ourselves. It's just a matter of stopping too early. You see people say "don't let perfection get in the way of good enough." I don't think perfection is usually the issue, rather a disagreement about what's good enough. So sayings like that just become thought terminating cliches[0].
[0] https://en.wikipedia.org/wiki/Thought-terminating_clich%C3%A...
wslh
> I’ve seen more and more people dismiss the idea of universities, and more generally, dismissing the idea of learning anything that isn’t explicitly needed for the career they’re studying for. This always felt like a huge mistake to me.
As usual, it's not black and white: there's no single answer that fits everyone or every field. That said, I'll give an example from computer science where I've seen many people struggle if they haven't taken (and passed) a course on operating systems: topics like race conditions, mutexes, and concurrency.
While these aren't especially difficult concepts, they're not inherently tied to knowing a specific programming language. They transcend language syntax, even though some languages offer syntax for handling them. The problem I often see is twofold: either developers don't apply locking mechanisms at all (leading to unsafe behavior), or they use them excessively or incorrectly, resulting in deadlocks or significant performance issues. Concurrency can be genuinely hard, but I'm talking here about the basics.
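As a minimal sketch of the basic failure mode (plain Python standard library; the class and function names are just illustrative): an unguarded `count += 1` is a read-modify-write, so concurrent threads can interleave and lose updates, while a mutex makes the whole sequence atomic.

```python
import threading

ITERS = 100_000
THREADS = 4

class UnsafeCounter:
    """No lock: concurrent "count += 1" calls can interleave mid-update,
    so the final total can come up short of ITERS * THREADS."""
    def __init__(self):
        self.count = 0

    def increment(self):
        self.count += 1  # read, add, write back: not atomic

class SafeCounter:
    """Same counter, but the read-modify-write is guarded by a mutex."""
    def __init__(self):
        self.count = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # only one thread inside at a time
            self.count += 1

def hammer(counter):
    """Run THREADS threads, each incrementing ITERS times."""
    def work():
        for _ in range(ITERS):
            counter.increment()
    threads = [threading.Thread(target=work) for _ in range(THREADS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter.count

print(hammer(SafeCounter()))  # 400000: no updates lost
```

The other half of the comment's point, deadlock from excessive or incorrect locking, usually comes from acquiring multiple locks in inconsistent orders; the standard shallow-knowledge fix is to always acquire locks in one global order and hold them as briefly as possible.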
godelski
Honestly, this has confused me too [0]. Replies to your comment also seem to highlight this, being overly defensive of a position rejecting academia. Don't get me wrong, it's got lots of issues (I'm quite vocal about this too, even with my name associated), but they also do a lot of good.
What's most baffling to me is the rejection of research and theory (depth in knowledge). Claiming that the work isn't impactful. But that's like saying the ground you stand on doesn't matter...
I'm absolutely astounded this is such a common opinion among programmers and CS people. We're literally building the largest companies in the world and bringing about the information revolution and AI revolution on technology that isn't even 100 years old. It's rapidly getting better because of research and we're not waiting for a hundred years of return on investment here.
It's anti-intellectualism. Often spewed by those trying to prove their own genius, demonstrating the opposite. CS of all people should be able to recognize how complex even the simplest things are. For fuck's sake, we can't even get timezones right lol. We need to balance depth, not ignore it or reject it (I don't think the author argued that btw)
It feels excessively myopic. And honestly, the major problem with academia is the same myopia!
| How do you manage genius?
You don't
- Mervin Kelly (Bell Labs)
| What I would like to see is thousands of computer scientists let loose to do whatever they want. That's what really advances the field.
- Donald Knuth
I could quote a hundred more from a hundred fields. I think we have this weird image in our heads that researchers do nothing and if left to their own devices will just waste time and money. I write with my pocket computer that sends signals across the world and into space, passing through machines moving so fast their clocks disagree. Our science isn't taking centuries to benefit from. It rarely ever took decades.
Yet historically most science was done by the rich who had free time. Sure, we're moving faster now but we also have several orders of magnitude more scientists. Our continued growth doesn't mean we've become more efficient.
We seem to be really bad at attributing the causes of success. We fixate on those at the end of a long chain of work. I mean, even NVIDIA depends on TSMC, as do Apple, Google, Microsoft, and others. And TSMC is far from the root. I'm not trying to say how the economics should all fall out, but it's at least a helpful illustrative target for looking at the biases in how we think.
al_borland
> What I would like to see is thousands of computer scientists let loose to do whatever they want.
I had a boss who let me do this for a while. He just told me to do whatever I wanted that would help the team. He didn't talk to me for 2 years after that. For the first few weeks I was kind of stressing to find what to do and show some results, but after that the boredom set in, and that's when things took off. It was the most productive I've ever been. I was regularly working 12+ hour days, because I was enjoying what I was working on. After 2 years I had so many projects and so much stuff that they built a whole team around what I was doing to spread the load out a little. That actually helped me get bored again, so the ideas started flowing again. Those were the good ole days.
A lot of what I did started as research, then I applied what I learned. It was a nice balance to keep things interesting, rather than being in research mode or build mode all the time.
godelski
I think the result is unsurprising when you think about it for a bit. Though non-obvious at first!
Most people want to work. They think "hey, I'm here, might as well do something." When we're talking about experts in a field (academic or work), usually what interests them the most is the things that matter the most. Giving free time to "play" allows for these larger challenges to be solved. Things that you could never pitch to a manager because it's highly technical, hard to measure, and difficult to prove. But expertise tends to fill in those gaps.
Obviously you can't and shouldn't do this with everyone. Juniors shouldn't have completely free rein. They need some freedom to be able to learn this process, but much more hand-holding. But a senior? That's a position with high levels of trust. They should know the core business, and you're literally hiring them to be an expert, right? And of course there are people that just want a paycheck. I think a surprising number of them will still work regardless, but maybe not as much or as effectively. Certainly, micromanaging will not get these people to do more work, and you risk just becoming overburdened with people in administrative positions.
Usually, you can sniff out the people who should be given more free rein. You don't have to understand the technical details, you only have to detect passion. Some people will fool you, but passion is a pretty good litmus test. There's no globally optimal solution here, so we have to accept some losses. That doesn't prevent us from trying to minimize the loss, but I think we get overly concerned with the losses that are easy to detect. Removing those often just makes your losses harder to detect, not non-existent. It's like the survivorship bias problem: you can't measure the hits on the planes that don't make it back. In our case, the losses come through employees (including managers) metric-hacking. Frankly, we want our losses to be visible, because that makes them treatable.
palmotea
I think the title is a little, I don't know, trollish? Unnecessarily controversial?, but I think the insight is true: it's very good to have some knowledge about things outside of "your area," instead of being too hyperfocused.
However, I think that's a broader thing than just "technical" knowledge: you should know a little about what your customers do, what your manager does, what the role of systems peripheral to yours is, etc.
dfxm12
On first pass, it seems vague enough to border on click bait. Shallow compared to what? No knowledge? Adept knowledge? Expert knowledge? Is the argument we should intentionally stop at "shallow"? Why isn't this clear?
But, with the added context that it's from a personal blog, we should give the benefit of the doubt that the author is just not good at writing headlines and give the article a shot on its own merits...
hnthrow90348765
The depth of knowledge required to work on something tends to scale with how much performance is needed, or how large the impact will be. I feel this should be obvious, but maybe the reason we're having to defend shallow technical knowledge here might be because we're getting pushed to understand things deeply regardless of the performance/impact of the actual jobs.
I also think it should be fairly obvious that some jobs can be accomplished by only having shallow or ad-hoc knowledge because they work on low performance/impact, but are still needed by someone, and thus require an employee.
What has not been obvious is why we can't differentiate roles that require deep vs. shallow knowledge officially, because there is still quite a lot of ambiguity in the actual work demands of "Software Engineer" (or "Software Developer") which makes this kind of defense in the OP necessary.
mlinhares
Most of the time you should be working on both, sometimes you do shallow work, sometimes you do deep work and you naturally build a deep understanding of the pieces of the stack you mostly work with.
I find it hard to believe anyone does serious development work without some deep understanding of a piece of what they work on, the folks I've met that did that didn't last multiple review cycles.
harrall
I think it's critical to have both (1) actual experience working with something and (2) then reading about how it works. You gain natural intuition when you have done both.
When I'm listening to someone suggest an idea, you can tell
- if they're just working off something that they heard about (i.e. let's implement Redis, I heard it was fast),
- if they've done it before but don't really know how it works (i.e. MongoDB worked for me once, don't need to consider Postgres),
- or if they did both
lubujackson
Giving yourself breadcrumbs to knowledge is a fantastic life skill... but one that may fall (even more) out of favor with AI replacing its value. If I suddenly need to add a caching layer and have learned nothing about it, I can ask AI to give me an overview and recommend the best solution for my codebase and situation. Yes, you may miss some subtlety and have to flail around to ask the right questions, but there is no doubt the value of "shallow technical knowledge" is less important.
What will remain important or grow in importance is general curiosity. Connecting completely disparate ideas or ways of thinking will lead you to new creative thoughts or solutions that AI would never produce because everyone else is working from the same standard ideas.
I was an English major in college and took classes in politics, philosophy, math, language, etc. based on personal interest. And I ended up as an engineer (with my trusty CS minor). I've met several developers who have had a similar background, and they tend to become the most well-rounded and business-aware ones on the team. I worry that this shift to higher-cost/higher-stakes/higher-competition education is making this approach to learning feel untenable, and my approach of 20 years ago comes across as totally irresponsible. But I would argue American education is leading to a factory approach at exactly the time when "structured thinking" is being fully replaced by AI. What is the value of crushing leetcode nowadays? Better to have a dev who has some intuition as to why people aren't clicking that new button.
tetha
This is something I am observing at work, teaching a few people, and seeing in some other people outside of work: it is good to understand how your stuff touches adjacent stuff, and how adjacent things touch your stuff. This eventually enables communication across layers and teams as well.
To pick up on one of his examples: a few people at work understand Postgres very, very well. But some of them have trouble discussing topics with developers, because they have no knowledge of how application servers interact with Postgres (usually via some pooling, sometimes not), how different kinds of applications have different query patterns (think REST-based applications that are heavily indexed for low-quantity retrievals vs. ETL-based applications), and so on. I can't write a production-ready app in Rails, Spring, or Django right now, or a data-analysis system in Flink, Spark, or whatever, but I tend to have an idea of what the dev needs to do.
On the flip side, if you have a motivated customer or provider, I find it very valuable to spend some time showing and teaching them how our systems want to be touched, one way or another. Some "idle" or "non-productive" time with some senior-ish devs, just sharing ideas and knowledge about how our Postgres functions, somewhat unintuitive concepts like index selectivity, and how it wants to be worked with at a somewhat shallow level, has paid off a lot at work.
Suddenly people ask good questions about the Postgres ecosystem before starting their projects, so they don't have to spend time building a worse version of a Postgres extension inside their application. How silly is that.
blindriver
The problem with shallow technical knowledge and, even worse, talking with confidence like the author does, is that it can propagate misinformation and lose a ton of nuance.
For example, the author talks very confidently about indexes, and makes a few conclusions, but they aren't as correct as his confidence suggests.
> That an index is only useful if it matches the actual query your application is making. If you index on “name plus email” and then start querying on “name plus address”, your index won’t be used and you’re back to the full table scan described in (1)
Not true. If you have single-column indexes on both name and email, the planner could combine the two indexes, though not as efficiently as a single two-column index. And if you query "name plus email" and the index is on "name, email, age", it can still use that index.
> That indexes massively speed up reads, but must slow down database writes and updates, because each change must also be made again in the index “dictionary”
Must? No. The performance hit might be imperceptible and not material at all. If you have a ton of indexes, sure, but not if you have a reasonable number.
Shallow technical knowledge is fine but you should also have the humility to acknowledge that when you're dispensing said shallow knowledge. It can also lead to pretty bad engineering decisions if you think your shallow knowledge is enough.
sgarland
For that matter, if you index `(name, email)` and query with `name` and `address` as the predicates, unless there's a better option or the table is tiny, there's an excellent chance the planner will use that index to narrow down the initial result set before filtering.
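This leading-column behavior is easy to check with SQLite, which ships with Python (plan wording varies by SQLite version, and other engines make their own choices, so treat this as a sketch of the general pattern, not a guarantee for your database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT, address TEXT)")
conn.execute("CREATE INDEX idx_name_email ON users (name, email)")
# A few rows so the planner has something to look at.
conn.executemany(
    "INSERT INTO users VALUES (?, ?, ?)",
    [(f"n{i}", f"e{i}", f"a{i}") for i in range(1000)],
)

def plan(where):
    """Return the query plan for a SELECT with the given WHERE clause."""
    rows = conn.execute(
        f"EXPLAIN QUERY PLAN SELECT * FROM users WHERE {where}"
    ).fetchall()
    return " ".join(str(r) for r in rows)

# Predicates match the index columns exactly: uses idx_name_email.
print(plan("name = 'n1' AND email = 'e1'"))
# name + address: the leading column still lets the planner use the
# index to narrow rows before filtering on address.
print(plan("name = 'n1' AND address = 'a1'"))
# address alone matches no index prefix, so this is a full table scan.
print(plan("address = 'a1'"))
```

The same prefix rule applies in Postgres and MySQL multicolumn B-tree indexes: equality on the leading column(s) can use the index even when the trailing predicate cannot.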
tibbar
> Not true. If you have single column indexes on both name and email, it could use the two indexes, though not as efficient as a single two-column index. If you query "name plus email" and the index is "name, email, age" then it could use the index.
See, there are databases that implement clever optimizations like this, but those are going to vary widely by database and you would need some domain expertise with that system to know if such optimizations are working. By contrast, this mental model does help you ensure that you can create indexes that are actually helpful in the vast majority of databases.
So I think the author's mental model is working out pretty well for him here, honestly.
sgarland
MySQL and Postgres both support it; that's a huge percentage of what most devs are ever going to encounter. MSSQL and Oracle may do so as well, but I'm not familiar with those beyond some trivial usage.
More to the point, this lack of knowledge will almost certainly drive people to over-index, which harms performance.
blindriver
The point is that there's misinformation in the things he's saying. There's a level of confidence that exceeds the value of the information he's disseminating. If he were the lead engineer on a project, would he make bad decisions because of stuff like this? My guess is yes.
tibbar
It's definitely an overly-broad generalization. But these mental models would still improve how many product engineers work with databases, at the expense of a very simple explanation.
I think the interesting question is like, if I have X amount of time and mental bandwidth to learn about a technology, what's the most helpful lossy compression of concepts that fits?
nico
> requires having reliable shallow intuitions about how things work
This is very insightful and applies to many fields, maybe especially within STEM.
Like for example the water flow analogy of current. It’s a great analogy that works to a great degree to explain and model a lot of things about electricity, but eventually breaks down
For 99%+ of the people and cases, the approximate analogy is perfectly useful
tibbar
I think the distinction here is between "I understand that X is a solution to Y", vs. "I understand that X is BASICALLY a mashup of A and B according to scheme C." A lot of times, people reach for X when they have a vaguely Y-ish problem, when that's actually inefficient or overkill, and/or you could easily validate if X would actually work by manually trying out A and B by hand.
In addition, there will be times when X cannot directly solve the version of Y you have, but there are simple ways to tweak A or B such that now you do have a solution to the problem. So you can become much, much more effective at solving Y-like problems by understanding the building blocks behind standard solutions.
4ndrewl
Does the description of database indexes even count as _shallow_ knowledge? Sounds like a dictionary description with an example. Not even table stakes to warrant your inclusion in any conversation on the topic.
I know that some birds migrate depending on the season and that they fly in certain formations for efficiency. I'd never, ever think I could have any serious conversation with a biologist or ornithologist.
tibbar
I think you underestimate how abysmal many product developers' understanding of databases is:-). It's less about making people respectable experts, more about teaching basic safety principles so they won't do too much damage...
exiguus
If you are developing (technical) solutions, I would say that shallow knowledge is essential, as long as it allows you to make decisions.
o3b
With the pace of technological progress and the amount of new development in computer science, it's quite easy today to remain at shallow knowledge, which can still be OK, accepted, and even further progress itself. At the end of the day, one of the key points of progress is to make difficult things easier for a larger audience. This introduces abstraction layers, one on top of the other. Hence you may not know how the KV cache works in the attention mechanism, but you might still be able to train a super useful AI model.
However, I think those unique engineers with vertical and deep knowledge in a tech stack (e.g. C, Java, the math under NNs) are still very much needed in the world, because they are capable of building and repairing the fundamentals that everything else gets built upon. So, if you are interested in such a fundamental stack, hack it, crash it; it won't be wasted time and the world will need you :)
Over the last decade or so I’ve seen more and more people dismiss the idea of universities, and more generally, dismissing the idea of learning anything that isn’t explicitly needed for the career they’re studying for. This always felt like a huge mistake to me.
Universities have their problems, but getting students to see the value in subjects on the fringe, or completely outside, of their primary field of study is not one of them. These are the places new and novel solutions are born. Even if someone isn’t an expert, knowing enough to bring in an expert and facilitating the conversation can pay dividends.
I was once tasked with getting a new team up to speed quickly at a new site we were standing up. The manager at the time wanted to forgo training entirely and just let them figure it out, in the name of speed. I dug my heels in and still ran everyone through it. For some of it, the pace was extremely fast, and there was no way they were going to absorb it all. However, I wanted them to at least hear it, so if something came up, they may not know what to do, but they would hopefully at least know enough to ask, so we could then dive deeper and show them the right way. The company had its own way of doing almost everything, so someone doing what they thought was right based on previous experience often led to a mess.