IT needs more brains, so why is it so bad at getting them?::Open-book exams aren’t nearly open enough
The article argues for a reworked IT education industry in the hopes of a more skilled workforce:
The result would solve the industry’s most pressing need, for good people doing good work, and through expansion into other areas benefit us more than AI will ever manage.
Most IT today exists as a means to support business and commerce. Corporations post absurd profits year over year. They don't need more knowledgeable IT staff. What's "good" for IT industry employers may simply be more staff willing to say "yes, sir" and kick the can down the road. Business doesn't care about efficient systems as long as their systems are profitable.
So why is IT bad at getting brains? Because it is against most leadership's interests. Progress, change, and automation all introduce risk, which can hurt profitability.
If you're not familiar with what IT does, it can also be seen as a money sink, since there's no obvious sign of it preventing things from going wrong. So IT might seem like a department just sitting there wasting money, or, when the company is inevitably hacked, a department you wasted money on for not stopping it in the first place.
Honestly just changing the interview process would be enough to get more people into the business.
Literally yesterday I did a code challenge to track the distance, speed, and maintenance schedules of forklifts in a warehouse, and to predict collisions between them. The job I was applying for was a pretty average SRE role… system design, IaC, CI/CD pipelines, PromQL, etc… How is the code challenge representative of the job in any way?
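For what it's worth, the collision part of that challenge boils down to textbook closest-approach geometry. Here's a rough sketch of the kind of thing they were presumably after (the 2D setup, constant velocities, and 1.5 m radius are all my own assumptions, not the actual spec):

```python
# Sketch of constant-velocity collision prediction for two forklifts.
# All parameters here are illustrative, not from the real challenge.
import math

def predict_collision(p1, v1, p2, v2, radius=1.5):
    """Return (will_collide, time_of_closest_approach) for two forklifts."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]    # relative position
    wx, wy = v2[0] - v1[0], v2[1] - v1[1]    # relative velocity
    ww = wx * wx + wy * wy
    if ww == 0:                               # same velocity: the gap never changes
        return math.hypot(dx, dy) <= radius, 0.0
    t = max(0.0, -(dx * wx + dy * wy) / ww)   # time that minimises separation
    cx, cy = dx + wx * t, dy + wy * t         # separation vector at that time
    return math.hypot(cx, cy) <= radius, t

print(predict_collision((0, 0), (1, 0), (10, 0), (-1, 0)))  # (True, 5.0): head-on
```

Fun maths, but it tells you nothing about whether someone can run pipelines or write PromQL.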
I feel like I need to learn leetcode algorithm patterns just for the interviews… I never need them for the actual jobs I get hired for.
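Case in point, the classic "sliding window" pattern: an O(n) trick you drill purely for interviews. A minimal sketch (the function and test values are just an illustration):

```python
# Sliding window: maximum sum of any k consecutive elements in O(n),
# instead of recomputing each window from scratch in O(n * k).
def max_window_sum(nums: list[int], k: int) -> int:
    window = sum(nums[:k])               # sum of the first window
    best = window
    for i in range(k, len(nums)):
        window += nums[i] - nums[i - k]  # slide: add newest, drop oldest
        best = max(best, window)
    return best

print(max_window_sum([2, 1, 5, 1, 3, 2], 3))  # 9, from the window [5, 1, 3]
```

Neat, genuinely useful once in a blue moon, and yet it decides who gets hired.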
I’ve always wondered if the solution to the hiring fiasco in IT is to have official licenses similar to the way engineers and lawyers have formal credentials.
Most companies do dumb shit like this because it's hard to know whether a candidate is actually qualified or just blowing smoke. Everyone has had that one guy on the team who barely has a clue how to even set up his IDE, let alone code.
The problem with this would be the same as it is with all licenses and certs: the tests don't match real-world practice. The other option is adopting the trades approach and combining that with licensing. Apprentice, journeyman, etc.
I remember failing an interview once because they wanted me to know all sorts of obscure c++ tricks. The kind of stuff that most people skipped over when they read about it because it has almost no use case. Had travelled 200 miles for that interview too.
No idea who they wanted… someone who had a photographic memory to memorise a textbook, maybe?
We tend to give practical tests when interviewing: "go away and write this thing." We're not testing whether they write it, or how they found the solution (Google is there to be used), but the questions they ask about the (deliberately) interpretable spec, and what the code looks like.
Like certifications?
Yea, but more formal and less “sell you a boot camp study course” style.
I understand that even the (law) bar has those courses, but it's also a pretty good filter. Law degree + passing the bar is a solid bare minimum. Then adopt a similar approach to the trades, where you're an apprentice for x years under a master/mentor before you become a journeyman.
The industry has sort of already adopted it but it’s not standardized and it’s not trustworthy. Calling yourself a senior software engineer means almost nothing. It’s the same as “vice president” in financial companies.
I think the trades approach is the way to go. It makes sense as far as training goes, imo. And Jesus Christ, something needs to be done at this point.
Leetcode style interviews are good for showing off that you’re a smart and flexible employee who can solve novel problems.
The issue is that most companies don't have any novel problems; they just need quiet competence. But they want the best/smartest anyway.
I’m just happy that the “rockstar developer” era died.
Speaking from years of experience in IT (nearly thirty of them), I can give my own unscientific opinion: because people put too much faith in certifications, and refuse to do any on-the-job training. You can have five of the six skills listed in a job ad, but if you don’t have that all-important sixth one, your application will get round-filed. It doesn’t matter if it would be a simple matter to train a tech on that one thing. Businesses want phoenixes for chicken scratch.
Certifications are a boondoggle, and have been for years. The tests have been rigged in such a way that candidates need to take them again and again to pass, and they get charged a fee for each attempt. The test itself is a revenue source for companies. The “prestige” those certifications bring for the companies that front them is based on their difficulty, not on their relevance or fairness.
I once attended a Microsoft certification “boot camp.” We all worked our asses off, studied the material, and most of us passed at least one test. Nobody passed all three exams except for one person. I had noticed that person using test prep software with a logo that didn’t match the stuff we’d been given. It looked like an orange DNA helix.
After the last test, a bunch of us milled around outside the building, and I asked the guy who passed how he made it through. He ran for his truck so fast that there was practically a dust cloud behind him. That’s when I decided to look up that logo on Google.
He’d been using a “brain dump” service. For those unaware of what a “brain dump” is, it’s when a third-party company sends a bunch of people to intentionally fail the exams over and over. During each attempt, those people memorize the test questions. Then the company has their plants aggregate all the possible questions in an exam pool and the correct answers to them. In effect, it’s a copy of the whole test.
Brain dumps are extremely common in IT. When I worked at VMware, many of our own employees used them to pass certification exams that were mandatory for continued employment. Those people had been doing their jobs for years. They just needed a bogus piece of virtual paper to prove it to our executive leadership. It was all about appearances.
Why is tech struggling for qualified workers?
Because it refuses to acknowledge them.
Fascinating insight about those brain dump services.
Thanks for sharing your experiences. Massive respect to you for doing 30 years in this silly industry!
This seems to be a common problem with industries that just can’t find talent. “Qualified” is used in place of “they meet our desires perfectly.”
It's even the same idea as absurd incel dating ideals. Sure, the issue may be the candidates; but maybe, just maybe, the issue is that you need to look in the mirror and ask yourself whether you're being (un)reasonable.
The brain dump docs are real from my first-hand experience.
IT needs better recruiters and higher pay.
IT requires you to constantly learn new things to stay relevant. I don’t know if any other industry requires this as much as IT.
For me, I think that feeling of needing to learn new things comes not from new tech or tooling, but from needing to solve different problems all the time. I would say there is definitely a fast-moving, hype-driven churn in web development (particularly frontend development!). This really does wear me down. But outside of this, in IT you're almost always interacting with stuff that has been the same for decades.
Off the top of my head…
Networking. From ethernet and wifi, up to TCP/IP, packet switching, and protocols like HTTP.
Operating systems. Vastly dominated by Windows and Linux. UNIX dates back to the 70s, and Windows on the NT kernel is no spring chicken either.
Hardware. There have been amazing developments over the years. But incredibly this has been mostly transparent to IT workers.
Programming. Check The Top Programming Languages 2023. Python, Java, C: decades old.
User interfaces. Desktop GUI principles are unchanged. iOS and Android are almost 15 years old now.
Dealing with public cloud infrastructure, for example, you're still dealing with datacentres and servers. Instead of connecting to stuff over a serial console, you're getting the same data over VNC over HTTP. When you ask for 50 database servers, you make an HTTP request to some service. You wait, and you get a cluster of MySQL or PostgreSQL (written in C!) running on a UNIX-like OS (written in C!), which you interact with using SQL (almost 50 years old now?) over TCP/IP.
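To make that concrete: here's a hand-rolled HTTP/1.1 request over a raw TCP socket (my own toy example, using example.com as a stand-in). The bytes on the wire are the same ones we've been sending since the late 90s:

```python
# A bare HTTP/1.1 request over a plain TCP socket. HTTP/1.1 dates to
# 1997, and this exchange would have looked identical back then.
import socket

with socket.create_connection(("example.com", 80)) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.split(b"\r\n")[0].decode())  # e.g. "HTTP/1.1 200 OK"
```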
As I spend more time in the industry I am constantly learning. But this comes more from me wanting to, or needing to, dig deeper.
This is also my experience.
Whilst one can viably move around in IT to stay near the bleeding edge (which itself drifts from area to area over timeframes of a decade or so), most of what's done in IT is pretty much the same old same old, maybe with bigger tech stacks. The expectations for fancy features keep going up while the timeframes stay the same (for example, integration with remote systems via networking used to be a pretty big deal, but nowadays it's very much expected as the norm in plenty of situations). So you end up with ever-larger frameworks and ever-thicker stacks of external dependencies: 20 or 30 years ago it was normal to manually manage the entire hierarchy of library dependencies, whilst nowadays you pull a clean project from source control and spend the next half an hour waiting for the dependencies to be downloaded by whatever dependency-management system the project's build framework (itself much more complex) uses.
Science, both in pompous universities and in R&D.
Engineering, I’d imagine.
Gravity 2.0 released. 9.81 deprecated and considered an anti-pattern.
Because people are not so interested in reinventing the wheel a thousand times when there could be just 3 optimal open source solutions.
Also, many products are plain useless or even harmful to society, such as mundane, non-educational, distracting, addictive mobile games.
There are large, “modern” countries in this world where some politicians have learned that only the uneducated will vote for them. So they put a lot of effort into destroying the educational system, especially for the poor. The rich can always afford a private school, or at least a tutor.
Judging from the AI-generated picture above I assume it is because IT is an undead nightmare hellscape where you are shackled to ancient technology that sucks your life-blood out of you until you inevitably fuse into it and become part of the unending doom machine that is late-stage capitalism
Certifications vs. real-world experience. Hoodies are better than suits. The pioneers didn't even have one, and yet here we are.