If you’re an information security administrator/analyst with average skills and little interest in the field, your days are numbered.
Contrary to what many believe, the field of information security won’t always be like this, i.e. an open, festering wound. I know that’s blasphemous to say — especially coming from a security guy — but it’s true. The only reason so many low to mid-level security professionals have jobs today is because absolutely no security considerations were made when the Internet (and the systems that connect to it) was first built.
Unfortunately, we’re still using that same first-generation technology today, and that’s why we’re up to our necks in trashware. Once that changes, however, the endless job postings for the average security pro will come to an abrupt end.
Information Technology today is like a high-rise construction site with nothing but wet cardboard for building materials. For nails and bolts we use bundles of rotten toothpicks tied together with some twine. We then proceed to erect these massive skyscrapers and then wonder why they topple so easily in the wind and rain.
Though most will tell you otherwise, it’s not because the wind and rain are unstoppable forces. It’s not as if buildings can’t be built to withstand the elements. No, the problem is in the building materials and schematics, not the wind and rain.
Analogies aside, systems today are so poorly put together that my fascination with penetration testing has been severely damaged. I still get a short rush when I break into a Unix or Windows system and get root/admin, but it’s immediately squashed by a voice that says, “Great, someone left the door wide open and you walked through it. So what? You didn’t even find the door yourself.”
Cracking systems today isn’t glamorous or overly difficult; in fact, it’s often rather trivial. Some smart guy finds, through meticulous testing, a gaping hole in the diseased cavity we call “IT infrastructure,” and then everyone passes out color-coded directions on how to take advantage of the issue. I liken it to pushing down a 4-year-old child.
Nice. Now the poor kid’s sitting there with a skinned knee. What now? Throw your arms up in triumph? I think not. It was just a little kid. The only person who deserves any credit is the guy who tipped you off in the first place, i.e. the researcher who found the hole.
The Building Blocks Of Failure
It’s not hard to map out what has led to cracking being so easy. Listed below are just a few of the factors that make breaking into today’s systems more like fighting with children than breaking through fortified defenses:
- The Internet Was Designed For A Few Academics, Not People In General: The initial designers simply didn’t plan on mass adoption, not by any stretch. They didn’t factor in any security because they didn’t think anyone malicious would ever use it. It’s that simple. It’s not that they couldn’t add security; they didn’t even try.
- Microsoft Windows Is The Most Prolific OS Out There, And Its Security Has Been Horrible: This isn’t about bashing Microsoft; it’s just true. They became so successful that every one of their major flaws became the world’s problem. Just as with the designers of the Internet, it all comes down to the benign mistake of designing for functionality rather than security. Once the default configurations of Windows and the other high-utilization operating systems start incorporating decent security, the whole landscape will change dramatically.
- The C Programming Language Is Dangerous, And It’s Still Being Used Extensively: The problem with C isn’t that it’s a bad language; it’s that it’s dangerous for those who aren’t both skilled and genuinely careful. We’ve all heard the foot-shooting jokes, and they exist for a reason. A language/IDE pair that provided pseudo-intelligent security checking before compile time would go a very long way toward stopping buffer overflows. Think of the IDE automatically adding bounds checks for copy functions, or better yet, standardized use of a language that doesn’t allow you to make the mistake at all.
- System Architecture Allows Buffer Overflows By Default: The vanilla computer system of today and yesterday essentially has tragedy built right in. You stuff too much into its small container and the computer scoops up the extra, takes it to the brain, and says, “Run this.” This is comically insane, and it could have been done a million other ways. The only reason we still have this problem is that we’re still stuck doing things the way they were done on the very first computers, i.e. the computers of the ’70s and ’80s. Once again, this isn’t our best effort. It’s our first effort, and one made when the threat wasn’t even visible.
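The C and architecture points above boil down to one pattern: an unchecked copy into a fixed-size buffer. Here’s a minimal sketch of that mistake and its bounded alternative; the function names are mine, purely for illustration, and don’t come from any real codebase:

```c
/* Illustrative sketch of the classic overflow pattern: unchecked copies
 * into fixed-size stack buffers. Function names are hypothetical. */
#include <stdio.h>
#include <string.h>

/* Dangerous pattern: strcpy() performs no bounds check, so any source
 * string longer than the destination buffer writes past its end,
 * clobbering whatever sits next to it on the stack. This is exactly
 * the "scoop up the extra and run it" failure described above. */
void copy_unsafely(char *dst, const char *src) {
    strcpy(dst, src); /* undefined behavior if src doesn't fit in dst */
}

/* Bounded alternative: snprintf() is told the buffer's capacity, so it
 * truncates instead of overflowing and always NUL-terminates. */
void copy_with_bounds(char *dst, size_t cap, const char *src) {
    snprintf(dst, cap, "%s", src);
}
```

With a 16-byte destination, `copy_with_bounds` quietly truncates a 32-character input to 15 characters plus the terminator, while `copy_unsafely` would have scribbled the extra 17 bytes over adjacent stack memory. The fix costs one extra argument; the language just never forced anyone to pass it.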
But things are changing. Although most of what’s coming in the immediate future equates to bandaids rather than redesign, the bandaids will have a major effect on information security. Specifically, there will be far fewer successful attacks against IT, and each one will require significantly more skill. What this’ll do for information security careers is trim them dramatically. There will be far less patching, fewer malware updates, and less scrambling to contain outbreaks. In short, there’ll be less grunt work to do. Here’s why:
- More Secure Default Configurations: As vendors begin to ship more secure default configurations things will improve greatly. This is especially true of the Windows operating system since it’s so ubiquitous. As a case in point, from a pen-testing standpoint, the difference between breaking into Windows 2000 Server vs. Windows Server 2003 is massive. 2003 still has a legion of issues, but it’s like a fortress compared to 2000.
- Enhanced System Architecture: Within a few years it’s going to be far more difficult to run arbitrary code on systems. There are various stack protection technologies coming to market now, and before long these types of defenses will be in the operating systems themselves.
- Protective Programming Environments: IDEs, and even the languages themselves, will be written specifically to protect the final product from the programmer. The language won’t allow most dangerous constructs, and the IDE will balk at the risky ones the language does allow. This won’t eliminate errors, of course, but the effect will be enormous.
- Security Technology Integration: In addition to more secure default settings and better overall design, the new security technologies we’re seeing today will be as common in future environments as printers and network cables are now. Technologies like Host-Based Intrusion Prevention will be in all operating systems, all network roll-outs will have Network Access/Admission Control elements, and so on. Malware today has free rein, and it won’t always be that way. The authors of the excellent Foundstone books have said that if every system they went up against had been running HIPS technology, things wouldn’t have gone nearly so well for them.
Why I Could Be Wrong
So within the next 5-10 years things will get dramatically better for IT Security. When I say dramatically, what I mean is putting an Apache web server out on the Internet and having it survive for a couple years without patching. That’s dramatic.
I could be wrong though; it wouldn’t be the first time. Here’s how: just as information security is in its infancy, so is information technology itself. And the drive for features will continue to outpace the drive for security, simply because features are what drive technology.
That being said, it’s quite possible that as technology really takes off (personal computers, think current phones, hosting your own private daemons; full multimedia and display technologies; and so on), everything will be released at ludicrous speed. And that speed is likely to be much faster than proper security considerations can keep up with. The result would be what we have today: gaping security wounds.
But I don’t think so. I think the framework for development, and most importantly the building materials themselves, will be so superior that even shoddy work will yield products that are 99% more secure than what we see today. Like I said, though — I could be wrong; we just have to wait and see.
Again, my prediction is for the average newly-deployed web server to have a lifespan of (at least) 2 years — on the Internet, with no patches, within 10 years. And I think that’s being conservative.
Either way, even with the more robust IT infrastructure of the future, there will always be work for talented and dedicated security professionals. Stupidity isn’t going away, and misconfiguration is as bad as (or worse than) any architectural or design problem. As such, social engineering and other advanced attacks will always have a home, no matter how advanced security technologies become.
It may seem that the current balance between information security and attackers represents the natural, permanent equilibrium, but that’s simply not the case. That’s an illusion embraced by those who fail to see that information technology itself is in its absolute infancy. It isn’t as if an effort has even been made to create a secure system; all we’ve done up to this point is attempt to retrofit what was handed to us in the very beginning.
Once we do finally invest the time and effort into building new systems, ones designed with security as a primary consideration, the entire balance will shift dramatically in favor of infosec. At that point, any major hacking incident that didn’t involve a configuration mistake, insider attack, or social engineering will be headline news.
Anyway, I wonder what other security professionals think about this viewpoint, and I encourage you to contact me with your flames, comments, and questions at will.
— Edit: This topic has been on my mind for a number of years now, and I tried (mostly unsuccessfully) to argue the same point in this paper which I submitted as my SANS GSEC practical a while back.