Real-World IT Horror Stories

Date: Oct. 03, 2016
Author: Global Knowledge

If you work in IT, there is no doubt that you've lived through more than a few IT nightmares. In celebration of Halloween, we've asked a few of our instructors to share some horror stories from their own consulting careers. Our tales of Russian espionage, employee sabotage, false competencies, logic bombs, and a stolen website may all sound like movie plots, but they are all true. While the stories are entertaining, it's important to focus on the valuable lessons they teach about IT best practices that shouldn't be overlooked. Hopefully, you'll be double-checking your own networks, policies, and documentation after you read these. Enjoy!


"The Day the Logic Bomb Went Off" by Michael Scarborough

AMC's television series "Halt and Catch Fire" depicts North Texas' rise as the "Silicon Prairie." In the 1980s and early 1990s, it was where IBM PC cloning was explored and where first-person shooter games were created. It's also the site of the first "logic bomb" and the source of my horror story. Very early in my career, a disgruntled employee used a logic bomb to digitally wipe out more than 160,000 corporate records.

A logic bomb is sometimes confused with a computer virus. However, a logic bomb is really a piece of code intentionally written to perform malicious actions when a specific set of criteria is met. That's exactly what happened at my first IT job in Fort Worth, Texas, my hometown.
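
To make that definition concrete, here is a deliberately harmless sketch of the trigger-plus-payload structure a logic bomb follows. The trigger condition and all of the names below are purely illustrative, and the "payload" does nothing but print a message.

```python
def payload():
    # In a real logic bomb this would be the destructive action
    # (e.g., deleting records); here it only prints a message.
    print("Trigger condition met -- payload would run here.")


def nightly_job(active_employees, author_id="jdoe"):
    # Illustrative trigger: the author's account is missing from the
    # employee list, i.e., the author has been terminated.
    if author_id not in active_employees:
        payload()


# Purely illustrative values
nightly_job(active_employees={"asmith", "bjones"})
```

The lesson drawn later in this story follows directly from this shape: a change-management review of new or modified code is the natural place to catch an unexplained conditional like this before it ever fires.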

There was a system programmer who had a reputation for being difficult to work with. A couple of years prior, he had created code that would automatically delete corporate records if certain criteria were met. This programmer was fired in September 1987. A couple of days after his firing, the logic bomb went off, deleting about 168,000 important corporate records. The programmer who did this became the first person ever convicted of "harmful access to a computer".

This logic bomb had a profound effect on my career and how I thought about IT. I realized that those of us with technical IT jobs really do wield a lot of power under certain conditions. A few years later, after I had changed companies a few times and started making my way up the corporate ladder, I thought about how a good IT change management process would have detected the programmer installing the malicious code and would have saved the organization a significant amount of money in recovery efforts.


"The Avoidable DNS Disaster" by Rich Morrow

While acting as vice president of a performance-based Internet marketing company, I oversaw a migration to new infrastructure. Imagine my horror when we misrouted more than 300 of our DNS records, temporarily taking the business offline.

Performance-based marketers spend money buying media and sending emails, and are paid only if a resulting sale or conversion occurs. Days like our DNS outage were tough not only because they took the company offline, but mainly because they were completely avoidable.

What we did wrong reads like a checklist of the top no-nos. First, our tiny team was already stretched to the limit, with everyone juggling six to eight mission-critical deliverables every day. I'd also assumed "Mr. P." (the person I'd assigned the task to) had DNS experience because of his two decades in software development. We also didn't ensure that the time-to-live (TTL) settings, which determine how long resolvers cache DNS records and therefore how quickly changes propagate, were set to the lowest value possible. That would have allowed us to fix any issues easily and, more importantly, quickly. The nail in the coffin was the day we'd decided to do all this: Friday, or more specifically, end-of-day Friday.
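
Checking that TTLs really have been lowered before a cutover is a quick scripted task. Below is a minimal sketch using the dnspython library; the record names are hypothetical, and the same check can be done by hand with `dig`.

```python
# pip install dnspython
import dns.resolver

# Hypothetical record names to verify before the cutover
RECORDS = ["www.example.com", "mail.example.com", "offers.example.com"]
MAX_TTL = 300  # five minutes, the value we believed was in place

for name in RECORDS:
    answer = dns.resolver.resolve(name, "A")
    ttl = answer.rrset.ttl  # TTL reported by the resolver for this answer
    status = "OK" if ttl <= MAX_TTL else "TOO HIGH"
    print(f"{name}: TTL={ttl}s ({status})")
```

Note that a caching resolver reports the remaining TTL, which can be lower than the configured value; querying the zone's authoritative name servers directly returns the value actually set on the record.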

The next day (Saturday, of course), I got a panicked call from our senior vice president of ad buying:

"All the URLs are going to other websites... the mortgage URL is going to the acne products... the acne products URL brings up a cell phone landing page. I've shut off all campaigns."

I pulled up some URLs, and sure enough, each one was completely misrouted, at the exact time (the weekend) when they were supposed to be making the most money. I tried to call Mr. P. It went to voicemail. I went into the office, only to find that (again, because we were all so busy, and because I'd not made this a priority for Mr. P.) we'd never documented how to access the DNS provider.

When we finally tracked Mr. P. down, hours later, he discovered that the TTLs had not been set to five minutes as we'd thought. Most were still at six hours or more. Only after much internal stress, apologies to clients, and a loss of over $20,000 did we get our 300-plus DNS records pointing back at the correct servers.


"The Case of the Disappearing Domain" by Brad Werner

Almost every IT administrator, whether they work in storage, networking, servers, or clients, has a story about the Domain Name System (DNS) not functioning properly. I collect DNS horror stories like some people collect bottle caps.

One was the "Case of the Disappearing Domain." I was working at a start-up company that had grown from a dozen people to about 80 employees in a couple of years. One Wednesday, we noticed a dramatic drop in incoming email, and hits on the website dropped like a rock. Frantically, I tested from outside and inside the network. Everything seemed fine.

Important quotes and contract negotiations from our customers weren't coming through. I sent test messages; all of mine went through. The chief financial officer stopped by my office. He had just gotten off the phone with an investor: our website was down, and we weren't receiving the investor's emails. Then the CEO was in my office.

My sweaty fingers were wildly typing frantic magic incantations, desperately struggling to resuscitate our presence and existence in the networked world. Hour by hour it was getting worse. And then our network communications flatlined. All inbound communications had ceased.

That was the painful day I discovered the agony of having someone buy your domain name out from under you. I learned that another registrar had allowed some company to register the domain name we already "owned." Our company's name had been stolen.

You might not think of DNS as a dangerous business, but it can involve theft, espionage, sabotage, or outright hacker malice with no competitive angle at all. We got our domain back, though it took several days for email and Web services to start flowing fully again. I now make sure to lock the registrations so that no one can transfer away the domain names of my business or those of my clients. Please be safe and always protect yourself.
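
A quick way to confirm that a registrar lock is actually in place is to look for the `clientTransferProhibited` status in the domain's public WHOIS record. Here is a rough sketch that shells out to the standard `whois` command; the domain is hypothetical, and registrars format their status lines slightly differently, so treat this as a starting point rather than a definitive check.

```python
import subprocess

def has_transfer_lock(domain: str) -> bool:
    # Fetch the public WHOIS record for the domain.
    result = subprocess.run(["whois", domain], capture_output=True, text=True)
    # A locked domain normally lists a "clientTransferProhibited" status.
    return "clienttransferprohibited" in result.stdout.lower()

# Hypothetical example
print(has_transfer_lock("example.com"))
```

Many registrars also offer transfer locks and change notifications in their control panels; the WHOIS check simply confirms that the lock is visible to the outside world.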


"The Espionage Denial Nightmare" by Phill Shade

About three years ago, in a galaxy unfortunately all too nearby, I was working as a consultant for a small design company when I came across the nightmare of all nightmares: industrial espionage.

The company designed distinctive cases that major vendors used to house their products. I was contacted by another consultant who needed help analyzing a possible data breach. The issue was that the company's designs were showing up on black markets, sometimes before the products were even in production.

Using Wireshark, GeoIP, and graphical traceroute utilities, we discovered an internal connection originating from the company's design servers and reaching out to St. Petersburg, Russia. Our next step was to set a trap. We created several fake designs and uploaded them to the server in question. We then attached Wireshark to a hub and connected the server back to the network switch through it. In Wireshark, we set a capture filter scoped to the server's IP address. We watched the very designs we had loaded onto the server being copied and transferred to Russia. We had our villain!
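
The capture filter described here is just a host-scoped BPF expression (for example, `host 203.0.113.25` typed into Wireshark's capture filter field). The sketch below shows the same idea scripted in Python with scapy rather than Wireshark; the interface name and server address are hypothetical, and packet capture requires elevated privileges.

```python
# pip install scapy  (packet capture requires root/administrator privileges)
from scapy.all import sniff

SERVER_IP = "203.0.113.25"  # hypothetical design-server address

# The BPF capture filter limits the capture to traffic to or from the server;
# Wireshark accepts the same "host ..." expression as a capture filter.
packets = sniff(iface="eth0", filter=f"host {SERVER_IP}", count=100)

for pkt in packets:
    print(pkt.summary())
```

In the story, the hub simply made the server's traffic visible to the capture machine; on modern networks a switch port mirror (SPAN) accomplishes the same thing.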

We saved all of our evidence, created a quick report and prepared our presentation. What ensued still blows my mind to this day. Rather than accepting our findings and thanking us, the client instead stated:

"That can't be true, you're reading it wrong!"

When I gathered my thoughts and asked why, the next shock ensued:

"Our network can't possibly be compromised since we only use Mac computers and they are safe from hacking!" the client uttered with blind belief in modern advertising.

When we dared to ask what sort of security software or hardware they used to protect the network and infrastructure, we received nearly the same answer. So, hoping for the best, we gave our presentation to the department head, then the chief technology officer, and finally the CEO. Each piece of evidence was met with the same response: there had to be a mistake, and there was no need to follow any of our recommendations, as that would make operating the network too difficult.

Completely at a loss for words, all we could do was present the invoice for services rendered. The company was out of business within a year or so. I learned that sometimes all you can do is the job; the rest is up to the client.


Spine-Tingling Lessons

So there you have it. These IT nightmares are no joke. In the 21st century, IT is woven directly into almost every aspect of a business's day-to-day operations. Revenue, productivity, and brand image can all be lost when IT efforts are compromised. In some instances, shrugging off IT's recommendations and living with a false sense of security brings about the downfall of a company. Don't be one of those companies. What do you think? Do you have any stories of your own? If so, please share on Twitter and make sure you use #ITHorrorStories. Happy Halloween!