An Article From http://www.itnews.com.au/
Famous head-slap moments from history.
In light of the UK government’s ill-thought-out and wrong-headed decision to support a ‘three strikes and you’re out’ rule for internet service providers, we thought we’d take a look at the history of stupid decision-making in the IT field.
We’ve all had the ‘oh no’ moment, when you realise two things – you’ve just done something incredibly stupid and you can’t take it back. Unfortunately, IT has the potential to make those moments a whole lot worse. As we say: “To err is human, to really screw things up you need a computer.”
So here they are, history as a lesson.
Honourable Mention – Watson’s “five computers in the world” comment
Shaun Nichols: Former IBM chairman Thomas Watson was credited with saying in the early 1940s that “there is a world market for maybe five computers.”
Though there is some debate as to whether the quote was ever actually uttered, similar comments attributed to other early computer scientists around that time indicated the belief that the total number of computers the world would ever need was anywhere from three to ten.
Regardless of the specifics, Watson’s quote illustrates the mindset of the early days of computing and suggests why it took so long for individual terminals and personal computers to really take off.
The thinking is better explained when one realises that at the time neither the transistor nor the computer chip had even been suggested, and computers were huge beasts that required immense amounts of time and attention to work. There really were fewer than a dozen entities that had the need and resources to develop and operate an old-fashioned vacuum tube computer.
Iain Thomson: Watson is certainly not the first person to misunderstand the potential of technology.
When shown a demonstration of the first telephones the mayor of Chicago was so impressed he thought they would be vital in the future and one day every city would have one. Thomas Edison drew up a list of ten uses for his newly invented phonograph, and playing music came near the bottom of it.
But what makes Watson’s comment more of a head-slap moment was that he was the founder of IBM. That’s what makes me a little bit suspicious about the facts and ensured Watson only gets an honourable mention.
Honourable Mention – Intel’s Pentium III tracking system
Shaun Nichols: Back in 1999 Intel was gearing up for the release of its much-anticipated Pentium III processor. The new chip sported clock speeds that had gamers and enterprise users alike salivating. Unfortunately, it had another feature that drove consumers to break out the torches and pitchforks.
As the chip was being designed and engineered, someone at Intel thought it would be a good idea to give each processor a unique serial number that could be read remotely, and then use that information to gather data on the chip and track its use. IT managers could use it to keep track of systems within their companies too.
Imagine the company’s surprise when it learned that the public wasn’t exactly ecstatic with the idea of their systems being remotely tagged and monitored at the silicon level. Customers revolted, privacy groups were up in arms and at the All Saints Churchyard in Oxfordshire one could hear a faint whirring sound as George Orwell spun in his grave.
Fortunately for Intel, the company was actually paying attention to what its users were saying and the tracking feature was abandoned, thus saving the company much criticism and keeping them out of our top ten.
Iain Thomson: In 1999 I and other technology journalists attended a press conference given by Intel.
Intel was about to reveal the details behind its forthcoming Pentium III processor. The press conference went well, the technical specs looked good and there were no blinding revelations.
Then the announcer started going over security and proudly announced that from this time forth every Intel chip would have a unique ID number stamped on it, and that could be checked remotely. The company seemed to think this was a good idea. At the words “Any questions?” a sea of hands shot up and every single one of them was about the new scheme.
As the details unravelled it looked more and more like a pig’s breakfast. Intel said it wouldn’t use the system to find stolen chips – about the only thing it was good for. The software used to query the chip number was also hopelessly easy to hack. After a face-saving period the company backed down and our computer hardware remains untracked.
10. The UK identity card programme
Shaun Nichols: When it comes to bad ideas, it’s tough to beat the government. Put a bunch of crusty old politicians in charge of a massive technological infrastructure and you’re bound to get some terrible ideas.
The basis of the UK ID card effort is quite understandable. In the aftermath of the September 11 terrorist attacks on the US much of the world was looking into how it could improve security. In the United Kingdom, this took the form of a mandatory national ID card system.
As the hysteria from 9/11 subsided and people began to return to normal, opposition to the idea grew. As tests began on the system criticism mounted and at one point Microsoft even slammed the idea. Since then, the government has backed off a bit, though the programme remains in place for foreign workers.
Iain Thomson: For the life of me I can’t understand the keenness in government for the national ID card scheme.
It’s one of those projects that I’m sure involved the finest sales folk of the technology industry and the bureaucracy that wants everything accounted for getting together in a mutual love-fest. But anyone with half an ounce of sense could have told them it wasn’t going to work.
The technology behind the scheme was laughably simple to subvert. Biometric cards are all very well, but unless the police officer checking one has a fast link to a database there is no way to verify it. Even building that database looks massively expensive, and will do little to achieve the stated aims of the plan.
And those aims have changed a lot. Officially billed as an anti-terrorism tool the card failed to excite, so then it became an efficiency platform to access services, and when that didn’t work the government spinmeisters went for the traditional “it’ll keep out Johnny Foreigner” routine.
It also overlooked some basic psychology. Brits don’t like ID cards. They accepted them during the Second World War but managed to get rid of them in 1952. No-one wants them back, apart from a few misguided souls, and the government should drop the scheme now, before the Tories take over and do it for them.
9. Yahoo re-hiring Jerry Yang
Shaun Nichols: Sometimes bringing back a founder can work wonders for an ailing company, such as Apple Computer with Steve Jobs. In the case of Yahoo, however, it took the company from a rut to a full-fledged financial crisis.
When the company found itself on the ropes in 2007, it turned to co-founder Jerry Yang to guide the company back into a dominant position and recoup shareholder losses. Yang then proceeded to bungle a deal with Microsoft, send the company’s value plummeting and cause a full-fledged shareholder revolt.
The company should have seen this coming. While Jobs was a shrewd businessman with years of experience, Yang was a computer scientist who struck gold with a good idea in the 90s.
The company needed to turn a page and Yang was stuck on the image of the free-wheeling start-up he created with David Filo. When Steve Ballmer came calling with a very good deal, Yang let those memories of the dot-com boom get the better of him and he turned down a US$32bn offer rather than sell out to a stuffy out-of-town rival.
Roughly 18 months later, with Yahoo’s value a fraction of the original offer, new chief executive Carol Bartz cut a deal with Microsoft worth far less than what Yang had turned down.
Iain Thomson: Yang’s pride in Yahoo was understandable. What is not is his pigheaded attitude over selling it.
Yahoo was a distant number two in the search market and had nothing really to offer except a sizable user base. Add into that the worsening economic situation and Yang should have taken the US$32bn (or even an apparent US$40bn), paid off the shareholders and headed off into the sunset for a life of whatever he pleased.
Instead he decided to fight on but was stymied at every turn. Eventually he had to step down and get someone to salvage the wreckage. I bet when Steve Ballmer signed off on the deal with Bartz he was grinning from ear to ear.
8. The Regulation of Investigatory Powers Act
Iain Thomson: In 2000 the British government enacted new laws to manage data and privacy in what became known as the Regulation of Investigatory Powers Act 2000.
What followed is now a text-book case for how not to legislate on the internet, and proof of mission-creep in privacy law. The act was updated in 2003, at the peak of hype about terrorism, by the then Home Secretary David Blunkett and would have allowed the internet surveillance of anyone by pretty much every branch of government, right down to the members of the town council.
Thankfully his son, an IT consultant, sat his dad down and explained a few things to him; not for the first time, it fell to the younger generation to set the old folks straight. Blunkett cut the number of bodies allowed to conduct surveillance of citizens down to nine.
But because the bill was poorly crafted that number grew, and now 792 organisations have the right to request all the details of your surfing habits – over half a million such requests were made last year. Isn’t that a comforting thought?
Shaun Nichols: Most people don’t like the idea of allowing the government to arbitrarily look up their online activities without any notice or given reason. Government officials know that there is no way citizens would allow them to pass a bill giving law enforcement those powers. That’s why the UK government went after the ISPs instead.
Early on this sort of thing didn’t seem like such a big deal. After all, we were at the height of the terrorist scare and unless you were doing something illegal like planning an attack or distributing child pornography, you didn’t really have much to worry about. Now that those powers have extended to things such as copyright theft, however, I think people are starting to get the idea.
Imagine if the police kicked down your door and demanded to search your house for recorded TV shows and old mixtapes. What was once designed to thwart terrorist attacks can now be used to bust people for sharing a song with a friend.
7. Ignoring Software as a Service
Iain Thomson: When we first started to hear the term SaaS bandied around the office and at conferences it all looked pretty basic – client/server on steroids.
But the messianic young chief executive of a company called Salesforce.com had a dream: that business software would be available to anyone with an internet connection. Sure, as dreams go it’s not on a par with curing smallpox or world peace, but it works for Marc Benioff.
The competition thought he was mad. The idea of offering software in the browser was still in its infancy, and this upstart was suggesting that companies would be happy managing their customer accounts online instead of buying the reassuringly expensive enterprise software systems they were used to.
Benioff, rather obnoxiously, described companies like Siebel as “dead” but the fact remains that he was right, and they’ve all had to get into the market and follow his line. It’s a salutary tale of never thinking you’re invulnerable.
Shaun Nichols: When Salesforce.com was first launched, the company was literally laughed at by the big names in the CRM market. After all, CRM software was sold to large corporations thousands of licenses at a time and put into place and managed by teams of IT professionals or specialised support firms. Running that sort of application in a web browser was unheard of and the likes of SAP and Siebel regarded Salesforce.com as little more than a toy.
The biggest fault of large companies is always their lack of flexibility and unwillingness to see the concerns of the individual user. While Marc Benioff likes to boast about bringing on the “end of software,” his company really succeeded because of its price and granularity, not because it didn’t sell its product on a CD-ROM.
Rather than be saddled with hefty license costs and massive deployments, Salesforce.com allowed customers to pay a small monthly subscription rate and run the software on as many or as few workstations as they pleased. This allowed smaller firms to try out CRM products and gave Salesforce a foothold in the market to establish itself and go after bigger clients.
Not everyone overlooked Salesforce, however. When Benioff told Larry Ellison what he wanted to do, the Oracle founder not only gave his former protege his stamp of approval, but also asked to buy a stake in the new company.
The rude nature of his company’s early dismissal is a chip Benioff still carries on his shoulder. Shortly after Siebel was swallowed up by Oracle, Benioff purchased the former headquarters of Siebel and turned it into an “incubator” for third-party Salesforce platform developers.
6. Google censorship in China
Shaun Nichols: Google’s corporate motto is ‘don’t be evil.’ In at least one case, however, the company has been accused of directly aiding others in doing evil.
In 2006, the company took heat for allowing the Chinese government to censor results on Google searches within the country. Among the blocked items were pages on democracy and information on the Tiananmen Square protest.
From a business standpoint, one can see why Google would want to work with the Chinese government – after all, the country is the most populous in the world and is becoming increasingly wealthy. From an ethical stance, however, this was a terrible idea.
As the dominant force in the search market Google could have made a very strong statement and taken a leadership role by refusing to cooperate with the censorship efforts. In the eyes of many, the move cost the company much of the spirit it had displayed since its beginning and destroyed the image of Google as a torch-bearer in the field of online rights.
Iain Thomson: I remember a friend bemoaning Google’s first foray into China, saying it was like the moment when young Anakin finally crosses over to the Dark Side.
We’d had such high hopes for Google. It had seemed so full of promise and the China question was a killer. You’d expect Microsoft to sell out its users, particularly if there was a buck in it. Yahoo too didn’t seem particularly trustworthy. But this was Google, and some people never forgave it for the decision, even after an apology from Brin.
Using the internet in China is a strangely dislocated affair, where you know there is information you want but you just can’t get to it. As a result of Google caving in there is now no impartial information source in the world’s most populous country and an entire generation is being brainwashed. ‘Don’t be evil’ just looks like marketing now.
5. IBM turning down Microsoft
Iain Thomson: In the late 1970s Microsoft was just another struggling software company stuck in the heat of New Mexico.
It spent most of its time developing its BASIC interpreter for the Altair, the machine that kicked off the personal computer revolution. Its very young leader Bill Gates showed promise but reportedly was at one time trying to sell the company. The sum mentioned was apparently US$80 million and the offer was made to IBM. They said no.
That Microsoft went on to great things is not in question. ‘What if’ historians would tell you that if Microsoft had been bought by IBM then it would have earned all the money that Microsoft did subsequently, but I’m not too sure.
IBM would have had a useful software arm to be sure, but it’s unlikely Bill Gates, and especially Paul Allen would have lasted long at IBM. Without Gates and Allen the investment would have been worthless.
Shaun Nichols: As Iain pointed out earlier, IBM was at the time not a big player in the software market, and the cost of buying Microsoft would have likely exceeded the costs of simply buying the company’s products for use in its machines at the time.
I would agree that the only way it would have worked would have been for IBM to leave Microsoft as an independent outfit, which once again would have made little sense when Big Blue could simply buy the rights to Microsoft’s products.
Keep in mind that Microsoft didn’t really hit it big until DOS, and even then they were still not a huge player in the market until Windows was developed. We have the advantage of hindsight in saying this was a huge mistake, but you can’t really fault IBM for not being able to see into the future.
4. Commodore’s PC price war
Shaun Nichols: Commodore founder Jack Tramiel operated under the motto “business is war.” In 1983, Tramiel would kick off the business equivalent of a nuclear war that wiped out an entire sector of the computing industry.
With the Commodore 64 battling several other firms in the burgeoning home computer market, Tramiel sought to get an edge by cutting the price of his machines. Normally, cutting prices is a good thing. Consumers flock to bargains and the company makes its money back with increased sales.
In this case, however, the cuts were a death blow. Tramiel’s cuts were too drastic, wiping out profit margins and causing the company to lose money. Other vendors followed suit and dropped their own retail prices beneath profitability. Perhaps they underestimated the market for home computers, or perhaps they thought the cuts would be temporary.
Regardless, the move bankrupted many companies and triggered what would later be known as the Great Video Game Crash. Console gaming was almost wiped out completely and only began to recover when Nintendo expanded to North America and Europe.
The cuts also dealt a crippling blow to Commodore. When all was said and done, the company had exhausted its financial resources and though the market had been greatly thinned, Commodore was no match for the oncoming wave of Macintosh and Windows PCs.
Iain Thomson: It’s a fact you seldom hear from free-marketeers but most businesses hate competition and open markets.
Tramiel was no exception and chose to try and monopolise the market and ensure that everyone used a Commodore. But he took things way too far in his lust to win and ended up nearly tanking the whole sector.
That said it’s possible we owe him a small debt of gratitude. Yes, he caused the first PC recession but he also got buyers into the mindset where computers should become faster and cheaper every year, unlike most other market sectors. This expectation isn’t solely Tramiel’s doing but he played a major part.
3. SCO’s Unix lawsuits
Iain Thomson: I have a personal conspiracy theory about Darl McBride, who’s still clinging on as head of SCO. I don’t think he likes the business of software.
SCO’s decision to go after control of UNIX was a mistake so egregious that it’s hard to believe it was taken. The small software company took on the open source movement, and Novell and IBM too. The resultant legal battle was epic, and drags on today, although these days SCO is a shell of its former self, limping from lawsuit to lawsuit with occasional bursts of energy as it finds more cash.
So the fact that maybe McBride never wanted to run SCO occasionally creeps into the back of my head. After all, this way he’s an internationally known figure, nice salary and he can probably milk this for years to come.
Shaun Nichols: While McBride is almost universally loathed in the IT sector, his guts have to be admired. Taking IBM to court over the rights to a multi-billion dollar software platform is akin to charging a Panzer tank with a pair of rollerskates and a broom handle.
At some point you have to think that one of SCO’s lawyers would have turned to the executives and simply asked if they had gone mad. One can imagine IBM’s huge legal team had a good laugh when this suit first crossed their desks.
Perhaps SCO simply wanted to push IBM and Novell into a settlement, making some cash and perhaps even securing a new licensing deal. Whatever the company’s motive, the move proved to be a catastrophic failure which has all but killed SCO.
2. Apple losing Steve Jobs
Shaun Nichols: Through much of the 90s it seemed that Apple could do nothing right. Saddled with a dated and unreliable operating system and under assault from cheaper Windows machines, the company was nearly killed off by one fizzling product after another.
Those woes, however, can arguably be traced back to a single decision: the removal of Steve Jobs. In 1983 Jobs himself moved to bring John Sculley on board. Two years later, Sculley convinced Apple’s directors that the company had outgrown Jobs’ erratic ways and moved to push the company’s co-founder out of the business.
In the following years the company would experience a fall from grace that would only become fully apparent ten years after Jobs’ departure. A former Pepsi executive, Sculley was unable to keep up with the rapid pace of development in the computer market, and by the mid-90s the company saw both sales and profits take a dive.
With the company on the ropes in 1997, Jobs returned only to find Apple on the brink of bankruptcy. Perhaps as a testament to the failures of the old regimes, one of Jobs’ first moves was to halt development on nearly every new project and overhaul the company’s entire product line. The rest is history.
Iain Thomson: Jobs lured Sculley from his old job with the line “Do you want to spend the rest of your life selling sugared water or do you want a chance to change the world?” He certainly changed Jobs’ life.
If I’d been on the board of directors of Apple at the time I’d probably have got rid of the annoying little git as well, given his penchant for meetings that began early or ended past midnight. Jobs is not a good person to work for – although he is an excellent person to serve if you are so inclined.
That said, Sculley really should have hired people who knew what they were doing after ousting Steve; in the end Apple’s death spiral could only be halted by the return of the king. I just hope they know what to do the next time Jobs can’t show up for work.
1. Digital Research
Iain Thomson: Very few companies are unfortunate enough to know the precise moment of their peak. In the case of Digital Research it was when IBM came knocking.
IBM needed an operating system for its first PC and didn’t have time to develop one itself. So it went to discuss purchasing a licence for CP/M, the leading operating system of the time and sold by Digital.
Negotiations didn’t go well, due to IBM’s notoriously strict non-disclosure agreement, which totally gagged Digital while allowing IBM full use of whatever was discussed. On the advice of its legal team Digital declined to sign. Luckily for IBM, its chairman knew Bill Gates’ mother, and Microsoft was happy to sign because it saw the bigger picture.
In a way the final deal was a tipping point for two companies, not one. Digital watched Microsoft’s code end up on the bulk of the world’s computers, while its own rival operating system was drowned out by sheer weight of numbers.
But for IBM it was also a tipping point. The company had just handed over control of the most valuable part of the deal. It’s understandable: IBM was all about big-iron computing, which was insanely profitable at the time.
But the software industry, or at least a few people in it, saw that you could still make plenty of money by building something once and then copying it and selling millions of copies at almost no extra cost. Microsoft showed the world just how much that could be.
Shaun Nichols: Think for a moment about just how different the computing industry, and the world as a whole, would be if the Kildalls hadn’t baulked at IBM’s disclosure demands. Perhaps Digital would be the biggest company in the business and Bill Gates would have been toiling in obscurity as a niche software developer. Without Windows does Apple become the dominant provider? Or perhaps Commodore and its Amiga take over the market.
While it now looks like an epic missed opportunity, without the benefit of hindsight it was an understandable decision by Digital. Rather than pay royalties IBM was said to be looking to pay a single lump sum, and who could have known that the PC and DOS would take off the way they did?
Unfortunately, the deal was never made and the saga with IBM and Microsoft haunted DRI founder Gary Kildall until his untimely death at the age of 52.
As Iain noted, the deal was also a turning point for IBM, though I think it was bound to happen sooner or later. The company runs with all the savvy and precision of a well-oiled machine, and when it comes down to it IBM wanted little to do with the low-margin PC sector. Ironically, outside of its server efforts IBM is now for the most part a software and IT services outfit.