How we got to The Cloud (part 2)
Part 1 of this blog can be found here:
We talk of 'The Cloud' as a pretty normal part of our information technology discourse these days. Some accounts trace the birth of the term back to 2006, when the likes of Google and Amazon began using 'cloud computing' to describe the new paradigm in which people increasingly accessed their software, computing resources and data over the Web instead of on their desktops. But it wasn't always like this, as you can see below.
In part one of this blog, we covered some of the early pioneers of computer technology, from Charles Babbage's Difference Engine in the 1800s to the IBM System/360 mainframe in the 60s, ending with the first wave of early desktop computers in the 80s.
Let's stay in the 80s and kick off part two. This list was harder to compile, because computers were almost ubiquitous from this point forward. The limelight today seems mostly focused on personal tech and the cloud services we consume, rather than the hardware behind the scenes.
The Rise of the Clones - The Compaq Portable
While a PC could mostly be replicated with retail parts, its BIOS initially belonged to IBM, which guaranteed proper IBM compatibility. However, companies such as Award and American Megatrends reverse-engineered IBM's BIOS, enabling companies such as Dell, Compaq and HP to build clone PCs. The first clone came from Columbia Data Products with 1982's MPC 1600, but 1983 saw the landmark Compaq Portable, the first computer to be almost fully IBM compatible.
It was Compaq's first product, followed by others in the Compaq Portable series and later the Compaq Deskpro series. It was not simply an 8088-based computer that ran Microsoft DOS as a PC "work-alike": it contained a reverse-engineered BIOS and a version of MS-DOS so similar to IBM's PC DOS that it ran nearly all of IBM's software too. Not exactly a 'looker' either.
The Compaq Portable.
1991 – NeXTCube. The age of the WWW begins
In March 1989 Tim Berners-Lee wrote a document called Information Management: A Proposal, for his colleagues at CERN (European Organization for Nuclear Research) in Geneva. His boss, Mike Sendall, described the proposal as ‘vague but exciting…’ and in 1990, approved the purchase of the NeXT computer. This was the ideal platform for Berners-Lee to demonstrate his vision, merging the ideas of hypertext with the power of the Internet. The machine was the first web server and to turn it off would have simply meant turning off the World Wide Web, an idea which is inconceivable today.
The NeXTCube that the WWW was launched on had a 25 MHz processor, two gigabytes of disk, and a grayscale monitor running the NeXTSTEP OS. Berners-Lee put the first web page online on August 6, 1991. He also wrote the first web browser and editor, called WorldWideWeb, on this machine. This was undoubtedly the high point for the NeXTCube too, because commercially it was a bit of a dud.
Tim Berners-Lee's NeXTCube
The Brick that Linus bought
On January 2, 1991, a young Helsinki student named Linus Torvalds went shopping for the most badass computer he could afford. He spent FIM 18,000 (about NZD $5,000) on a grey brick featuring an Intel 386 DX33 processor and 4 MB of RAM. The PC ran MS-DOS, but Torvalds knew that this OS did not take full advantage of his computer's hardware. Preferring the UNIX operating system running on the university's computers, he set out to create his own OS.
Torvalds' friend Ari Lemmke, an administrator for FTP (File Transfer Protocol) services in Finland, encouraged him to upload the source code for his OS to the network so other programmers could tune it. 'Linux' (Linus' MINIX) was only the working name Torvalds gave his OS; he thought it sounded too vain to release it under. Lemmke, however, created a directory called 'linux' on the FTP server, and the name stuck.
Linux version 1.0.0 was released in 1994 with 176,250 lines of code. It is now the world's most used operating system.
The Pentium Range
Not a computer as such, but the Pentium brand became synonymous with the PC in its heyday. It is the most successful desktop microprocessor brand of all time, and it still exists today. However, four major architectural changes and decades later, today's Pentium processors have very little in common with their predecessors other than the name.
The Pentium family of microprocessors, developed by Intel, was introduced in 1993 as the successor to Intel's 80486. The original Pentium was a superscalar design with two execution pipelines on a single chip and about 3.3 million transistors. Transistor counts on modern processors now number in the billions.
Ridiculous Fact Time. In terms of the total number of transistors in existence, it has been estimated that a total of 13 sextillion (1.3×10²²) transistors were manufactured worldwide between 1960 and 2018.
The original Pentium is an extremely modest design by today's standards, and it wasn't exactly a show-stopper when it was introduced in 1993 either. The main thing the Pentium had going for it was x86 compatibility. In fact, Intel's decision to sacrifice performance, power consumption and cost for the sake of maintaining backwards compatibility with legacy x86 code was one of the most strategically important decisions it ever made.
INTEL Pentium logos through the years
1996 – Sun Ultra II - Putting the dot in dot com
"We put the dot in dot com" (1999-2001). The famous advertising slogan of the dot-com bubble proved all too true. Fueled by brisk sales of its UltraSPARC servers - which, with Java, proved irresistible to big Web sites everywhere - Sun was valued at approximately $200 billion at its peak, with a stock price of $247 per share. By the end of 2001 the stock had plunged to $49 per share. Sun was eventually bought by Oracle in 2010.
The original Ultra workstations and Ultra Enterprise servers were UltraSPARC-based systems produced from 1995 to 2001. They introduced the 64-bit UltraSPARC processor and, in later versions, lower-cost PC-derived technology such as the PCI and ATA buses. Sold during the dot-com boom, the original Ultra range became one of the biggest-selling series of computers Sun ever developed, with many companies and organisations, including Sun itself, relying on Sun Ultra products for years after their successors were released.
Another notable customer was a company called Google. You may have heard of them. I may have got some of the information for this blog from it. Around 1998, a Sun Ultra II at Stanford University hosted Larry Page and Sergey Brin's Backrub search engine - which, of course, eventually evolved into Google. The server had dual 200 MHz CPUs and 256 MB of RAM. Google now has 450,000 servers in its datacenters around the world.
Google's humble beginnings - core to it, was the Sun Ultra 2 (with biggest monitor)
2001 – RLX and the first blade servers
A data centre and cloud staple now, the first commercialised blade server architecture was invented by Christopher Hipp and David Kirkeby of Houston-based RLX Technologies. RLX, staffed primarily by former Compaq employees, including Hipp and Kirkeby, shipped its first commercial blade server in 2001, and the term 'blade' was splashed all over their website around this time. Their time in the sun was short-lived, however: RLX collapsed and was acquired by HP in 2005.
They created the first commercial blade server in 2001 to meet an industry need for more compact and efficient data centre technology. As well as being a stripped-down server with a modular design optimized to minimize physical space and energy use, the big innovation introduced by RLX, and later adopted by all the other big players (HP, IBM and Dell), was integrated, hot-pluggable blades. The RLX blades included everything except I/O, which was handled entirely through a standard rear connector. Every part of the system could be hot-plugged, including the blades themselves, with none of the ribbon cables or expansion cards required by the "high density servers" of the 1990s.
2006-2022 - Welcome to the Cloud
We've arrived, woohoo! In fact, you could argue that the concept of cloud computing is as old as the Internet itself - ARPANET, for example. And these days, with the advent of virtualization, the concept of a server is not always tied to the specific hardware it sits on. It's quite likely that the applications you're using aren't hosted in the same building, city or even country as you; you'll have no idea what hardware they're running on, and you couldn't care less either. But it's pretty cool to see where it all came from. I started my technology journey in the 1990s, and the changes I have seen are impressive. If Charles Babbage knew what his Difference Engine would lead to, he would fall off his chair.