An eye-opening look at the new computer revolution and the coming transformation of our economy, society, and culture.
A hundred years ago, companies stopped producing their own power with steam engines and generators and plugged into the newly built electric grid. The cheap power pumped out by electric utilities not only changed how businesses operated but also brought the modern world into existence. Today a similar revolution is under way. Companies are dismantling their private computer systems and tapping into rich services delivered over the Internet. This time it’s computing that’s turning into a utility. The shift is already remaking the computer industry, bringing new competitors like Google to the fore and threatening traditional stalwarts like Microsoft and Dell. But the effects will reach much further. Cheap computing will ultimately change society as profoundly as cheap electricity did. In this lucid and compelling book, Nicholas Carr weaves together history, economics, and technology to explain why computing is changing―and what it means for all of us.
Customer Reviews
Rating Breakdown
★★★★★  30% (83)
★★★★  25% (70)
★★★  15% (42)
★★  7% (19)
★  23% (64)
Most Helpful Reviews
★
1.0
AFBBN6BGYIAQ2IJ4GRWU...
✓ Verified Purchase
An extended defense of a utopian vision of the IT future first published in Carr's HBR article
Save your money. This book contains nothing but an extended defense of the utopian vision of the IT future first published in Carr's HBR article. A limited understanding of the underlying IT technologies, haziness, and a lack of concrete, detailed examples (obscurantism) are typical marks of Carr's style. Carr uses the focus on IT shortcomings as a smokescreen to propose a new utopia: users master complex IT packages and perform all the functions previously provided by IT staff, while "in the cloud" software service providers fill in the rest. This is pretty fine humor, a caricature that reminds me of the mainframe model, but not much more.
His analogies are extremely superficial and completely unconvincing (Google actually could benefit greatly from owning an electrical generation plant or two :-). The complexity of IT systems has no precedent in human history. That means that analogies with railways and the electrical grid are deeply and irrevocably flawed. They do not capture the key characteristics of IT technology: its unsurpassed complexity and Lego-like flexibility. IT has become the real nervous system of the modern organization, not its muscles or legs :-)
Carr's approach to IT is completely ahistorical. Promoting his "everything in the cloud" utopia as the most important transformation of IT ever, he forgets (or simply does not know) that IT has already gone through several dramatic transformations driven by new technologies that emerged in the '60s, '70s, and '90s. Each of those transformations was more dramatic and important than the neo-mainframe revolution he tries to sell as the "bright future of IT" and a panacea for all IT ills. First, mainframes replaced "prehistoric" computers. Then minicomputers challenged mainframes (the "glass wall" datacenters), and the PC ended mainframe dominance (and democratized computing). In yet another transformation, the Internet and TCP/IP (including wireless) converted datacenters into their modern form. What Carr views as the next revolution is just a blip on the screen compared with those events, in each of which the technology inside the datacenter and on users' desks changed dramatically.
As for his "everything in the cloud" software service providers, there are at least three competing technologies that might sideline them: application streaming, virtualization (especially virtual appliances), and the "cloud in a box". "In the cloud" software services are just one of several emerging technical trends, and the jury is still out on how much market share each of them can grab. Application streaming looks like a direct and increasingly dangerous competitor to the "in the cloud" software services model. But all of them are really complementary technologies, each with advantages in certain situations, and none can be viewed as a universal solution.
The key advantage of application streaming is that you use local computing power to run the application, not a remote server. That removes the latency and bandwidth problems inherent in transmitting the video stream generated by a GUI on a remote server (where the application is running) to the client. Also, modern laptops have tremendous computing power that is very expensive and not easy to match in a remote server farm. Once you launch the application on the client (from a shortcut), the remote server streams the necessary application files to your PC (much like streaming video or audio) and the application launches. This is done just once; after that the application works as if it were local. Only the required files are sent, so if you are launching Excel you do NOT get the libraries it shares with MS Word if Word is already installed.
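To make the "only the missing pieces are transferred, then everything runs locally" idea concrete, here is a minimal sketch in Python; the catalog, file names, and the `stream_and_launch` helper are purely hypothetical illustrations, not taken from the book or from any real streaming product:

```python
# Hypothetical illustration of application streaming: the client pulls only
# the files it is missing from a (simulated) remote catalog, caches them
# locally, and then "runs" the application with local compute power.

# Simulated remote catalog: application name -> files it needs.
REMOTE_CATALOG = {
    "spreadsheet": {"calc_engine.dll": "...bytes...", "charting.dll": "...bytes..."},
    "word_processor": {"text_engine.dll": "...bytes...", "charting.dll": "...bytes..."},
}

local_cache: dict[str, str] = {}  # files already present on the laptop


def stream_and_launch(app_name: str) -> None:
    """Fetch only the missing files, then execute locally (simulated)."""
    needed = REMOTE_CATALOG[app_name]
    missing = [f for f in needed if f not in local_cache]
    for filename in missing:
        # In a real system this would be a network transfer; here we just copy.
        local_cache[filename] = needed[filename]
    print(f"{app_name}: streamed {len(missing)} file(s), "
          f"{len(needed) - len(missing)} already cached; running locally.")


stream_and_launch("spreadsheet")      # streams both files
stream_and_launch("word_processor")   # reuses the shared charting.dll
```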
Virtualization promises more agile and more efficient local datacenters, and while it can be used by "in the cloud" providers (Amazon uses it), it can also undercut the "in the cloud" software services model in several ways. First of all, it permits packaging a set of key enterprise applications as "virtual appliances". The latter, like streamed applications, run locally, store data locally, are cheaper, have better response times, and are more maintainable. This looks to me like a more promising technical approach for complex sets of applications with intensive I/O requirements. For example, you can deliver a LAMP stack appliance (Linux-Apache-MySQL-PHP) and use it on a local server for running your LAMP applications (for example, a helpdesk), enjoying the same level of quality and sophistication of packaging and tuning as with remote software providers. But you do not depend on the WAN, since users connect over the LAN, which guarantees fast response times. And your data is stored locally (though if you wish it can be backed up remotely to Amazon or another remote storage provider).
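The response-time argument is easy to put rough numbers on. A back-of-the-envelope sketch, where the round-trip times, request count, and server work are illustrative assumptions rather than measurements of any particular product:

```python
# Rough illustration of why LAN-attached appliances feel faster than
# WAN-hosted services. All numbers are assumed for illustration only.

def interaction_time_ms(round_trips: int, rtt_ms: float, server_work_ms: float) -> float:
    """Total time for one user interaction: network round trips plus server work."""
    return round_trips * rtt_ms + server_work_ms

ROUND_TRIPS = 6        # requests needed to render one screen (assumed)
SERVER_WORK_MS = 80    # time the application itself spends (assumed)

lan = interaction_time_ms(ROUND_TRIPS, rtt_ms=1, server_work_ms=SERVER_WORK_MS)
wan = interaction_time_ms(ROUND_TRIPS, rtt_ms=60, server_work_ms=SERVER_WORK_MS)

print(f"Local appliance over LAN : ~{lan:.0f} ms per interaction")
print(f"Remote service over WAN  : ~{wan:.0f} ms per interaction")
# With these assumptions the WAN case is several times slower, and it
# degrades further whenever the shared Internet link is congested.
```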
The other trend is the emergence of a higher level of standardization of datacenters (the "cloud in a box" or "datacenter in a box" trend). It permits cheap, prepackaged local datacenters to be installed anywhere. Among the examples of this trend are the standard shipping-container-based datacenters now sold by Sun and soon to be sold by Microsoft. They already come with typical services like DNS, mail, and file sharing preconfigured. For a fixed cost, an organization gets a set of servers capable of serving a mid-size branch or plant. In this case the organization can save money by avoiding monthly "per user" fees, the typical cost-recovery model of software service providers. It can also be combined with the previous two models: it is easy to stream both applications and virtual appliances to the local datacenter from a central location. For a small organization, such a datacenter can now be preconfigured on a couple of servers using Xen or VMware, plus the necessary routers and switches, and shipped in a small rack.
I would like to stress that the power and versatility of the modern laptop is a factor that should not be underestimated. It completely invalidates Carr's cloudy dream of users voluntarily switching to the network terminal model inherent in centralized software services (BTW, mainframe terminals and, especially, "glass wall" datacenters were passionately hated by users). Remotely running applications have mass appeal only in very limited cases (webmail). I think users will fight tooth and nail to preserve the level of autonomy provided by modern laptops. Moreover, users will in no way agree to the sub-standard response times and limited feature set of "in the cloud" applications, as the problems with Google Apps adoption demonstrate.
While Google Apps is an interesting project that is now used in many small organizations in place of their own mail and calendar infrastructure, it can serve as a litmus test for the difficulties of replacing "installed" applications with "in the cloud" applications. First of all, if we are talking about replacing OpenOffice or Microsoft Office, Google Apps functionality is really, really limited. At the same time, Google has spent a lot of money and effort creating it but never got significant traction and/or a sizable return on the investment. After several years of existence, the product has not even come close to the functionality of OpenOffice, to say nothing of Microsoft Office. To increase penetration, Google recently started licensing it to Salesforce and other firms. That suggests the whole idea might be flawed: even such an extremely powerful organization as Google, with its highly qualified staff and the huge server power of its datacenters, cannot create an application suite that competes with applications preinstalled on a laptop, which means it cannot compete with the convenience and speed of running applications locally on a modern laptop.
For corporate editions, price is also an issue ($50 per user per year for Google Apps vs. $220 for Microsoft Office Professional). That in no way looks like a bargain if we assume a five-to-seven-year life span for MS Office. The same situation exists for home users: price-wise, Microsoft Office can now be classified as shareware (Microsoft Office Home and Student 2007, which includes Excel, PowerPoint, Word, and OneNote, costs ~$100, or ~$25 per application). So for home users Google needs to provide Google Apps for free, which, taking into account the amount of design effort and the complexity of achieving compatibility, is not a very good way of investing available cash. Please note that Microsoft can at any time add the ability to stream Office and other applications to laptops and put "pure play" cloud application providers in a really difficult position: remote servers would need to provide the same quality of interface and amount of computing power per user as the user already enjoys on a modern laptop. That also suggests some principal limitations of the "in the cloud" approach for any complex application domain: SAP has had problems moving SAP R/3 to the cloud too and recently decided to scale back its efforts in that direction.
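The price comparison over that assumed life span is simple arithmetic. The per-user figures below are the ones quoted in the review (circa 2008); the break-even calculation is just a sketch:

```python
# Subscription vs. perpetual-license cost over the life span assumed above.
# Prices are the per-user figures quoted in the review.

SUBSCRIPTION_PER_YEAR = 50     # Google Apps corporate edition, $/user/year
PERPETUAL_LICENSE = 220        # Microsoft Office Professional, one-time $/user

for years in (5, 6, 7):
    subscription_total = SUBSCRIPTION_PER_YEAR * years
    print(f"{years} years: subscription ${subscription_total} "
          f"vs. one-time license ${PERPETUAL_LICENSE}")

# The one-time license costs less once it is kept longer than
# PERPETUAL_LICENSE / SUBSCRIPTION_PER_YEAR years.
print(f"Break-even after {PERPETUAL_LICENSE / SUBSCRIPTION_PER_YEAR:.1f} years")
```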
All in all, the computing power of a modern dual-core 3 GHz laptop with 4 GB of memory and a 200 GB hard drive represents a serious challenge for "in the cloud" software service providers. This power makes it difficult for them to attract individual users' money outside of advertising-based or other indirect models. It is even more difficult for them to shake corporate money loose: corporate users value the independence of applications installed locally on a laptop and the ability to store data locally. Not everybody wants to share their latest business plans with Google.
Therefore Carr's 2003 vision looks even less realistic in 2008 than it did five years earlier. Since datacenters have actually continued to grow during those five years, Carr's value as a tech-trends forecaster is open to review.
Another problem with Carr's neo-mainframe vision is its propaganda of "bandwidth communism". Good WAN connectivity is far from free. The experience of any university datacenter convincingly demonstrates that a dozen P2P enthusiasts in the neighborhood can prove the futility of dreams about free, high-quality WAN connectivity to any skeptic. In other words, this is a typical "tragedy of the commons" problem and should be analyzed as such.
Viewed from this angle, Carr's vision of reliable, free, 24x7 communication with remote datacenters is rather unrealistic. The shortcoming can be compensated for by the properties of some protocols (for example, SMTP mail), and for such protocols this is not a problem, but for others it is and always will be. At the same time, buying dedicated WAN links can be extremely expensive: for mid-size companies it is usually as expensive as keeping everything in house. Large companies usually have "private clouds" already anyway. That makes the "in the cloud" approach problematic for any service where disruptions or low bandwidth at certain times of the day can lead to substantial monetary losses. Bandwidth is also limited: for example, OC-1 and OC-3 lines have upper limits of 51.84 Mbit/s and 155.52 Mbit/s respectively. And even within an organization, not all bandwidth is used for business purposes. In a large organization there are always many "entertainment-oriented" users who strain the firm's connection to the Internet cloud.
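To see how quickly those link speeds run out, here is a back-of-the-envelope calculation; the link rates are the standard OC-1/OC-3 figures, but the per-session bandwidth and the share of the link lost to non-business traffic are assumed, illustrative numbers:

```python
# How many concurrent remote-GUI sessions fit on a dedicated WAN link?
# Per-session bandwidth and the business-traffic share are assumptions.

LINKS_MBPS = {"OC-1": 51.84, "OC-3": 155.52}
PER_SESSION_MBPS = 1.5          # assumed bandwidth of one remote GUI session
BUSINESS_SHARE = 0.7            # assume 30% of the link goes to other traffic

for name, capacity in LINKS_MBPS.items():
    usable = capacity * BUSINESS_SHARE
    sessions = int(usable // PER_SESSION_MBPS)
    print(f"{name}: ~{usable:.0f} Mbit/s usable -> about {sessions} concurrent sessions")
```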
Another relevant question to ask is: what are the financial benefits to a large organization of implementing Carr's vision? I do not see any substantial financial gains. IT costs in large enterprises are already minimized (often 1-3% of total costs), and further minimization does not bring much benefit. What can you save from just 1% of total costs? But you can lose a lot. Are fraction-of-a-percent savings worth the risk of outsourcing your own nervous system? That translates into the question: what are the principal differences in the behavior of these two IT models during catastrophic events?
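The scale of the possible saving is easy to illustrate. The revenue figure and the fraction of IT spend that utility computing could realistically cut are purely assumed numbers, chosen only to show the order of magnitude:

```python
# Upper bound on savings when IT is already a small slice of total costs.
# Both the total-cost figure and the assumed cut are illustrative only.

TOTAL_COSTS = 1_000_000_000          # assumed total annual costs: $1B
IT_SHARE = 0.02                      # IT at 2% of total costs (middle of 1-3%)
CUT_FRACTION = 0.25                  # assume utility computing trims IT by 25%

it_budget = TOTAL_COSTS * IT_SHARE
savings = it_budget * CUT_FRACTION

print(f"IT budget:        ${it_budget:,.0f}")
print(f"Possible savings: ${savings:,.0f} "
      f"({savings / TOTAL_COSTS:.2%} of total costs)")
```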
The answer is: when disaster strikes, the difference between local and outsourced IT staff becomes really critical and entails a huge competitive disadvantage for those organizations that have weakened their internal IT staff. During disasters, internal IT staff really matter, and the treatment of the company by its internal datacenter staff is completely different from the treatment of the same company by Google or Amazon, for which it is just another annoying customer. That brings us to the central problem with Carr's views: he discounts the IQ inherent in local IT staff. If that IQ falls below a certain threshold, it really endangers an organization in case of catastrophic events.
Moreover, it instantly opens such an enterprise to the various forms of snake-oil salesmen and IT consultants proposing their wares. Software service providers are in no way altruists, and if they sense that you have become "IT challenged" and dependent on them, they will act accordingly.
In other words, an important side effect of dismantling the IT organization is that it instantly makes a company a donor in the hands of ruthless external suppliers and contractors. I have seen such cases as side effects of outsourcing. Consultants (especially large consulting firms) can help, but they can also become part of the problem because of the problem of loyalty. We all know what happened to medicine when doctors were allowed to be bribed by pharmaceutical companies. That situation, aptly called "Viva Viagra", in which useless or outright dangerous drugs like Vioxx were allowed to become blockbusters, has been fully replicated in IT: the independence of IT consultants is just a myth (and moreover, some commercial IDS/IPS and EMS systems are not that different from Vioxx in their destructive potential ;-).
Carr's recommendation that companies should be more concerned with IT risk mitigation than with IT strategy is complete baloney. He simply does not have any in-depth understanding of the very complex security issues involved in a large enterprise. Security cannot be achieved without a sound IT architecture and the participation of non-security IT staff. Sound architecture (which is the result of a proper IT strategy) is more important than any amount of "risk mitigation" activity, which is most commonly a waste of money or, worse, does direct harm to the organization (as SOX enthusiasts from the big accounting firms recently demonstrated to a surprised corporate world).
I have touched on only the most obvious weaknesses of Carr's vision (or fallacy, to be exact). All in all, Carr has proposed just another dangerous utopia and skillfully milked the controversy his initial HBR article generated in his two subsequent books.
77 people found this helpful
★★★★
4.0
AGLP6SDQEXPCBXQWQZOX...
✓ Verified Purchase
The End of the IT Department?
Nicholas Carr tells us that a great transformation is taking place: The Big Switch, as it were. Businesses are switching from in-house IT departments to network services or, as he calls it, utility computing. This switch is similar to what happened with electricity a hundred years ago. At that time companies produced their own electricity by operating their own generators. This, however, was enormously inefficient and expensive. Eventually companies saw the wisdom of using a giant centralized grid operated by companies like Edison and Westinghouse.
Utility computing has been talked about for years; people like Larry Ellison have been promoting it for a long time. Some companies are slowly making the transition, but most still buy their own computing equipment and their own software, and still hire legions of IT personnel. Carr argues that this will all change once everyone moves to the computing grid. Computing, he claims, is now a commodity, like electricity was at the beginning of the last century. It is no longer cost-effective for companies to try to differentiate themselves by doing all their IT services in-house when everything is available on the Internet.
The social consequences of this transition will be huge. Some IT companies will prosper and others will suffer or become irrelevant. Companies like Microsoft and Intel will be losers, since they will be selling less hardware and software. Others, like Google, the archetypal utility computing company, will prosper. Google operates the largest data centers in the world and offers a wide variety of software apps that private companies no longer need to develop on their own. Carr believes that Microsoft's client/server model is on the way out.
As companies move to the grid their IT departments will be drastically downsized. Carr goes as far as foreseeing "just one person sitting at a PC and issuing simple commands over the Internet to a distant utility." He writes that even Internet companies such as Craigslist, YouTube, and Flickr operate with minimal staff since they are making maximum use of the grid.
The fate of content producers such as journalists, photographers, reviewers, and editors is even worse. (Read also The Cult of the Amateur: How Today's Internet Is Killing Our Culture by Andrew Keen.) Professionals are being replaced by hobbyists who, by the way, don't make any money. The professionals will have to find other work to support what is now their hobby.
Carr's vision of the future may be excessively bleak. No doubt the losers of the utility age will find their new niche just as electrical workers did in the last century. This book will be helpful to the IT professionals who are trying to reposition themselves as IT departments decline.
19 people found this helpful
★
1.0
AFZVGRSAC5ND6HE2V672...
✓ Verified Purchase
The History of Power Generation
After reading the book, I find that the summary or description provided by Amazon above captures the core of the author's message. The first part of the book drags you through the beginnings of the electric power generation industry and how it grew and developed into what we have today. The author then uses this as an analogy to support his view that "utility computing" will replace the corporate datacenters we have today. This long history wasn't necessary for the point to be delivered.
One thing that is frequently skipped over is hardware as a service and the implementation and role it plays in the growth and success of SaaS.
The author touches on some of the social and business impacts he foresees, and on the effect the shift has already had on anyone who creates content that can be digitized. The rest of the book covers various observations about the impact of the Internet on society and business that can be found in just about any other Web 2.0 book out there.
This book continues the popular trend of stretching a magazine article that touches on the epicenter of Web 2.0 into a full-length book. "Everything is Miscellaneous" is another example. My opinion is that this book should have stayed a magazine article. I don't recommend it unless it's the only Web 2.0 book you plan to read.
16 people found this helpful
★★★
3.0
AHO5NLQHOYOTLC56VELJ...
✓ Verified Purchase
A pretty good book, with some serious flaws
This is a pretty good book, by turns interesting and annoying. Carr sketches the history of the rise of the big electric utilities in the early 20th century, then predicts that "utility computing" will similarly displace in-house corporate IT facilities in the early 21st century, just as companies stopped generating their own electricity back then.
The historical review is nicely done -- I learned, for instance, that General Electric was once Edison General Electric -- and Carr is on to the real reasons why companies adopt new technology: it's cheaper, more convenient, and/or the competition has already adopted it. The annoyances start when he begins prognosticating. As Yogi Berra once observed, "the trouble with predicting the future is that it is very hard." It looks like Carr read everyone else's Internet/computing predictions, mixed them up a bit, and regurgitated them.
OK, I'm being a bit hard on him. Where Carr knows something about an industry -- publishing, for instance -- he has some sharp observations on the migration of newspapers online, and the consequent unbundling of the paper package you buy at the corner for a dollar. For other stuff, he's so scattershot you'd be better off reading some of the original critics and prophets -- Carr has nothing new to add, and ends up confusing the reader (and probably himself).
So: read the history, the economics, and the publishing stuff, and skim or skip the rest -- that's my advice.
Happy reading--
Peter D. Tillman
10 people found this helpful
★★
2.0
AE4PORCZQB2NEZJXPUEZ...
✓ Verified Purchase
Very little insight
This book reads like an extended magazine article. It is written at a non-technical level for a general audience. While some of the topics addressed are clearly disruptive / revolutionary in nature, this book merely skims the surface and offers no real depth of thought on the subjects. The casual thinking represented here would have been interesting two or three years ago. I didn't take away any insights of interest.
8 people found this helpful
★★★★★
5.0
AHHKMMMW37QMPYZ22LXW...
✓ Verified Purchase
IT Predictions come true...
This book came to me in an unusual way. It was lying on a table labeled "Free" as we began an office move at my workplace, and I began reading it even though it dates back to 2007. I began to realize that the very foundations of what Carr talks about in the first chapters, which is as far as I've read at the moment, have indeed become fact. Computing is now a utility.
I've made my living in computer science for a long time. It can be tempting to believe that the business will always be about devices and how much power sits at your desktop, but I began to see the first cracks in the current distributed model a decade ago, when I was attending classes online and my professors were talking about how centralized systems would become economic drivers in their own right when it came to information distribution. While I do believe a certain amount of local compute power will always remain, with companies keeping some autonomy due to, say, security concerns and the protection of intellectual property that cannot easily be placed on the public cloud, there is certainly room to say that massive cloud systems handling the bulk of IT will remove the need for large swaths of the business world to run their own local compute clusters and storage.
What does this mean in the long term? I'm unsure of the total impact, but I do see that a career in computing will mean something completely different from the one I entered so many years ago. I'm becoming convinced that expertise in subjects outside of computing, combined with savvy in how to most effectively leverage the new IT, is the answer. Hence, egads!, I find myself willing to consider that fewer programmers may be needed down the road and that the real innovators will be the people using the tools that computers provide, not the makers of the tools themselves. The tools have become malleable enough to accommodate most of the needs of the masses, to such a degree that dependence on skilled IT workers for general-purpose tasks is at an effective end. Witness the explosion of applications on smartphones, the website generation tools that virtually eliminate the need for a designer to establish a credible presence on the web, and the low costs of these technologies, and you get the general idea that the internal makeup of IT work itself is undergoing precisely the changes mentioned in Carr's book.
I'm finishing the book in the next couple of days and thus am only discussing the first third read to date, but I felt I needed to comment. Here we are near 2012, and this book is eerily on target with its message. I can only wonder what sort of jobs the new IT worker will be doing in the next five years. I suspect they will be nowhere near the tasks of even a decade ago. I can't wait to shift my own skills in accordance with these changes. This is a most exciting time in the industry, and the one thing I can recommend is this: stay nimble and learn fast. You'll need to.
4 people found this helpful
★★★★★
5.0
AF3AJ3ERB3UPWXHF5NLP...
✓ Verified Purchase
The Dark Underside of the Internet
In the 1990's the internet was heralded as a transformative medium that would level society and provide free information. Now after the "dot-com bust", we are seeing a different perspective. Carr describes how the internet is indeed having profound effects, and some of them may not be as benign as we anticipated:
* professionalization dwindling in the wake of internet amateurs/volunteers doing the work.
* "unbundling" of services and media - so that we only look at what's most attractive and ignore other things (which may be actually more important)
* IT departments disappearing as everyone accesses computing services as a generic "utility" provided by an outside vendor.
* our every action on the net is tracked/recorded/compiled, whether we think we are "anonymous" or not, and this info is of intense interest to industry and government.
* cyberspace isn't as immune to censorship and government control as we thought.
Carr's thesis is that "computing" will increasingly be done by outside vendors whom we all will access/interface with. He likens it to the growing acceptance and ubiquity of public electrification as a "utility" that replaced in-house power/lighting sources (generators, gaslight, etc.)
However, I didn't completely agree with his analogy. The big difference between electricity and computing is that computing involves information, which is infinitely more valuable. If someone taps into my electrical power, my lights might brown out. But if someone taps into my data, it could be disastrous and irremediable. For this reason, I have doubts as to whether businesses, governments, and individuals will be quite so willing to rely so heavily on an outside vendor for their computing and critical infrastructure. It's one thing if you're running a business like a photo service or a blog. It's quite another when you're handling sensitive financial information or public safety systems, which cannot be allowed to fail or be compromised.
However, the book does provide rich food for thought and so I recommend it. I just caution that in the 1990's there was a lot of hubris associated with the "power" of the internet. In this book, it seems like some of the hubris has just assumed a different form and should still be taken with a grain of salt.
4 people found this helpful
★★
2.0
AF2SNWUQIAG3UGWP5OY3...
✓ Verified Purchase
book-length advertisement for web-services
Carr teases us with a fascinating explanation of the development of electricity-as-service in the 19th century, then examines the consequences of the grid in the 20th century with hasty, shallow criticism. The rest of the book is a patchwork of web 2.0 anecdotes and borrowed predictions.
The book is worth reading if you don't know much about computer science or computer commerce, and wonder what all the hubbub is about. Don't let it be the last word you consider on the subject -- this is a seductive book, but don't expect it to stick around for breakfast.
3 people found this helpful
★★★★★
5.0
AEVIUOXNKQFXQX4SDTBZ...
✓ Verified Purchase
A Fascinating Look at What Lies Ahead for Us in Technology
What do electricity and computing have to do with each other, besides the fact that you need electricity to make your computer work? In "The Big Switch" you are taken on a fascinating journey showing how computing is following much the same path as electricity did when it was first rolled out to the masses. Whereas there was a time when electricity was generated on-site where it was needed, the same can be said of computing power today -- we have our own IT departments and computer installations. In the future, the author argues, much of this will move to the "cloud", where our ideas of how we use computers will change radically. In the video accompanying this review I walk you through some of the concepts of cloud computing and introduce you to this fascinating book, which is sure to have IT geeks and non-geeks alike reading to find out what lies ahead for us in the not-too-distant future.
2 people found this helpful
★★★★
4.0
AFXXTYAQ7LBDSADLEQI7...
✓ Verified Purchase
Really two books in one
For those who know or care about the infrastructure undergirding our technology revolution, this is a must-read book. The thesis is simple: we're at a tipping point where "utility computing" will quickly replace in-house data centers. It sounds simple, but the implications are not. The first half of the book lays out and describes the revolution, sometimes in breathless terms. The second half is much darker, however, detailing projected consequences. The author points out that a number of popular websites these days have nearly zero staff--the content comes from users and the infrastructure is rented utility computing from the likes of Google and Amazon. This means that huge online businesses do not translate to employment. In the past, when industries, such as electrical utilities, have undergone major transformation, people lost jobs, but new jobs were available using different skills. The author has a gloomy outlook here: the lost jobs may not be replaced. I suspect the real outcome will be a bit better. People are inventive and new technologies (perhaps not electronic) will need people. Overall, a great book, but I do think the second half is rather darker than it needs to be.