
21 posts categorized "Books"

June 04, 2013

A Review of "Big Data: A Revolution that Will Transform How We Live, Work, and Think"

The topic of big data may sound a bit dry and technical, but Viktor Mayer-Schönberger and Kenneth Cukier, in their book entitled Big Data: A Revolution that Will Transform How We Live, Work, and Think, bring it to life with clear explanations, historical references, and fascinating examples. To explain just how big "big data" is, they write:

"If it were all printed in books, they would cover the entire surface of the United States some 52 layers thick. If it were placed on CD-ROMs and stacked up, they would stretch to the moon in five separate piles."

There are numerous critics who hate the label "big data" because it's a relative term and ill-defined. Mayer-Schönberger and Cukier simplify the debate by writing:

"One way to think about the issue today -- and the way we do in the book -- is this: big data refers to things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and governments, and more."

The topic of big data includes everything from the collection of data (which comes from everywhere), to the computing power needed to analyze that data, to the software that has been developed to make the analysis possible. When most people write about big data and its value, what they are generally talking about are the analytical insights that can be drawn from very large sets of data. As Mayer-Schönberger and Cukier note, "The real revolution is not in the machines that calculate data but in data itself and how we use it."

They point out that historically mankind has progressed as our access to data has increased. As mankind's ability to gather and store knowledge has improved, what we have been able to do with that data has also improved. "It's the same with big data," write Mayer-Schönberger and Cukier, "by changing the amount, we change the essence." Part of that "essence" will be the ubiquity of computers and networks used to augment human judgment. They write:

"Big data is all about seeing and understanding the relations within and among pieces of information, that until recently, we struggled to fully grasp. ... Big data is about three major shifts of mindset that are interlinked and hence reinforce one another. The first is the ability to analyze vast amounts of data about a topic rather than be forced to settle for smaller sets. The second is a willingness to embrace data's real-world messiness rather than privilege exactitude. The third is a growing respect for correlations rather than a continuing quest for elusive causality."

One example the authors give of an insight gained through the analysis of big data involves Walmart. Analysis showed that the sales of Pop-Tarts increased dramatically at stores in areas predicted to be hit by a hurricane. As a result, store managers were directed to place displays of Pop-Tarts near the entrance of the store when hurricanes were forecast. It makes sense, of course, that when faced with a potential natural disaster, people would want to stockpile a food source that, in order to eat, doesn't require preparation or electricity and comes in a waterproof pouch. But, as Mayer-Schönberger and Cukier note, it really doesn't matter why that relationship exists; for a retailer like Walmart to provide its customers what they want and, at the same time, increase its bottom line, it's the relationship, not the cause, that is important. That's the beauty of big data analytics.
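
Walmart's actual analysis isn't public, but the underlying idea -- flagging products whose sales move with an outside signal without asking why -- is easy to sketch. The snippet below is a minimal illustration with invented numbers (it needs Python 3.10+ for statistics.correlation); it is not Walmart's method or data.

```python
# Minimal sketch (invented data, not Walmart's): flag products whose weekly
# sales correlate strongly with hurricane-warning periods, without modeling why.
from statistics import correlation  # available in Python 3.10+

# 1 = hurricane warning in effect for the store's region that week, 0 = not
hurricane_warning = [0, 0, 1, 0, 1, 1, 0, 0, 1, 0]

weekly_unit_sales = {
    "pop_tarts":      [120, 130, 410, 125, 395, 430, 118, 122, 405, 127],
    "flashlights":    [60, 65, 240, 70, 230, 250, 66, 62, 245, 64],
    "greeting_cards": [80, 78, 75, 82, 79, 77, 81, 80, 76, 83],
}

for product, sales in weekly_unit_sales.items():
    r = correlation(hurricane_warning, sales)   # Pearson correlation coefficient
    if r > 0.8:  # arbitrary illustrative threshold
        print(f"{product}: r = {r:.2f} -> consider a storm-prep display")
    else:
        print(f"{product}: r = {r:.2f} -> no action")
```

The point of the sketch is the one the authors make: the decision rule never asks why the correlation exists, only whether it is strong enough to act on.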

The authors note that even though big data has been a hot topic for a couple of years, "In some ways, we haven't yet fully appreciated our new freedom to collect and use larger pools of data." They tell an intriguing tale about statistics and analysis that begins three centuries ago and leads up to how today's big data systems are able to do things like sequence DNA. They write:

"Sampling is an outgrowth of an era of information-processing constraints, when people were measuring the world but lacked the tools to analyze what they had collected. ... The concept of sampling no longer makes as much sense when we can harness large amounts of data. ... So we'll frequently be okay to toss aside the shortcut of random sampling and aim for more comprehensive data instead. Doing so requires ample processing and storage power and cutting-edge tools to analyze it all. It also requires easy and affordable ways to collect the data. In the past, each one of these was an expensive conundrum. But now the cost and complexity of all these pieces of the puzzle have declined dramatically. What was previously the purview of just the biggest companies is now possible for most. Using all the data makes it possible to spot connections that are otherwise cloaked in the vastness of information."

Mayer-Schönberger's and Cukier's book has received numerous reviews over the past several months -- most of them recommending the book as a good read. Some reviewers, apparently, believe that Mayer-Schönberger and Cukier are cheerleaders for big data. For example, Kirkus Reviews' assessment of the book (one of the more enthusiastic you will read) states, "Plenty of books extol the technical marvels of our information society, but this is an original analysis of the information itself—trillions of searches, calls, clicks, queries and purchases. ... A fascinating, enthusiastic view of the possibilities of vast computer correlations and the entrepreneurs who are taking advantage of them." ["Big Data," Kirkus Reviews, 17 February 2013] When Gil Press asked them if they were cheerleaders for big data, Cukier quickly remarked, "We are messengers of big data, not its evangelists." Mayer-Schönberger added, "The reviewer did not read the book." ["What's to be Done about Big Data?" Forbes, 11 March 2013] Maybe the reviewer only read the beginning of the book. Most of the worrying aspects surrounding the topic of big data are found near the end of the book. As Evgeny Morozov notes in his review of the book, "Fortunately, 'Big Data' isn't just another cyber-utopian tome, and the final section of the book offers a critical look at some of the darker effects of recording and analyzing everything." ["When More Trumps Better," Wall Street Journal, 8 March 2013]

Press calls the book "an excellent introduction for general audiences." I agree with that assessment. He added, "The most important part of the book is the authors' discussion of potential risks and possible ways to address them, providing a launch-pad to a much-needed conversation regarding what’s to be done about big data." Another reviewer, Hiawatha Bray, agrees with Press that Mayer-Schönberger and Cukier recognize that big data has challenges and the potential for misuse. In his review of the book, he writes, "To their credit, the authors are well aware of technology’s relentless erosion of privacy. Even if you strip names and addresses from a database, it’s possible to identify individuals by analyzing enough of the websites they visit or the Google searches they run. ... Mayer-Schönberger and Cukier offer up some sensible suggestions on how we can have the blessings of big data and our freedoms, too. Just as well; their lively book leaves no doubt that big data’s growth spurt is just beginning." ["‘Big Data’ by Mayer-Schönberger and Cukier," The Boston Globe, 5 March 2013]

Big data is an important topic that is only going to grow in importance in the years ahead. Most pundits believe that we are in the infancy stage of big data and that, as it matures, the uses to which it can be put might surprise us all. That's why I agree with the authors that big data will transform how we live, work, and think. If you want to gain a good basic understanding of the subject, Mayer-Schönberger's and Cukier's book is a good place to start.

April 03, 2013

Big Data is Transforming Your Life

A recent book entitled Big Data: A Revolution That Will Transform How We Live, Work, and Think has been getting a lot of press. The book was written by Viktor Mayer-Schönberger, a respected Internet governance theorist, and Kenneth Cukier, a long-time technology journalist who's been with The Economist for many years. In a review of the book, Gil Press writes:

"Big Data is an excellent introduction for general audiences to what has become a topic of conversation everywhere, faster than any other technology-driven buzzword in recent memory. To those who may react to 'big data' as today's incarnation of 'big brother,' Mayer-Schönberger and Cukier offer a comprehensive and highly readable overview of the benefits and risks associated with big data, which they define as 'the ability of society to harness information in novel ways to produce useful insights or goods and services of significant value.' ["What's to be Done about Big Data?" Forbes, 11 March 2013]

Press actually spoke with the authors, who "reacted sharply," when he "asked them if they are cheerleaders for big data, as one reviewer implied. 'We are messengers of big data, not its evangelists,' said Cukier. Added Mayer-Schönberger: 'The reviewer did not read the book.'" Clearly, Cukier and Mayer-Schönberger aren't cheerleaders because cheerleaders never point out the problems and weak spots associated with the teams they support. Cukier and Mayer-Schönberger, on the other hand, do discuss challenges associated with big data. Press insists it is "the most important part of the book." The reason for that, he explains, is that by pointing out both the benefits and risks associated with big data the book provides "a launch-pad to a much-needed conversation regarding what's to be done about big data." Cory Doctorow agrees with Press that "issues of governance, regulation, and public policy" represent "some of the most interesting material in the book." In fact, Doctorow believes that the discussion "probably needs to be expanded into its own volume." ["Big Data: A Revolution That Will Transform How We Live, Work, and Think," boing boing, 8 March 2013]

For companies involved with big data, the greatest challenges will involve privacy. At least, those will be the challenges that create the most public reaction. "Mayer-Schönberger and Cukier point out that perfect anonymization is impossible in the age of big data," writes Press, but "they are concerned about two other, less-discussed, risks. One is what they call 'propensity' or using big data predictions to punish people even before they acted. ... The possibility that our fascination with data may become a dangerous addiction is the third risk the authors of Big Data discuss, what they call 'the dictatorship of data.' The potential for abuse of data by people with bad intentions and misuse by blindly admiring people with good intentions is as big as the data itself." Doctorow writes:

"We know that computers make mistakes, but when we combine the understandable enthusiasm for Big Data's remarkable, counterintuitive recommendations with the mysterious and oracular nature of the algorithms that produce those conclusions, then we're taking on a huge risk when we put these algorithms in charge of anything that matters."

Despite all the relevant concerns associated with big data, Press writes, "Mayer-Schönberger and Cukier make it clear in the book that big data does not equal the rise of the machines. Big data, they say, is not about trying to 'teach' a computer to 'think' like [a] human. Nor does it foretell the 'end of theory' or abandoning making and testing hypotheses, the bedrock of scientific progress for centuries. 'In the world of big data,' Mayer-Schönberger and Cukier say, 'it is our most human traits that will need to be fostered — our creativity, intuition, and intellectual ambition.'" But, as the subtitle of their book declares, big data is transforming how we live, work, and think. The following infographic from Intel aptly illustrates why big data is so big.

Neither big data nor its analysis is going away. That genie can never be put back in the bottle -- nor should it be. The benefits of big data analysis are simply too great. Nevertheless, Mayer-Schönberger and Cukier are wise in pointing out potential pitfalls that lie in the road ahead. Irving Wladawsky-Berger, like Mayer-Schönberger and Cukier, believes that big data can transform the world. ["Reinventing Society in the Wake of Big Data," Wall Street Journal, 22 March 2013] Wladawsky-Berger cites MIT Media Lab Professor Alex "Sandy" Pentland, a big data pioneer:

"'This is the first time in human history that we have the ability to see enough about ourselves that we can hope to actually build social systems that work qualitatively better than the systems we've always had,' says Pentland. 'That’s a remarkable change. It's like the phase transition that happened when writing was developed or when education became ubiquitous, or perhaps when people began being tied together via the Internet. ... I believe that the power of Big Data is that it's information about people's behavior - it's about customers, employees, and prospects for your new business, ...' he says. 'This Big Data comes from location data from your cell phone and transaction data about the things you buy with your credit card. It's the little data breadcrumbs that you leave behind you as you move around in the world. What those breadcrumbs tell is the story of your life. It tells what you've chosen to do. ... Who you actually are is determined by where you spend time, and which things you buy. Big data is increasingly about real behavior, and by analyzing this sort of data, scientists can tell an enormous amount about you. They can tell whether you are the sort of person who will pay back loans. They can tell you if you're likely to get diabetes.'"

That kind of analysis can obviously be very transformative. Like most other pundits who discuss big data, Wladawsky-Berger is concerned about privacy issues. He writes:

"For big data to realize its potential requires access to vast amounts of personal information, which leads to very serious issues about privacy, data ownership and data control. Pentland strongly advocates that individuals should have the final say about the use of the data collected about them, including the ability to put the data in circulation and turn it into a personal asset by giving permission to share it for value in return. He has been working closely with the World Economic Forum (WEF) to help develop the proper guidelines for the collection and use of personal data in collaboration with private companies, government representatives, end user privacy and rights groups, academics and others."

All of the benefits and challenges of big data come from analysis. Data that sits fallow in a database isn't really the issue. "The value comes from what you do with it," writes Barry Devlin, "not how big it happens to be." ["Big Analytics Rather Than Big Data," SmartData Collective, 7 February 2013] When most pundits write about big data, what they are really commenting on is the analysis that flows from it. For example, Maria Deutscher writes, "There are so many ways Big Data will impact our lives, from the economy to education." It's clear she is talking about analysis when she explains that "organizations are leveraging analytics to generate business value and, in the case of the Rio de Janeiro municipality, to raise the living standards of over 6 million people." ["Changing the World: Big Data Deep Dive Part 1," SiliconANGLE, 25 February 2013]

The debate about big data is just getting started. Like any large, amorphous concept, big data will be viewed from any number of different perspectives and proponents of those perspectives will support their views using different arguments. It would be unwise to ignore those who hold different perspectives simply because you don't believe in the arguments they make. Most arguments hold a kernel of truth and, if we are to gain the maximum benefit from the analysis of big data, we need to be aware of all of the potential risks associated with that analysis as well as the benefits it can provide.

March 19, 2013

The Importance of Location in Targeted Marketing

Walter Loeb recently reported that Macy's has done well merging its retail and eCommerce channels using a strategy it calls "MOM" (My Macy's localization, Omnichannel, and MAGIC selling strategy). Loeb reports that the strategy has resulted in more satisfied customers and increased sales. ["Macy's Loves MOM, And Consumers Do Too," Forbes, 27 February 2013] Loeb briefly explains the MOM strategy:

"For more than three years the company has worked on specific localization – 'My Macy's – which focuses on having each store feature merchandise that is relevant to customers who live and shop in that area. ... The customer has responded enthusiastically to the My Macy’s strategy. The Omnichannel strategy allows for extraordinary service to customers. In addition to warehouse fulfillment of purchases from in-store, on-line or mail order customers, Macy’s now has 292 stores that participate in the fulfillment of orders – to insure that the customer receives better service as purchases are delivered quicker than ever before. By the end of the year management expects to have about 500 stores participating in this program. Magic selling is the third key initiative. Through new training tools, associates are taught how to be more engaged with the customers and how to be more empowered so that they can make decisions on the selling floor. It is an important step ensuring more caring and responsive customer service."

Although every letter in MOM represents an important part of the strategy, in this post I'll focus on just one aspect of that strategy -- tailoring products by location. According to Loeb, Macy's uses big data to understand differences between customers in different locations. He writes, "Whether it is Latino or Asian customers, the selection of fashion merchandise, as well as the taste level and sizing of the clothing, is often very different from store to store, and requires a trained buying staff. In some cases this localization program requires special advertising and displays, and there is much less of a cookie-cutter approach to buying across the company." As noted above, both Macy's and its customers have been happy with the results.

A blogger, who calls herself Beans and writes for ContentsEqualMoney, believes that location is going to drive one of the next big things in advertising. She asks, "What if you could advertise to potential customers based on their vicinity to your product or service?" You don't have to imagine such a scenario, she writes, because "this is one of those 'the future is now' moments." ["Location-Based Advertising: Data-Driven Marketing for 2013," CEM Blog, 15 February 2013] Location-tailored, in-store merchandise and location-based, targeted advertising are obviously highly complementary strategies. Beans asserts, "You can combine analytics and principles of personalization with mobile location services to get in on what might be the next transformative trend in mobile marketing." She continues:

"I discussed the recent convergence of technology with overwhelming amounts of information in my recent post on big data. We know that businesses have access to more and more demographic data and huge amounts of analytics, and that the sheer volume can sometimes lead to decision-making paralysis. The same is, in many respects, true for consumers, too. How many ads do you see in a day? How many product reviews do you check out before making a purchase or using a service? So many it's hard to keep track of them, right? Search Engine Watch recently predicted that one of the biggest challenges for 2013 is going to be breaking through all of that growing noise to make your business' voice heard over everyone else's. They also pointed out that consumer purchases continue to involve mobile devices at higher percentages with each new survey, from the research to the buying phase. So, the question is, how do you project the most salient image of your business while integrating mobile technology?"

The answer to breaking through the noise, she asserts, involves making your advertising more mobile, personal, and local. "Those three words," she writes, "hold one of the keys to maximizing your ROI in 2013." She explains:

"The 'mobile' is obvious, of course – ... the mobile market is booming both in terms of usage and consumers using various mobile devices to make purchases. But if you pair it with 'personal' and 'local,' you can ride the mobile reinvention train to a huge boost in sales. A recent Clickz article discussed the concept of 'personalization' in recent years, as we've shifted to integrate multiple devices and huge amounts of technology into our lives. The author suggested that recently, the idea of personalizing a consumer experience has started to shift from the 'if you liked this book, you’ll probably like this one' that we're all familiar with from sites like Amazon, to a model of a 'user experience' that creates a unique system for each individual. This system crosses over domains from one device to another, and in some ways even from the physical world to the digital and vice versa. And therein enters the 'local' aspect. Mobile technology like smartphones and to some extent tablets have already bridged the divide between the physical and digital worlds; we have the internet with us everywhere, safely in our pockets. For businesses, the trick is to now make a personalized digital experience based on a physical location."

The most personalized experience you can offer a customer is making them feel like they are your only customer -- a market of one. For more on that topic, read my post entitled Marketing to the Individual. Beans writes, "Conceptually, location-based advertising sounds great." Do you sense a "BUT" coming on? There is one, but it's not as deflating as you might think. Beans explains:

"Let's say you run a brick-and-mortar store. Your customer gets an alert for a sale at your store, or they receive an awesome coupon that will expire in half an hour, when they come within a mile of your physical location. They throw on the brakes, pay you a visit, make a purchase, and your location-based ad campaign is a resounding success. [A] Forbes’ article on whether or not location-based advertising is the future of mobile marketing complicates the matter a little bit. For the most part, consumers apparently will not stop what they're doing just because they get an alert on their phone saying that they're near one of their favorite stores. The vast majority of purchases are planned, say the authors, not impulsive, so you're not likely to catch that many customers who weren't already thinking about buying from you. And of course, there are myriad privacy concerns when it comes to tracking potential customers' locations. However, location-based advertising remains a very attractive next step in mobile advertising, and it’s already happening. The recent launch of Google Now offers a function to alert Android users to nearby attractions, events, and even photo opportunities. JiWire, a company that specializes in location-based ads, has already brought this kind of targeted advertising to multiple airport hubs in the United States, using WiFi hotspot locations rather than phone-based location services to advertise for nearby businesses."

Given that shoppers are unlikely to stop what they're doing to respond to an advertising alert, it would appear that location-based advertising may best be used in a mall-like setting (i.e., when consumers are already involved in a shopping environment). Nevertheless, Beans insists, "Now is the time to adopt a location-based advertising strategy. The key here is to make sure that your ads are not only relevant to your audience, but are hard-hitting enough to make potential customers stop in at your business as they're driving or walking past it." She offers "a couple of concrete ways that you can optimize your location-based ad campaign." They are:

  • Use predictive analytics. Physical location is a narrow field, sure, but you want to make sure you’re targeting the customers who really want to buy. Pay attention to behavioral data, and use it in conjunction with location data to direct your efforts to relevant consumers (see the sketch after this list).
  • Make strong, short-term offers. Potential customers are more likely to turn into definite customers if they receive a location-based offer that requires them to act now, and provides an impetus to do so.
  • Optimize local SEO. Even if you decide to forego your own campaign, make sure your web presence carries all the necessary information about your location and hours, so that third party searches like Google Now will alert nearby potential customers to your existence.
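
To make the first bullet concrete, here is a minimal sketch of the decision logic: push a short-term offer only when a shopper is both nearby and, according to a behavioral model, likely to buy. Every name, number, and threshold below is hypothetical, and the propensity score is assumed to come from some existing predictive model.

```python
# Hypothetical sketch: combine a behavioral score with proximity to decide
# whether to push a time-limited, location-based offer.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Shopper:
    user_id: str
    purchase_propensity: float  # 0..1, output of a behavioral model (assumed given)
    lat: float
    lon: float

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

STORE_LAT, STORE_LON = 40.7506, -73.9935   # illustrative store location

def should_send_offer(s: Shopper, radius_km=1.6, min_propensity=0.6):
    # radius_km of 1.6 roughly matches the "within a mile" example above
    near = distance_km(s.lat, s.lon, STORE_LAT, STORE_LON) <= radius_km
    return near and s.purchase_propensity >= min_propensity

shopper = Shopper("u123", purchase_propensity=0.72, lat=40.7527, lon=-73.9899)
if should_send_offer(shopper):
    print("Send 30-minute coupon")   # strong, short-term offer
```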

I would add that you should take a page from Macy's playbook and tailor your inventory to your location. It doesn't matter how close to your store a consumer comes; he or she won't walk in if what you offer does not suit their tastes or preferences. The more tailored the inventory, the more likely it is that a consumer will become a customer. In the Forbes article cited by Beans, Sense Networks CEO David Petersen told its author, Steve Olenski, "Some of the early failures in mobile advertising are due to poor timing and sending irrelevant offers to consumers because they aren’t based on behavior and location data." Sending irrelevant offers to consumers is the opposite of what targeted marketing is all about.

March 12, 2013

Business and Artificial Intelligence

"A Singularity has no business model," writes Science fiction author Bruce Sterling, "no major power group in our society is interested in provoking one, nobody who matters sees any reason to create one, there's no there there. So, as a Pope once remarked, 'Be not afraid.'" ["'The Singularity': There's No There There," Edge, January 2013] Sterling may be correct that there is currently no business model for a sentient computer, but that's not surprising since a sentient computer has to be built. There are, however, numerous artificial intelligence (AI) systems, and they have been welcomed by the business world with open arms. My company, Enterra Solutions, is a good example. Enterra is an applied science and technology firm focusing on the development and application of an advanced artificial intelligence system, the Enterra Cognitive Reasoning Platform™ (“CRP” or the “Platform”). The Platform is generalized in structure to work across markets and disciplines; yet designed to be specifically applied to the challenges and data of individual industries and functional areas. We obviously believe there is a business model for our Platform or we have been wasting a lot of time and money developing it. To be fair, Sterling doesn't doubt that a good case can be made for using AI applications in business. He is simply arguing that limited AI applications are being pursued (rather than Artificial General Intelligence -- the so-called Singularity), because they are good enough for what businesses need right now.

Sterling's remark nevertheless started me thinking about some of the things that are being accomplished today with the help of artificial intelligence -- like getting messy students to clean their rooms. Last year Michael Crider reported that a group of nerds "banded together to create MOTHER, a combination of home automation, basic artificial intelligence, and gentle nagging designed to keep a domicile running at peak efficiency." ["MOTHER artificial intelligence forces nerds to do the chores… or else," Slash Gear, 21 February 2012] He explains:

"The aim is to create an AI suited for a home environment that detect issues and gets its users (i.e., the people living in the home) to fix it. Through an array of digital sensors, MOTHER knows when the trash needs to be taken out, when the door is left unlocked, et cetera. If something isn’t done soon enough, she it can even disable the Internet connection for individual computers. MOTHER can notify users of tasks that need to be completed through a standard computer, phones or email, or stock ticker-like displays. In addition, MOTHER can use video and audio tools to recognize individual users, adjust the lighting, video or audio to their tastes, and generally keep users informed and creeped out at the same time. MOTHER's abilities are technically limitless – since it's all based on open source software, those with the skill, inclination and hardware components can add functions at any time. Some of the more humorous additions already in the project include an instant dubstep command. You can build your own MOTHER (boy, there's a sentence I never thought I'd be writing) by reading through the official Wiki and assembling the right software, sensors, servers and the like. Or you could just invite your mom over and take your lumps. Your choice."

If you don't believe that keeping a domicile clean is worth pursuing with the help of AI, perhaps saving lives better suits your sensibilities. Annual traffic fatalities have been trending downward since the 1980s, but there are still around 30,000 deaths occurring each year. Scientists believe that number can be significantly reduced through the use of vehicle-to-vehicle (V2V) communications and AI. "More advanced versions of the systems can take control of a car to prevent an accident by applying brakes when the driver reacts too slowly to a warning." ["Cars Avoid Crashes By Talking To Each Other," by Joan Lowy, Associated Press, Manufacturing.net, 8 June 2012] Lowy reports that the National Highway Traffic Safety Administration "has been working on the technology for the past decade along with eight automakers: Ford, General Motors, Honda, Hyundai-Kia, Mercedes-Benz, Nissan, Toyota and Volkswagen." Scott Belcher, president of the Intelligent Transportation Society of America, told Lowy, "We think this is really the future of transportation safety, and it's going to make a huge difference in the way we live our lives." Lowy continues:

"The technology is already available, said Rob Strassburger, vice president for safety of the Alliance of Automobile Manufacturers. He said what's needed is for the government to set standards so that all automakers use compatible technology. Some of the safety technologies for V2V are already available in cars, although they tend to be offered primarily on higher-end models. Together, the currently available technologies and the future V2V systems may effectively form a kind of autopilot for the road. Said Strassburger: 'The long-term trajectory for these technologies is the vehicle that drives itself -- the driverless car.'"

Adrian Gonzalez cites some sobering statistics about the number of traffic fatalities that involve distracted drivers, then writes:

"Fortunately, automakers are starting to embed technologies from tomorrow's driverless cars into today's vehicles. For example, check out this collision warning and emergency brake system Volvo Trucks introduced last year. The system, using lasers and sensors, detects when a collision is likely to occur and it alerts the driver with visual and audible alarms. If the driver fails to take action, the system will apply the brakes, first gently and then hard. It's amazing to see how quickly the truck comes to a full stop, even while carrying a 40-ton load." ["Beware, Driverless Cars Are Everywhere!" Logistics Viewpoints, 27 February 2013]

Gonzalez concludes, "Until these systems become standard features in all cars and trucks, we’ll just have to keep our hands on the wheel, eyes on the road, and mind on driving because you never know when one of those 'driverless cars' next to you will suddenly veer into your lane or slam on the brakes in front of you." Researchers in England are also working on an AI system that can be used to control traffic signals. ["Artificial Intelligence Used to Create Traffic Control Systems," by Tiffany Kaiser, Daily Tech, 27 August 2012] Obviously, the automobile industry believes it has found a business case for AI.

Artificial intelligence applications are also finding a home in the agricultural sector. It's common knowledge that insect infestations can devastate crops. Traditionally, baited traps have been used to monitor for destructive insects. Unfortunately, collecting the data is time consuming and the results are delayed. That may change. "Taiwanese scientists are doing it a better way: automating the process with infrared lasers that scan the traps." ["Artificial Intelligence Predicts and Combats Crop-Destroying Fruit Flies," by Colin Lecher, Popular Science, 27 August 2012] Lecher explains how the system works:

"Every time the beam is broken, add one to the tally. That number is radioed to a local station every 30 minutes, where officials can monitor it. The traps are also fitted with weather sensors to help keep track--high levels of humidity or other changes would increase the likelihood of an infestation. So far they've set up 240 traps to regularly monitor the flow of fruit flies. When a trap counts more than 1,024 fruit flies in 10 days, it sets off an alert. But algorithms help it learn what's normal for the area and current weather, letting it adjust to specific situations. By testing that system on data from past traps, they found it could predict an outbreak with 88 percent accuracy."

Global supply chains are another area in which AI applications have found a business model. Steve Banker reports, "ToolsGroup is incorporating machine-learning technology into its demand planning solution. Machine learning, which is a branch of artificial intelligence (AI), uses specially-designed algorithms to generate predictions based on entered data. ToolsGroup is using AI to improve its demand forecasting capabilities." ["Dell Uses Artificial Intelligence at Global Command Centers," Logistics Viewpoints, 19 November 2012] For his post, Banker talked to Owen Panko, the Director of Program and Project Management at Dell’s Global Command Centers. He learned that "Dell has five global command centers, analogous to NASA's mission control center, staffed with about 200 people globally. These control towers use business process management (BPM), AI, and analytics as core tools to support their service delivery supply chain. These command towers provide visibility and process flows to parts, people, call center activity, and their technology resolution experts."
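
ToolsGroup and Dell have not published their models, so as a generic illustration of the idea -- an algorithm learning to predict demand from the data it is fed -- here is a minimal sketch that fits a regression on the three prior weeks of demand. The numbers are invented, and scikit-learn is simply one convenient library choice.

```python
# Generic illustration of machine-learning demand forecasting (not ToolsGroup's
# or Dell's method): predict next week's demand from the three prior weeks.
from sklearn.linear_model import LinearRegression

weekly_demand = [100, 104, 98, 110, 120, 117, 125, 131, 128, 140, 145, 150]

# Supervised dataset: features = the three prior weeks, target = this week.
X = [weekly_demand[i - 3:i] for i in range(3, len(weekly_demand))]
y = weekly_demand[3:]

model = LinearRegression().fit(X, y)
forecast = model.predict([weekly_demand[-3:]])[0]
print(f"Forecast for next week: {forecast:.0f} units")
```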

Other areas in which business cases have been proven for AI systems are healthcare and utilities. In the healthcare sector, AI applications are used to make diagnoses and to detect fraud. There have been a number of articles about how IBM's Watson is now being used to help doctors diagnose diseases. To learn more about how AI is being used to detect fraud, read my post entitled Illuminating (and Eliminating) Fraud using Dark Data. In the utility sector, AI systems are being used to help manage loads on the electrical grid (see, for example, "Palm creator’s brain-mimicking software helps manage the smart grid," by Derrick Harris, Gigaom, 29 January 2013). Obviously, I've only scratched the surface of ways that AI can be used in the business world. Frankly, business people do reflect Sterling's ambivalence towards the development of Artificial General Intelligence, but they are excited about the possibilities being opened by limited AI applications.

November 23, 2012

Ray Kurzweil Heats Up Artificial Intelligence Discussions

Ray Kurzweil is a very bright and entertaining individual. He is also, at times, controversial. His new book, How to Create a Mind, has created a bit of a stir in the artificial intelligence world. His genius isn't being questioned, but some of his conclusions are. Gary Marcus writes, "Ray Kurzweil is, by all accounts, a genius. He holds nineteen honorary doctorates, has founded a half-dozen successful companies, and was a major contributor to the field of artificial intelligence. ... Time magazine recently featured Kurzweil on its cover, and Fortune described him as 'a legendary inventor with a history of mind-blowing ideas.' And now he has a new book, with a subtitle that suggests he has found another such idea: 'How to Create a Mind: The Secret of Human Thought Revealed.'" ["Ray Kurzweil’s Dubious New Theory of Mind," The New Yorker, 15 November 2012] It didn't take long for people to react to Kurzweil's latest ideas. Marcus, a professor of psychology at NYU, obviously has a few concerns about what Kurzweil writes. He continues:

"In the preface to the book Kurzweil argues, with good reason, that 'reverse-engineering the human brain may be regarded as the most important project in the universe.' He then presents a theory he calls 'the pattern recognition theory of mind (PRTM)' which he claims 'describes the basic algorithm of the neocortex (the region of the brain responsible for perception, memory, and critical thinking).' Kurzweil suggests that his conclusions are 'inescapable' and that the principles he espouses can be used 'to vastly extend the power of our own intelligence.' That would be big news. But does the book deliver? Kurzweil’s critics have not always been kind; the biologist PZ Myers once wrote, 'Ray Kurzweil is a genius. One of the greatest hucksters of the age.' Doug Hofstadter, the Pulitzer Prize winning author of 'Gödel, Escher, Bach' has been even harsher, saying once in an interview that 'if you read Ray Kurzweil's books … what I find is that it's a very bizarre mixture of ideas that are solid and good with ideas that are crazy. It's as if you took a lot of very good food and some dog excrement and blended it all up so that you can't possibly figure out what's good or bad.'"

From the headline of Marcus' article, it's clear that he believes there are more feces than findings in Kurzweil's latest book. Marcus's biggest objection is that Kurzweil writes more as a new-age sage than as a neuroscientist. Marcus reports, for example, that Kurzweil begins his discussion by offering "vague gestures toward an unusual kind of neuron called spindle cells, ... but offers no references and very little direct evidence." Marcus also takes umbrage with Kurzweil's belief that the mind is machine-like. He writes:

"Kurzweil returns to the business of explicating and defending his main thesis—according to which the part of the brain that is most associated with reasoning and conscious thought, the neocortex, is seen as a hierarchical set of pattern-recognition devices, in which complex entities are recognized as a statistical function of their constituent parts. Kurzweil illustrates this thesis in the context of a system for reading words. At the lowest level, a set of pattern recognizers search for properties like horizontal lines, diagonal lines, curves, and so forth; at the next level up, a set of pattern recognizers hunt for letters (A, B, C, and so forth) that are built out of conjunctions of lines and curves; and at still a higher level, individual pattern recognizers look for particular words (like APPLE, PEAR, and  so on that are built out of conjunctions of letters). The acronym P.R.T.M., for Pattern Recognition Theory of Mind, is new, but to scientists in the field, the basic idea is significantly less new than Kurzweil's subtitle ('The Secret of Human Thought Revealed') lets on. Anyone who knows the history of A.I. will recognize that the basic theory (and even the diagrams that are used to illustrate it) is very much in the spirit of a textbook model of vision that was introduced in 1980, known as neocognition."

Marcus believes that the inescapability of Kurzweil's concepts is undermined by the fact that Kurzweil didn't bother "to build a computer model that instantiated his theory, and then compare the predictions of the model with real human behavior." Ronald Bailey, the science correspondent for Reason Magazine, isn't as offended by Kurzweil's arguments as Marcus is. He writes a rather favorable review of the book. ["Head in the Cloud," Wall Street Journal, 16 November 2012] Bailey lays out Kurzweil's argument for the pattern recognition theory of mind, as Marcus did, but accepts it much more readily. He writes:

"The insight that brains are built of pattern-recognition modules leads Mr. Kurzweil to argue that it will be possible to design artificial intelligence in the much same way. 'The next step, of course, will be to expand the neocortex itself with its nonbiological equivalent,' he writes. These synthetic neocortexes, he says, will consist of vast parallel sets of pattern-recognition modules. Already by using online resources, Mr. Kurzweil notes, people are migrating their thinking and memories to the computational 'cloud.' And computation is getting ever cheaper and more pervasive; Mr. Kurzweil calculates that, later in this century, 'a thousand dollars worth of computation will be trillions of times more powerful than the human brain.' He foresees our augmenting our biological neocortexes by hooking them up wirelessly to cloud-based synthetic ones. These synthetic add-ons might be composed of trillions of pattern-recognition modules—a sort of transcendent iPhone. By around 2040, Mr. Kurzweil says, we will be able to replicate and upload the entire information content of our brains into the cloud."

Marcus counters, "Does the P.R.T.M. predict anything about human behavior that no other theory has predicted before? Does it give novel insight into any long-standing puzzles in human nature? Kurzweil never tries to find out." He continues:

"Kurzweil compares his theory with the physical structure of the brain, hurling a huge amount of neuroanatomy at the reader, and asserting, without a lot of reflection, that it all fits his theory. A recent paper (more controversial than Kurzweil may have realized) claims that the brain is neatly organized into a kind of three-dimensional grid system. Kurzweil happily takes this as evidence that he was right all along, but the fact that the brain is organized doesn’t mean it is organized as Kurzweil suggests. We already knew that the brain is structured, but the real question is what all that structure does, in technical terms. How do the neural mechanisms in the brain map onto the brain's cognitive mechanisms? Without an understanding of that, Kurzweil's pointers to neuroanatomy serve more as razzle-dazzle than real evidence for his theory."

For his part, Bailey appears to accept Kurzweil's arguments as well as the conclusions he draws from them. He writes:

"Mr. Kurzweil thinks that we will not only augment our own minds but also create conscious, independent artificial intelligences. We will know that they are conscious because they will tell us so in a convincing way. And the super-intelligent machines we create will not replace us. 'This is not an alien invasion from Mars—we are creating these tools to make ourselves smarter,' he says. 'We build these tools to extend our own reach.' Just how far will our reach extend? Exponentially expanding intelligence, Mr. Kurzweil says, will quickly solve minor problems like war, death and material scarcity and then head out to colonize the rest of the universe. He modestly concludes: 'Waking up the universe, and then intelligently deciding its fate by infusing it with our human intelligence in its nonbiological form, is our destiny.' Sounds good to me."

Well it doesn't sound good to Marcus. He writes, "The deepest problem is that Kurzweil wants badly to provide a theory of the mind and not just the brain. Of course, the mind is a product of the brain, as Kurzweil well knows, but any theory that seriously engages with what the mind is has to reckon with human psychology—with human behavior and the mental operations that underlie it. Here, Kurzweil seems completely out of his depth." He explains:

"Not a single cognitive psychologist or study is referred to, and he scarcely engages the phenomena that make the human mind so distinctive. There's no mention, for example, of Daniel Kahneman's Nobel Prize winning work on human irrationality, Chomsky's arguments about innate knowledge that sparked the cognitive revolution, or Elizabeth Spelke's work on cognitive development demonstrating the highly nuanced structure that is present within the mind even from an extremely early age. Similarly absent is any reference to the vast literature on anthropology, and what is and isn't culturally universal."

Marcus concludes that Kurzweil's "secret of human thought" is little more than a generic theory that "has been around since the late nineteen-fifties." He argues that there are too many questions left unanswered for anyone, including Kurzweil, to claim any certitude. He writes:

"What Kurzweil doesn't seem to realize is that a whole slew of machines have been programmed to be hierarchical-pattern recognizers, and none of them works all that well, save for very narrow domains like postal computers that recognize digits in handwritten zip codes. This summer, Google built the largest pattern recognizer of them all, a system running on sixteen thousand processor cores that analyzed ten million YouTube videos and managed to learn, all by itself, to recognize cats and faces—which initially sounds impressive, but only until you realize that in a larger sample (of twenty thousand categories), the system's overall score fell to a dismal 15.8 per cent. The real lesson from Google's 'cat detector' is that, even with the vast expanses of data and computing power available to Google, hierarchical-pattern recognizers still stink. They cannot come close to actually understanding natural language, or anything else for which complex inference is required."

Marcus argues, "The kind of one-size-fits-all principle of hierarchical-pattern learning that Kurzweil advocates doesn't work on its own in artificial intelligence, and it doesn't provide an adequate explanation of the brain, either." Marcus accepts the fact that Kurzweil "knows artificial intelligence," but he asserts that "Kurzweil doesn’t know neuroscience" or "understand psychology." He concludes:

"To truly reverse-engineer the human mind, we may need a real consilience, to borrow a word from the Harvard biologist E. O. Wilson, a coming together of workers in A.I. with researchers who study the human mind from a wide range of perspectives — neuroscientists and cognitive psychologists, and maybe even artists, musicians, and writers, too. The challenge of figuring out how the mind works is too complicated for even the smartest of entrepreneurs to solve on their own."

Having not read the book, I'll leave final judgment of its value to those who have. My suspicion, however, is that the book will be more entertaining than enlightening if all it does is rehash ideas that have been presented previously. I agree with Marcus that we have much yet to learn about the mind (as opposed to the brain). IBM just announced that it had used one of the world's fastest supercomputers to simulate 530 billion neurons and 100 trillion synapses -- roughly the number of synapses in the human brain. That system might help determine whether or not Kurzweil is correct -- but that's a topic for a later date.

November 12, 2012

Aging and Asia: Implications for Marketers and Retailers

In a book entitled 100 Plus, author Sonia Arrison writes, "We are at the cusp of a revolution in medicine and biotechnology that will radically increase not just our life spans but also, and more importantly, our health spans." ["Bioengineering Methuselah," by Nick Schulz, Wall Street Journal, 31 August 2011] Schulz, a fellow at the American Enterprise Institute, reports that Arrison claims this revolution in longevity will "change everything, from careers and relationships to family and faith." If the revolution does occur, it will also change the types of products that will be produced and marketed. In previous posts, I've noted that analysts are already describing how manufacturers and retailers are having to deal with a bifurcated market that involves older and younger consumers. Their tastes, shopping habits, and product choices vary widely. Imagine how differentiated products are going to have to become if a significant percentage of the population lives beyond 100.

Before we go too far down this line of reasoning, however, Schulz reminds us that "it's worthwhile to keep in mind the ecstatic predictions a few years ago of the breakthroughs that would be made possible by human-genome sequencing—and the modest gains that have so far resulted." He observes, "Predictions are easy; science is hard." Schulz concludes his op-ed piece with these thoughts:

"If humans do begin living to 150, then what? If Medicare and Social Security are in trouble now, what happens when they must support multiple generations of retirees? In Ms. Arrison's mind, we'll be living healthy, productive working lives until very near the end. The more pressing concerns, for her, have to do with the strain on natural resources and the added pollution of a swelling world population. Noting that similar worries have been raised whenever technology alters social conditions, Ms. Arrison argues that apocalyptic prophecies are unlikely to be realized. Increasing wealth and mankind's adaptability and ingenuity mean that as new problems emerge, new solutions will be forthcoming. 'In looking at the trends of history,' she says, 'we can see that even when there are downsides to a particular wealth- or health-enhancing technology, the problem is often fixed once the population reaches a point where it feels secure in spending the resources to do so.' Ms. Arrison's sunny outlook is infectious, and surely mankind does have remarkable powers of problem-solving and adaptation. But one can't help wishing, a bit ahead of time, for some wise counsel from one of those 150-year-olds she envisions, who might be able to tell us whether all the effort and all the dreaming were worth it."

Regardless of whether the scientific breakthroughs discussed by Arrison occur, two things are certain. First, scientists "are working furiously to make it possible for human beings to achieve Methuselah-like life spans"; and, second, "the number of people living to advanced old age is already on the rise." ["Living to 100 and Beyond," by Sonia Arrison, Wall Street Journal, 27 August 2012] It is this latter trend that I want to discuss. As my friend Thomas P.M. Barnett, Chief Strategist for Wikistrat, is fond of saying, China may get old before it gets rich. And China is not the only Asian country encountering demographic challenges. In a paper published by the Asian Development Bank, Jayant Menon and Anna Melendez-Nakamura wrote, "Within the next few decades, Asia is poised to become the oldest region in the world; reforming policies and creating new structures and institutions to address this challenge is a huge and complex undertaking that requires a big head-start." ["Aging in Asia: Trends, Impacts and Responses," February 2009] Although Menon's and Melendez-Nakamura's advice was primarily aimed at Asian governments, the advice to get "a big head-start" is just as important for businesses. Like Barnett, Menon and Melendez-Nakamura wonder if populations in emerging market countries in Asia will "go over the hill before getting to the top."

Rapidly aging Asian populations are going to have profound implications for the global economy. The Lex Team at the Financial Times writes, "Step back from the eurozone crisis and falling commodity prices for a minute and consider the role of demographics in investments." ["Asian demographics – grey area," 19 October 2012] The team writes:

"In just five years China’s workforce will start to shrink. By 2030 the median age of Japan's population will be 51 and in China it will be 43, according to Deutsche Bank. By 2050 the share of 65 year olds in the population in most of Asia will more than double. Asia's demographic dividend is almost over and its impact is far reaching. Japan's depleting workforce has been directly correlated with the demise of its economic growth for the past 60 years. That is not to say China's economy will stagnate as its workforce shrinks, but it will hurt labour costs further."

The op-ed piece goes on to note, "Ageing populations are not all bad, though. In Japan it is pensioners who are propping up consumption as they spend their hard-earned savings on things such as package tours, convenience food and medical care." That goes directly to my earlier point that manufacturers and retailers are increasingly going to have to deal with a bifurcated market that involves older and younger consumers. Currently in Asia "new products and services for the elderly" are the focus of many opportunistic companies, and "health-care-related businesses are seeing soaring demand." ["Businesses Focus on Region's Aging Population," by Juro Osawa, Wall Street Journal, 21 August 2012] Osawa reports, "In Japan, companies that previously had little to do with the issue of aging have jumped on the bandwagon." Osawa describes the current demographic situation with regards to aging. He writes:

"Japan's population is the world's grayest, according to a 2009 United Nation report, with nearly 30% aged 60 or older. Other parts of Asia, such as China, Taiwan, Hong Kong, South Korea and Singapore, are also anticipating a surge in the percentage of elderly citizens. In China, people over the age of 60 now account for 13.3% of the country's population of 1.34 billion, up from 10.3% in 2000, according to the National Bureau of Statistics, and the aging trend is expected to accelerate.

Combine those demographic factors with Asia's increasing urbanization and you can easily see that the business landscape for manufacturers and retailers is going to change dramatically. If they are to succeed, they need to understand this landscape in a more defined way. As I wrote in a post entitled Tapping the Economic Power of Mega-Cities: "Emerging Big Data analytic solutions can help highlight ... differences [such as age, cultural preferences, etc.] so that manufacturers and retailers get the right products to the right people in the right amounts. Sometimes the differences between regions (or even cities) are difficult to discern. Analytic techniques are now available that can recognize these differences in a deep, substantive way and on many levels. Targeted marketing will be critical if western companies are going to be successful in emerging markets."

If Sonia Arrison is correct and scientific breakthroughs result in longer life and health spans, companies targeting aging consumers are going to have to look beyond health care products and services to a whole range of products and services that cater to a healthier, active group of older adults. There are still a few years available for planning, but I wouldn't wait too long to start. Big data is going to be very useful in helping understand just how older individuals are changing and the lifestyles they want to live. Baby boomers have proven during the current recession that older consumers can be a lifeline for businesses during hard times. That will be as true in the future as it is today.

April 20, 2012

Big Data and Language

Since Enterra Solutions uses an ontology in most of its solutions, the topic of language is of interest to me both personally and professionally. That's why two recent articles caught my attention. The first article discusses how Big Data is being used to discover how the use of words has changed over time. The second article talks about how some executives are taking courses aimed at making them more literate in the language of IT.

In the first article, Christopher Shea asks, "Can physicists produce insights about language that have eluded linguists and English professors?" ["The New Science of the Birth and Death of Words," Wall Street Journal, 16 March 2012] To answer that question, a team of physicists used Big Data analytics to search for insights from "Google's massive collection of scanned books." The result: "They claim to have identified universal laws governing the birth, life course and death of words." The team reported its findings in an article published in the journal Science. Shea continues:

"The paper marks an advance in a new field dubbed 'Culturomics': the application of data-crunching to subjects typically considered part of the humanities. Last year a group of social scientists and evolutionary theorists, plus the Google Books team, showed off the kinds of things that could be done with Google's data, which include the contents of five-million-plus books, dating back to 1800."

Whether or not you are interested in linguistics, this effort demonstrates how powerful Big Data techniques can be for producing new insights. According to Shea, the team's research "gave the best-yet estimate of the true number of words in English—a million, far more than any dictionary has recorded (the 2002 Webster's Third New International Dictionary has 348,000)." Shea continues:

"More than half of the language, the authors wrote, is 'dark matter' that has evaded standard dictionaries. The paper also tracked word usage through time (each year, for instance, 1% of the world's English-speaking population switches from 'sneaked' to 'snuck'). It also showed that we seem to be putting history behind us more quickly, judging by the speed with which terms fall out of use. References to the year '1880' dropped by half in the 32 years after that date, while the half-life of '1973' was a mere decade."

These findings demonstrate the increasing velocity of new knowledge as well as the importance of storing old knowledge. I'm a fan of history, and Big Data techniques may eventually help us paint a truer, less biased history of the world. I'm also a fan of the future, and I know that Big Data techniques will help us make that future better. Shea continues:

"In the new paper, Alexander Petersen, Joel Tenenbaum and their co-authors looked at the ebb and flow of word usage across various fields. 'All these different words are battling it out against synonyms, variant spellings and related words,' says Mr. Tenenbaum. 'It's an inherently competitive, evolutionary environment.'"

I'm reminded of President Andrew Jackson's quote, "It's a damn poor mind that can think of only one way to spell a word!" He was joined in that sentiment by Mark Twain, who wrote, "I don't give a damn for a man that can only spell a word one way." I suspect those sentiments are also shared by former U.S. Vice President Dan Quayle who once famously "corrected" elementary student William Figueroa's spelling of "potato" to the incorrect "potatoe" at a spelling bee. Shea continues:

"When the scientists analyzed the data, they found striking patterns not just in English but also in Spanish and Hebrew. There has been, the authors say, a 'dramatic shift in the birth rate and death rates of words': Deaths have increased and births have slowed. English continues to grow—the 2011 Culturonomics paper suggested a rate of 8,500 new words a year. The new paper, however, says that the growth rate is slowing. Partly because the language is already so rich, the 'marginal utility' of new words is declining: Existing things are already well described. This led them to a related finding: The words that manage to be born now become more popular than new words used to get, possibly because they describe something genuinely new (think "iPod," "Internet," "Twitter")."

Although the scientists claim that "higher death rates for words ... are largely a matter of homogenization," I wonder if it isn't also a matter of there being more specialized and less generalized education. Shea continues:

"The explorer William Clark (of Lewis & Clark) spelled 'Sioux' 27 different ways in his journals ('Sieoux,' 'Seaux,' 'Souixx,' etc.), and several of those variants would have made it into 19th-century books. Today spell-checking programs and vigilant copy editors choke off such chaotic variety much more quickly, in effect speeding up the natural selection of words."

Of course, spell checkers aren't perfect. An anonymous poet penned the following poem to make that point:

I have a spelling checker
It came with my PC
It plainly marks for my revue
Mistakes I cannot sea
I've run this poem threw it
I'm sure your pleased to no,
It's letter perfect in it's weigh
My checker tolled me sew.

Shea reports that the database analyzed by the scientists "does not include the world of text- and Twitter-speak, so some of the verbal chaos may just have shifted online." He continues:

"Synonyms also fight Darwinian battles. In one chart, the authors document that 'Roentgenogram' was by far the most popular term for 'X-ray' (or 'radiogram,' another contender) for much of the 20th century, but it began a steep decline in 1960 and is now dead. ('Death,' in language, is not as final as with humans: It refers to extreme rarity.) 'Loanmoneys' died circa 1950, killed off by 'loans.' 'Persistency' today is breathing its last, defeated in the race for survival by 'persistence.' The authors even identified a universal 'tipping point' in the life cycle of new words: Roughly 30 to 50 years after their birth, they either enter the long-term lexicon or tumble off a cliff into disuse. The authors suggest that this may be because that stretch of decades marks the point when dictionary makers approve or disapprove new candidates for inclusion. Or perhaps it's generational turnover: Children accept or reject their parents' coinages."

What I found interesting was that the scientists discovered a "similar trajectory of word birth and death across time in three languages." Even so, they concluded that the field "is still too new to evaluate fully." As is normally the case, not everyone agrees with the conclusions reached by the team. Academics love arguing amongst themselves. Shea reports:

"Among the questions raised by critics: Since older books are harder to scan, how much of the word 'death' is simply the disappearance of words garbled by the Google process itself? In the end, words and sentences aren't atoms and molecules, even if they can be fodder for the same formulas."

In our work at Enterra, we understand that every discipline develops its own special lexicon. That's why we work hard to ensure that our ontology understands words in various settings. The IT world is no different when it comes to creating a specialized language that can sound foreign to the technology-challenged. Jonathan Moules reports that some executives are taking courses to help them understand this specialized lexicon. ["Coding as a second language," Financial Times, 28 March 2012] He reports:

"Alliott Cole sees a large number of tech start-ups in his work as principal in the early-stage investment team of private equity firm Octopus. The trouble is that he often struggles to comprehend what those writing the software that underpins those companies are talking about. 'For several years I have worked hard to understand how new infrastructure, products and applications work together to disrupt markets,' he says, explaining why he recently decided to take a course that claims to be able to teach even the most IT-illiterate person how to create a software application, or app, in just a day. 'While [I am] conversant in many of the trends and the – often confusing – array of terminology, it troubled me that I remained an observant passenger rather than an active driver, particularly in the realms of computer programming.' Mr Cole is not alone."

The course taken by Cole was created by three former advertising executives – Steve Henry, Kathryn Parsons and Richard Peters – and Alasdair Blackwell, an award-winning web designer and developer, because they felt there was "a widely felt, but rarely discussed, problem. Tech talk is increasingly commonplace in business and life ... but most people, including senior executives, find the language used by software engineers, social media professionals and the 'digital natives' ... baffling." Moules reports that modern technology is changing many industries, so even well-educated people need an occasional refresher to "revisit the basics of how technology functions." After spending a day taking the course with a handful of executives, Moules reports that they all were "happy to leave [programming] to the experts – but now, at least, they feel more confident of being able to talk the same language."

As the business world enters the age of Big Data, more specialized words are likely to be invented to describe technologies that cannot adequately be captured by the current lexicon. There will also be words made up by marketing departments that will catch on. Only Big Data techniques, especially rule-based ontological analysis, are capable of making the connections and providing the insights that will help us make better decisions in the future -- perhaps even decisions about the words we use.
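To illustrate the kind of problem a context-aware lexicon has to solve -- this is only a toy sketch, not Enterra's ontology or any production system -- consider how the same word can resolve to different concepts depending on the domain in which it appears:

# Toy example only: a tiny domain-sensitive lexicon in which one surface word
# maps to different concepts in different settings.

TOY_LEXICON = {
    "python": {"software": "programming language", "zoology": "large constricting snake"},
    "java": {"software": "programming language", "geography": "Indonesian island",
             "food": "coffee"},
    "cloud": {"software": "remote computing infrastructure", "weather": "visible mass of water vapor"},
}

def resolve(word, domain):
    """Return the domain-specific sense of a word, or None if the toy lexicon
    has no entry for that word in that domain."""
    return TOY_LEXICON.get(word.lower(), {}).get(domain)

print(resolve("Java", "geography"))  # Indonesian island
print(resolve("Java", "software"))   # programming language

A real ontology goes well beyond a lookup table -- it encodes relationships and inference rules -- but the underlying requirement is the same: the meaning of a term depends on the setting in which it is used.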

April 13, 2012

The Importance of Geography and Place

Both mariners and real estate agents have an appreciation and respect for geography, but for very different reasons. Let's begin with the mariners' perspective. Those who go to sea understand that the world remains a very big place and getting from here to there takes time (i.e., you can't load goods on a ship in Amsterdam and deliver them to Shanghai the next day). As maritime carriers adopt super-slow steaming practices (approximately 12 knots, or 12 nautical miles per hour) to save money, distances mean even more. For example, at 12 knots, it takes over 24 days to travel from the Strait of Gibraltar to Singapore taking the short route through the Suez Canal (a distance of 6,953 nautical miles). Taking the longer route around the Cape of Good Hope (a distance of 10,679 nautical miles), it takes over 37 days.
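The arithmetic is simply distance divided by speed; here is a minimal sketch (the helper function is purely illustrative) that reproduces the figures above:

# Transit time at a constant slow-steaming speed: days = distance / speed / 24.

def transit_days(distance_nm, speed_knots=12):
    """Days at sea to cover a distance in nautical miles at a constant speed in knots."""
    return distance_nm / speed_knots / 24

print(round(transit_days(6953), 1))   # Gibraltar to Singapore via Suez: about 24.1 days
print(round(transit_days(10679), 1))  # around the Cape of Good Hope: about 37.1 days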

Real estate agents, on the other hand, aren't so much concerned about distance as they are about location, location, location. Where specifically a home or business is located makes a huge difference in the value of a piece of property and the buildings that are on it. That's why land in Manhattan is worth so much more than land in Mesquite.

Business leaders have to understand and embrace both perceptions of geography if they are to do well. Obviously, they must have a mariner's understanding of distance so they can determine how much inventory must be produced to account for products held in transit as well as how long it is going to take to get goods from factory to shelf. Business leaders are concerned about location as well. They ask themselves where the best location is for stores, factories, suppliers, and so forth. The point of this brief discussion is that despite being fully involved in the so-called information age that connects and flattens the world -- geography still matters. That is also one of the principal messages of the book All Business is Local: Why Place Matters More Than Ever in a Global, Virtual World by John A. Quelch and Katherine E. Jocz. I was provided a copy of Quelch and Jocz's book for review by Tiffany Liao, from Portfolio | Penguin Group USA. In the introduction to the book, the authors write:

"With the rise of our increasingly global and interconnected world, marketers are encouraged to focus on the biggest picture possible -- expanding brands throughout the world to achieve a leading global share. There is nothing inherently wrong in this approach, and the advances in technology have made this more practical than ever before. The danger lies when companies forget the importance of all other types of place. Global is glamorous and strategic, but when marketers focus solely on attaining it, they risk becoming irrelevant."

You can only really appreciate the differences between places by actually visiting them. You can only succeed in certain places by appreciating and adapting to those differences. I first started writing on this topic back in 2006 in a post entitled Wal-Mart, Culture & Resiliency. Wal-Mart learned that the world is not a one-size-fits-all place. Quelch and Jocz put it this way:

"In our view, place determines how consumers interact with a product or brand. From the arrangement of breakfast cereals on supermarket shelves to the ease of navigation and checkout in a digital store, place very powerfully and routinely influences our choice of brands -- or whether we buy anything at all."

Unlike mariners and real estate agents, who deal primarily with real geography, Quelch and Jocz discuss four different kinds of "places." They are: psychological, virtual, geographic, and global. They state, "We will show how place is critical to nearly every marketing planning decision, and why place, now more than ever, cannot be an afterthought." They continue:

"In the past few years, we've been given mixed messages about how the world is arranged. Some scholars and pundits tell us the world is flat while others insist it's spiky. Some politicians and observers tell that cultures and values are converging, while others point to cultural divergences that generate world conflict. Some praise globalization, while others point to its dangers. We're told consumers want to live in a digital cloud but sill value the importance of physical touch. Only one thing is certain: competing trends are pulling multinational firms in all directions at once."

Quelch and Jocz insist that it is precisely because all these trends must be taken into account simultaneously that "place" is now so important. They note that "all of the above thinkers can be right, depending on the context, marketing purpose, and business model." Later on they write:

"We believe the ease with which marketing organizations integrate all aspects of place and move seamlessly back and forth from the local geographical to global, from the physical to the virtual, from the functional to the psychological is a source of competitive advantage."

I have pointed out in several past posts that some supply chain pundits believe that, in order to address the growing complexity of supply chains, companies should consider supply chain segmentation. These discussions often focus on segmenting the various ways that orders are filled in a multi-channel retailing environment. Quelch and Jocz suggest that geography should also be considered. They write, "Geographic segmentation serves as a useful surrogate for consumer lifestyles and provides addressability of marketing communications to where people live and shop." Of course, segmentation has to consider much more than marketing. Quelch and Jocz insist that business leaders must be "intelligently local." They note that Joel Kotkin calls this type of strategy "new localism" and it "is why global brands like Pepsi are spending so much time these days on connecting with local communities."

With all this talk of localism, Quelch and Jocz don't ignore "the equity of global brands," which they call "enormous." They acknowledge that "economies of scale are not to be dismissed." They write, however, that "there are comparatively few cases where a global brand pushing a standardized global product is the optimal marketing strategy." By now it should be clear that Quelch and Jocz focus on getting local consumers to buy the products being offered by multinational corporations. They break their discussion into five easy-to-read chapters (plus a Conclusion) that are filled with anecdotal evidence of the principles they are pushing. They begin their journey in the mind of the consumer (the psychological place "where brand meanings reside and marketing communications are processed"). Many a firm has failed to sell merchandise because it hasn't understood the customer to whom it was trying to sell.

The second chapter talks about "the physical environment" and looks "at the ways in which the physical characteristics of places influence people's wants and needs and how the smart design of physical spaces -- with an emphasis on physical stores -- attracts and motivates consumers to buy." The authors insist that the "physical place still defines most purchasing behavior." The chapter covers everything from the physical place occupied by a store to the physical place where products are displayed. Quelch and Jocz conclude, "Great brands ... find ways to use physical space to optimize positive consumer associations and sales success."

The third chapter builds on the perspectives provided in the first two chapters and applies them to "virtual space." They discuss "both the parallels and the differences between online marketplaces and virtual marketplaces." No discussion of virtual marketplaces would be complete without discussing mobile devices and Big Data, and Quelch and Jocz discuss how these factors "lead to delivery of specialized products keyed to where the customer is located." They note, however, that "there is no escaping physical space even in the virtual world." The virtual world is built on servers with a physical location, and order fulfillment in e-commerce still requires physical warehouses.

In the fourth chapter, Quelch and Jocz get more physical. They discuss "places as things." That is, they talk about how places can be marketed. They write, "Increasingly, places like nations, states, and towns market themselves to attract visitors and investors." They note, however, that "although marketing of places is commonly associated with tourism promotion, it does not stop there. In addition to promoting tourist attractions, countries and cities are competing for foreign direct investment." They conclude that "place and physical context still remain powerful means for commercial marketers to communicate brand attributes."

In the fifth chapter, Quelch and Jocz offer their "perspective on global place." That is, they "focus on the strategies used by companies that are simultaneously marketing at the national and global levels." They point to the slogan of HSBC -- "The World's Local Bank" -- as something that "perfectly captures the dual requirements for companies with international or global aspirations." They conclude, "Succeeding locally and globally requires intimate knowledge of what local customers value, ... how local customers live and navigate in their communities, ... and a clear understanding of alternative strategies for market expansion."

In the concluding chapter, Quelch and Jocz note that "our place-based instincts remain more local than global." I think they are correct. People like to claim some place as home -- their little place in the world. As the authors state, "The sense of place that comes with being part of community -- not the global village but the local village -- remains an important part of our psyche, and, its continuity, a source of comfort and well-being."

Quelch and Jocz believe that each strand of "place" that they discuss is a business thread, and they assert that those threads must be "tightly woven together in businesses poised to grow in the global economy." In the end, Quelch and Jocz admit that places are only important if people populate them. They write, "As the leading fashion designer Sir David Tang sensibly put it: 'It's not the place but the people that count.'"

I agree that people and places can't be separated. Because people go (physically or virtually) to certain places, those places matter. Some of those places can be reached instantly, but companies should never forget that when you start moving real things around the real world it remains a pretty big place.

March 30, 2012

Innovation: The Legacy of Bell Labs

Jon Gertner has received some good press for his recently released book entitled The Idea Factory: Bell Labs and the Great Age of American Innovation. Individuals younger than the Baby Boom generation probably don't understand the luster and prestige that was associated with having a job at Bell Labs. It was home to Nobel laureates and others who possessed some of the world's best minds. In a review of Gertner's book, Bob Metcalfe, who teaches innovation at the University of Texas and is a recipient of the Alexander Graham Bell Medal and the National Medal of Technology, agrees that Bell Labs, "AT&T's well-funded research branch, ... had collected many of America's best scientific minds." ["Where the Future Came From," Wall Street Journal, 16 March 2012] One of those great minds was William Bradford Shockley. Metcalfe writes:

"William Bradford Shockley was difficult, but brilliant. For a long time he led solid-state physics research at Bell Telephone Laboratories, and there, in 1947, he invented something very important. Shockley's colleagues pondered what to call it, voting among six possible names, including 'semiconductor triode' and 'iotatron.' The winner came from shortening 'transconductance varistor' to 'transistor.'"

Metcalfe notes that "Shockley's transistor was a tiny, simple, durable and inexpensive solid sandwich. It would turn out to be the most important invention of the 20th century, the essential building block of the Information Age." He continues:

"All of today's mobile telephones, desktop computers, laptops and server farms—and all the routers of the world-wide Internet—are chock-full of transistors, sometimes billions of them in a single microchip. They enable many modern wonders, including Google, YouTube, Facebook, Twitter, the iPhone and The Wall Street Journal on your iPad."

Shockley, however, was not a lone genius. He was surrounded by a brilliant team at Bell Labs. Metcalfe writes, "Jon Gertner's compelling history of Bell Labs, ... suggests that the transistor was invented not once by the brilliant Shockley, but three times by three other people at Bell Labs, with Shockley supervising, annoying, and taking undue credit." He explains:

"First, John Bardeen and Walter Brattain demonstrated the 'point-contact' transistor on Dec. 23, 1947. These two scientists worked in the research group that Shockley led, but they built their transistor with little help from him. Then Shockley broke with Bell Labs' collaboration norms by separately inventing a second, more reliable, 'junction' transistor while he was holed up for several days in a hotel room. Bell Labs would have preferred that he perform his transistor magic in his office, with the door open. Finally, in 1954, Morris Tanenbaum, not Shockley, would invent the third, 'silicon' transistor (the previous designs were germanium). The vast majority of today's transistors are connected in circuits on silicon wafers. Shockley, Bardeen and Brattain would share the 1956 Nobel Prize in physics, but the increasingly estranged Shockley left Bell Labs. Fatefully, he decided to form a start-up near the place where his mother had home-schooled him, Palo Alto, Calif., laying the groundwork for what would become Silicon Valley."

Metcalfe states that "the tale of Shockley's invention of the transistor (or not) holds many lessons about the nature of collaboration and innovation. It's just one of many great stories told in Mr. Gertner's book, which recounts Bell Labs during its heyday, between the 1920s and 1980s." He continues:

"The lab's scientists did pioneering work not only on the transistor but also, amazingly, on the laser; on solar cells; on satellite, cellular and optical communications; and on the modern management of innovation itself. 'The Idea Factory' is a eulogy for the 'most innovative scientific organization in the world,' now a shadow of its former self, owned by Alcatel-Lucent. The book takes the form of a series of biographies of the men who started, shaped, inhabited and eventually left Bell Labs."

In a preview of his book, Gertner wrote an article explaining why he selected Bell Labs as his subject. He wrote, "Why study Bell Labs? It offers a number of lessons about how our country’s technology companies — and our country’s longstanding innovative edge — actually came about. Yet Bell Labs also presents a more encompassing and ambitious approach to innovation than what prevails today. Its staff worked on the incremental improvements necessary for a complex national communications network while simultaneously thinking far ahead, toward the most revolutionary inventions imaginable." ["True Innovation," New York Times, 25 February 2012] Gertner introduced his article this way:

"'Innovation is what America has always been about,' President Obama remarked in his recent State of the Union address. It's hard to disagree, isn’t it? We live in a world dominated by innovative American companies like Apple, Microsoft, Google and Facebook. And even in the face of a recession, Silicon Valley’s relentless entrepreneurs have continued to churn out start-up companies with outsize, world-changing ambitions. But we idealize America’s present culture of innovation too much. In fact, our trailblazing digital firms may not be the hothouse environments for creativity we might think. I find myself arriving at these doubts after spending five years looking at the innovative process at Bell Labs, the onetime research and development organization of the country's formerly monopolistic telephone company, AT&T."

Today's AT&T is a shadow of the old AT&T. Metcalfe provides some historical perspective:

"The American Telephone & Telegraph Co. evolved from various predecessor companies at the end of the 19th century. In its first few decades, AT&T was, as Mr. Gertner puts it, 'close to a public menace—a ruthless, rapacious, grasping "Bell Octopus."' Attitudes toward the company changed after AT&T was converted in 1921 to a government-mandated 'natural monopoly,' exempted by Congress from antitrust laws. Thanks to 'one of the great public relations campaigns in corporate history,' AT&T came to be known 'Ma Bell.' Ma Bell presented herself as the benevolent source of universal telephone service. She organized herself into three main branches. The AT&T operating companies delivered local and long-distance telephone service. Western Electric was AT&T's sole provider of equipment and thus one of America's largest manufacturing companies. And, after 1925, Bell Telephone Laboratories was the exclusive home of research and development for AT&T's other branches."

Gertner focuses on Bell Labs because his book is primarily about innovation. Metcalfe continues:

"Bell Labs' founding president, Frank Jewett, made it the center of American technology. In the years following World War II, other presidents—including Mervin Kelly, Jim Fisk and William Baker—not only steered AT&T's research but were constantly consulted by presidents from Roosevelt to Reagan, becoming key advisers in the Cold War and space race. Telstar, the first satellite to transmit television and telephone, was developed by AT&T in the early 1960s. Mr. Gertner, besides celebrating forgotten figures and seminal discoveries, wants us to re-evaluate our contemporary assumption that innovation can only be brought about by 'small groups of nimble, profit-seeking entrepreneurs.' Think big, the author urges. 'To consider what occurred at Bell Labs, to glimpse the inner workings of its invisible and now vanished "production lines," is to consider the possibilities of what large human organizations might accomplish.'"

Metcalfe asserts that "Gertner grew up in the glow of Bell Labs headquarters in Murray Hill, N.J." and as a result he "romanticizes the place." Gertner certainly was enamored with Bell Labs' accomplishments. He writes:

"Consider what Bell Labs achieved. For a long stretch of the 20th century, it was the most innovative scientific organization in the world. On any list of its inventions, the most notable is probably the transistor, invented in 1947, which is now the building block of all digital products and contemporary life. ... Bell Labs produced a startling array of other innovations, too. The silicon solar cell, the precursor of all solar-powered devices, was invented there. Two of its researchers were awarded the first patent for a laser, and colleagues built a host of early prototypes. (Every DVD player has a laser, about the size of a grain of rice, akin to the kind invented at Bell Labs.) Bell Labs created and developed the first communications satellites; the theory and development of digital communications; and the first cellular telephone systems. What’s known as the charge-coupled device, or CCD, was created there and now forms the basis for digital photography. Bell Labs also built the first fiber optic cable systems and subsequently created inventions to enable gigabytes of data to zip around the globe. It was no slouch in programming, either. Its computer scientists developed Unix and C, which form the basis for today’s most essential operating systems and computer languages."

It's not too difficult to see why Gertner thinks so highly of Bell Labs, regardless of where he grew up. Metcalfe, however, believes that Gertner's assessment of the Labs' effectiveness is exaggerated. While its accomplishments were myriad, Metcalfe concludes that, measured against the number of employees who produced them, Bell Labs is not a good model for innovation. He writes:

"[Gertner] makes the common mistake of confusing invention with innovation. Mr. Gertner credits Bell Labs with inventing the silicon solar cell in the 1950s. If only they had finished the job. Solar energy remains uneconomic today, more than half a century later—invented but not innovated. Likewise, Bell Labs in the 1960s poured its money and reputation into an early form of videoconferencing, PicturePhone, which flopped when deployed. Mr. Gertner suggests that society would do well to re-create more Bell Labs. But trusting research to corporate monopolies is problematic in two ways. First, their money comes from overcharging customers by using monopoly power. ... Second, a corporate monopoly has little motivation to disrupt a market that it already dominates. AT&T had to be forced, starting in 1968, to let the nascent Internet connect to its telephone network; 'Ma Bell' resisted every step of the way."

Those are points well made. Metcalfe nevertheless concludes that "Gertner's book offers fascinating evidence for those seeking to understand how a society should best invest its research resources." Metcalfe says that rather than invest in corporate R&D the U.S. should consider investing more "in research universities (of which the United States has at least 100)." Before continuing with Metcalfe's arguments, let's finish what Gertner had to say. He wrote:

"So how can we explain how one relatively small group of scientists and engineers, working at Bell Labs in New Jersey over a relatively short span of time, came out with such an astonishing cluster of new technologies and ideas? They invented the future, which is what we now happen to call the present. And it was not by chance or serendipity. They knew something. But what? At Bell Labs, the man most responsible for the culture of creativity was Mervin Kelly. ... Between 1925 and 1959, Mr. Kelly was employed at Bell Labs, rising from researcher to chairman of the board. ... His fundamental belief was that an 'institute of creative technology' like his own needed a 'critical mass' of talented people to foster a busy exchange of ideas. But innovation required much more than that. Mr. Kelly was convinced that physical proximity was everything; phone calls alone wouldn’t do. Quite intentionally, Bell Labs housed thinkers and doers under one roof. Purposefully mixed together on the transistor project were physicists, metallurgists and electrical engineers; side by side were specialists in theory, experimentation and manufacturing. Like an able concert hall conductor, he sought a harmony, and sometimes a tension, between scientific disciplines; between researchers and developers; and between soloists and groups."

I am a proponent of cross-disciplinary innovation. The more perspectives from which you examine a challenge the better the eventual solution. Although you might need a critical mass of such people, Metcalfe seems to be arguing that Bell Labs' mass of scientists and engineers was so far beyond critical that it was inefficient. There was probably a 20/80 rule at work at Bell Labs where 20 percent of the scientists were coming up with 80 percent of the good ideas. But that's just a guess. Gertner pointed out that Kelly had some pretty concrete ideas about how to spur innovation. Gertner wrote:

"One element of his approach was architectural. He personally helped design a building in Murray Hill, N.J., opened in 1941, where everyone would interact with one another. Some of the hallways in the building were designed to be so long that to look down their length was to see the end disappear at a vanishing point. Traveling the hall’s length without encountering a number of acquaintances, problems, diversions and ideas was almost impossible. A physicist on his way to lunch in the cafeteria was like a magnet rolling past iron filings. Another element of the approach was aspirational. Bell Labs was sometimes caricatured as an ivory tower. But it is more aptly described as an ivory tower with a factory downstairs. It was clear to the researchers and engineers there that the ultimate aim of their organization was to transform new knowledge into new things. Steven Chu, secretary of the Department of Energy, won a Nobel Prize in 1997 for his work at Bell Labs in the early 1980s. He once said that working in an environment of applied science like Bell Labs 'doesn’t destroy a kernel of genius, it focuses the mind.' At Bell Labs, even for researchers in pursuit of pure scientific understanding, it was obvious that their work could be used. Still another method Mr. Kelly used to push ahead was organizational. He set up Bell Labs’ satellite facilities in the phone company’s manufacturing plants, so as to help transfer all these new ideas into things. But the exchange was supposed to go both ways, with the engineers learning from the plant workers, too. As manufacturing has increasingly moved out of the United States in the past half century, it has likewise taken with it a whole ecosystem of industrial knowledge. But in the past, this knowledge tended to push Bell Labs toward new innovations."

I don't believe that Metcalfe has a problem with applied science, but I doubt he wants research universities to give up on basic scientific research. I suspect that Metcalfe would also agree with Kelly's open organizational scheme for sharing ideas. After all, Metcalfe trumpets the fact that Shockley was associated with a number of institutions of higher learning including UCLA, Caltech, MIT, Columbia, and Princeton. One thing that Gertner and Metcalfe probably agree on is that some ideas take years to develop. Finding a venue that permits idea incubation is no longer an easy thing to do. Gertner wrote that Bell Labs gave its scientists "lots of time — years to pursue what they felt was essential." He continued:

"One might see this as impossible in today’s faster, more competitive world. Or one might contend it is irrelevant because Bell Labs (unlike today's technology companies) had the luxury of serving a parent organization that had a large and dependable income ensured by its monopoly status. Nobody had to meet benchmarks to help with quarterly earnings; nobody had to rush a product to market before the competition did."

Metcalfe, it would appear, believes that research universities meet many of the criteria that Gertner believes are necessary for creative institutions to possess. Metcalfe asserts that many universities are not well managed; but, he writes, "The saving grace of research universities is that it is their business to graduate students, who have repeatedly proved to be effective vehicles for innovation, especially in their own start-ups." He continues:

"Bell Labs' greatest contribution may have been driving Shockley out. Shockley Semiconductor, his start-up, was a fiasco, but the man was good at recruiting talent, especially Gordon Moore from Caltech and Robert Noyce from MIT. Those two men would later found Fairchild Semiconductor, which over the years spun off Intel, National Semiconductor and Advanced Micro Devices. Those companies, in turn, spun off more start-ups, which grew to become Silicon Valley, the world's pre-eminent innovation machine, clustered around Stanford and the University of California at Berkeley."

Gertner argued, however, that not all innovations (or start-ups) are the same. He concluded:

"One type of innovation creates a handful of jobs and modest revenues; another, the type Mr. Kelly and his colleagues at Bell Labs repeatedly sought, creates millions of jobs and a long-lasting platform for society’s wealth and well-being. ... The teams at Bell Labs that invented the laser, transistor and solar cell were not seeking profits. They were seeking understanding. Yet in the process they created not only new products but entirely new — and lucrative — industries. ... Revolutions happen fast but dawn slowly. To a large extent, we’re still benefiting from risks that were taken, and research that was financed, more than a half century ago."

Gertner and Metcalfe fundamentally agree that innovation, spawned by well-funded research and development, is critical for human and economic progress. They might differ about the size and shape of the organizations that can generate ideas, inventions, and innovations; but they agree that research must continue if the U.S. is going to remain a world leader.

March 23, 2012

Information Age Irony: Books Going Out of Print

"After 244 years," writes Julie Bosman, "the Encyclopaedia Britannica is going out of print." ["After 244 Years, Encyclopaedia Britannica Stops the Presses," New York Times, 14 March 2012] She reports, "In an acknowledgment of the realities of the digital age — and of competition from the Web site Wikipedia — Encyclopaedia Britannica will focus primarily on its online encyclopedias and educational curriculum for schools." Tim Carmody argues, "Wikipedia Didn’t Kill Britannica. Windows Did." [Wired, 14 March 2012] Carmody reminds us that Microsoft developed Encarta and offered to develop "a version of Britannica for PCs in the 1980s, with Windows 1.0. After Britannica turned Microsoft down, Microsoft partnered with Funk & Wagnall's, rebranding it Encarta and focusing on a lean, computer-specific program that could help sell personal computers beyond productivity applications like Word and Excel." Carmody insists, "Encarta is more important to this story than Wikipedia." He explains his reasoning:

"Not because Encarta made Microsoft money (it didn't), or because Britannica didn't develop comparable products for CD-ROM and the web (they totally did, with the first CD-ROM encyclopedia in 1989 and Britannica Online in 1994). Instead, Encarta was an inexpensive, multimedia, not-at-all comprehensive encyclopedia that helped Microsoft sell Windows PCs to families. And once you had a PC in the living room or den where the encyclopedia used to be, it was all over for Mighty Britannica. When Wikipedia emerged five years later, Britannica was already a weakened giant. It wasn't a free and open encyclopedia that defeated its print edition. It was the personal computer itself."

Although I find Carmody's arguments convincing, the popular perspective will remain that Wikipedia wrote the eulogy for the print edition of the Encyclopaedia Britannica. For example, the folks at Open-Site write, "Facing the realities and the stiff competition from Wikipedia, the Encyclopedia Britannica will now focus primarily on their online services. But even then, it might be too late. Wikipedia has grown to be the number one source for students. In fact, many students will stop research and change topics if it's not on Wikipedia." Jen Rhee directed me to the following infographic about Wikipedia, which she helped put together (provided by Open-Site.org).

[Infographic: Wikipedia, provided by Open-Site.org]

For bloggers like me, the fact that Wikipedia is working to improve the accuracy of its content is important. I often turn to Wikipedia for information to include in my posts. Although the death of the print version of the Encyclopaedia Britannica was inevitable, that doesn't mean that it won't be missed. The Wall Street Journal published the following graphic about the "online buzz" surrounding the announcement that Britannica was stopping the presses.

[Graphic: online buzz surrounding the Encyclopaedia Britannica announcement, Wall Street Journal]

At the end of her article, Bosman writes:

"Many librarians say that while they have rapidly shifted money and resources to digital materials, print still has a place. Academic libraries tend to keep many sets of specialized encyclopedias on their shelves, like volumes on Judaica, folklore, music or philosophy, or encyclopedias that are written in foreign languages and unavailable online. At the Portland Public Library in Maine, there are still many encyclopedias that the library orders on a regular basis, sometimes every year, said Sonya Durney, a reference librarian. General-interest encyclopedias are often used by students whose teachers require them to occasionally cite print sources, just to practice using print. 'They're used by anyone who's learning, anyone who's new to the country, older patrons, people who aren't comfortable online,' Ms. Durney said. 'There’s a whole demographic of people who are more comfortable with print.' But many people are discovering that the books have outlived their usefulness."

Carmody concludes, "The primary reason for Britannica to exist as a set of printed volumes was to serve as a household totem. The PC has long since taken that place, armed with Encarta, then Wikipedia and Google, and now the robust information economy of the entire web." Carmody is undoubtedly correct, but you have to admit that a complete set of the Encyclopaedia Britannica still looks handsome filling up space on a bookshelf.