
January 31, 2013

The Future of Big Data, Part 2

In Part 1 of this series, I discussed the views of several pundits who believe that big data has been overhyped, is descending into disillusionment, and should be done away with as a business term. In the end, however, they all agreed that big data analytics was here to stay and would be a differentiator for successful companies in the future. Marketing technologist Scott Brinker writes, "Let's face it: marketing is in a big data bubble." ["The big data bubble in marketing -- but a bigger future," Chief Marketing Technologist, 21 January 2013] He continues:

"That's both a 'big data' bubble and, more generally, a big 'data' bubble. Everyone is talking about data, big data, data analytics, big data analytics. Vendors, analysts, consultants, pundits, bloggers, etc., are all falling all over themselves to squeeze these terms into their propaganda content marketing. As just one example, three out of the Top 10 predictions for CMOs by IDC revolve around data. How CMOs must have a strategy for how market-driven data will contribute to corporate objectives (#1). How CMOs will be in jeopardy if they fail to produce a robust data analytics function (#4). How 50% of all new marketing hires will now have a technical background as the CMO scrambles for data analytics proficiency in the department (#5)."

Brinker agrees with Gartner analyst Svetlana Sicular who believes that the hype about big data is reaching the peak of the Hype Cycle. It's about time, he asserts, that people start paying attention "to the operational implications of actually using data." On that point, he agrees with all of the pundits cited in yesterday's post. One open question, however, is who should be in charge of big data and overseeing its analysis.

Analysts associated with IT believe that the Chief Information Officer (CIO) should have control over big data. According to Brinker, IDC believes "that the CMO [Chief Marketing Officer] will be given full responsibility for 'data analytics' — and, in some unspecified way, required to tie that to business growth." I'm sure there are arguments to be made for Chief Financial Officers (CFOs), Chief Operations Officers (COOs), and Chief Supply Officers (CSOs) as well. One thing that most pundits do agree on is that people proficient in dealing with big data are going to be in great demand in the years ahead. Brinker also insists that people with "data analytics proficiency" will be required to deal with big data, regardless of the department in which the responsibility for it finally resides. "That's an awful lot of data, analytics, and insight," Brinker writes, "but not much committed action beyond observation and fodder for internal meetings with PowerPoint. ... This feels eerily analogous to the dot-com bubble of the late 90's to me (substitute 'analyze data' with 'capture eyeballs')." He notes that during the dot-com bubble irrational exuberance resulted in exaggerated claims that were never realized. He fears the same thing will happen with big data. "The expectations of what big data will deliver on its own," he writes, "especially in the short-term, are massively overblown. Lots of people are throwing lots of money at the promise that big data will, somehow — I don't know how exactly, it's technical and mathy — tame the fractured, fragmented, frenzied landscape that is modern marketing and crunch all of those complications into more customers. All as a well-oiled machine. Regrettably, it's not that simple."

Brinker is certainly correct on that point. The dot-com era was characterized by a lot of start-up companies that had an idea but no business plan. I agree with Brinker that big data is only going to prove useful when a specific plan for its use is in place and that plan results in a measurable return on investment. Brinker points out, "Data is not the same as information, which is not the same as insight." The big data service provider firms that will do well in the future will be those that can turn data into actionable intelligence and provide decision makers with useful insights. Brinker also believes that it's a fallacy to believe that "big data will replace strategic thinking." In an earlier post, I discussed the fact that big data analytics will help decision makers think better rather than replace thinking altogether. In fact, there have been predictions that people will begin to think more, not less, in the years ahead as a result of the insights that big data will provide. Brinker goes into a lengthy discussion about the limitations of data and cites media researcher Yaakov Kimelfeld, who wrote, "Big data is not all of the data." His point is that people will still be required to know what kind of information they need to collect and why. Even systems equipped with good artificially intelligent agents won't have access to all the data in the world. At the end of his discussion, Brinker writes, "You get the point. Big data is not a panacea."

Although it may sound like Brinker has soured on the whole idea of big data, he has not. He points out that the hype behind the dot-com era was based on the fact that something fundamentally revolutionary was taking place. We all know that the world changed as a result of the dot-com era. Brinker believes the same is true for the big data era -- "a truly major revolution is happening." He explains:

"The real data revolution in marketing won't be a sugar-coated miracle pill that anyone can adopt simply by buying some software, hiring a data scientist, and pointing them at a cloud full of data. The real revolution in data will be a change in organizational behavior and culture — and those changes are hard and take time. Many organizations will struggle with the shift, and frankly, many will be usurped by new competitors who grow up natively with this new worldview. So what exactly is the real revolution? It's not data. It's not even data analytics. It's being data-driven. I know, that sounds pointlessly subtle, but bear with me. As Mark Twain would say, this is the difference between the lightening and the lightening bug. Being data-driven is not the same as just embracing data and analytics. Data and analytics are small pieces of a bigger picture. The bigger picture: creating an organization that can confidently experiment, innovate, and adapt on a broad scale. Being truly data-driven embeds this deeply into an organization's culture."

I've written a number of posts about corporate culture as it relates to innovation. In those posts, I've made the same point as Brinker; namely, changing corporate culture is difficult and takes time. He goes on to discuss how "some of the most valuable output from such data analytics will be mere hypotheses — interesting correlations of factors and behaviors in ever-more-finely-sliced customer segments that may have meaningful impact." He calls this going from big data to big testing. You then take the insights learned from that testing and "apply your targeted data and proven tests towards delivering better customer experiences — through the web, mobile devices, call centers, in-store and in-person interactions, etc." This, he claims, leads to the big experience.

[Infographic: big data → big testing → big experience. Source: Scott Brinker]

As I noted in yesterday's post, Enterra Solutions uses a process very similar to the one described by Brinker. We use artificial intelligence to help frame questions. Our Sense, Think/Learn, Act™ system powers a Hypothesis Engine™ that can propose and explore interesting potential relationships it discovers on its own, and test them much more rapidly than humans can potentially iterate. Our belief is that the current reliance on one-by-one human attempts to question an exponentially growing space of data is the main cause of the disillusionment that Brinker and Sicular believe is affecting discussions of big data. We believe that by allowing AI to do the heavy hypothesis lifting, our clients will get the big experience discussed by Brinker.
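Enterra hasn't published the internals of the Hypothesis Engine™, but the general pattern is easy to illustrate. The Python sketch below is only that — an illustration of machine-generated hypothesis testing under assumptions of my own choosing (toy data, pairwise correlations standing in for "hypotheses," a Bonferroni correction); it is not Enterra's actual method.

```python
# Illustrative sketch only: automated generation and testing of simple
# pairwise hypotheses over a data set, with a multiple-testing correction.
# This is NOT Enterra's Hypothesis Engine; it just shows the pattern of
# letting software, rather than a human, enumerate candidate relationships.
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Toy data set: 200 observations of 8 hypothetical marketing metrics.
columns = ["ad_spend", "email_opens", "site_visits", "cart_adds",
           "coupon_use", "returns", "reviews", "repeat_buys"]
data = rng.normal(size=(200, len(columns)))
data[:, 3] += 0.6 * data[:, 2]          # plant one real relationship

# 1. Generate every pairwise hypothesis ("metric i relates to metric j").
hypotheses = list(combinations(range(len(columns)), 2))

# 2. Test each hypothesis.
results = []
for i, j in hypotheses:
    r, p = stats.pearsonr(data[:, i], data[:, j])
    results.append((columns[i], columns[j], r, p))

# 3. Apply a Bonferroni correction so that testing many hypotheses at
#    once does not flood the analyst with false "insights."
alpha = 0.05 / len(hypotheses)
for a, b, r, p in sorted(results, key=lambda t: t[3]):
    if p < alpha:
        print(f"candidate insight: {a} ~ {b} (r={r:.2f}, p={p:.2g})")
```

Even the toy version makes the scaling point: eight metrics already produce 28 candidate hypotheses, and the count grows quadratically with each metric added, which is why one-at-a-time human questioning of the data falls behind.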

Brinker writes that too much of what is written about big data discusses "analysis, not action." He concludes, "It's by connecting big data to big testing and big experience that you turn that analysis into action. That's modern marketing alchemy: transforming lead (data) into gold (better, more profitable customer experiences)." He calls the addition of big testing and big experience to big data the "big leap" that is being made by marketing firms. I believe that what Brinker writes about (i.e., transforming big data lead into big data gold) is worth further discussion, and that's what the final segment of this series will take up.

January 30, 2013

The Future of Big Data, Part 1

Big Data is getting plenty of attention nowadays. Too much attention according to some pundits. In fact, some pundits believe that there is a Big Data bubble that is going to burst. Other pundits believe that Big Data is already descending into what Gartner calls the "trough of disillusionment." Still other pundits believe that the term Big Data should be unceremoniously done away with. Among the latter group is Leena Rao. She writes, "Let's banish the term 'big data.'" ["Why We Need To Kill 'Big Data'," TechCrunch, 5 January 2013] She explains why she feels as she does:

"Why have I grown to hate the words 'big data'? Because I think the term itself is outdated, and consists of an overly general set of words that don't reflect what is actually happening now with data. It's no longer about big data, it's about what you can do with the data. It's about the apps that layer on top of data stored, and insights these apps can provide. And I'm not the only one who has tired of the buzzword. I’ve talked to a number of investors, data experts and entrepreneurs who feel the same way."

I agree with Rao that what's really important is what you can do with the data. Unanalyzed data is as useless as an unread book sitting on a library shelf. Rao reports that the term "big data" was first used by "Francis Diebold of the University of Pennsylvania, who in July 2000 wrote about the term in relation to financial modeling." She believes that a decade is long enough for a term to be used, abused, and retired. The reason that the term still has legs, however, is that the data contains the gold and the amount of data that must be mined to find the gold is getting bigger every day. Data is the sine qua non of everything that follows. Rao doesn't disagree that the data is both big and important. In fact, she writes that it is so important that it "is the key to most product innovation." As a result, she asserts, every company that uses data is a "big data" company, which "doesn't say much about the company at all." She continues:

"According to IBM, big data spans four dimensions: Volume, Velocity, Variety, and Veracity. Nowadays, in the worlds of social networking, e-commerce, and even enterprise data storage, these factors apply across so many sectors. Large data sets are the norm. Big data doesn’t really mean much when there are so many different ways that we are sifting through and using these massive amounts of data. That's not to under-estimate the importance of innovation in cleaning, analyzing and sorting through massive amounts of data. In fact, the future of many industries, including e-commerce and advertising, rests on being able to make sense of the data."

Rao is looking for a new way to describe what is being done with large data sets (she writes, "let's figure out a different way to describe startups that are dealing with large quantities of data"); but, the fact remains that today's data sets are large and that is why the simple descriptor "big" is likely to remain.

Svetlana Sicular believes that there has been so much hype about big data that it "is at the peak of inflated expectations." The only way for those expectations to go is down into Gartner's "trough of disillusionment." ["Big Data is Falling into the Trough of Disillusionment," Gartner, 22 January 2013] If you are not familiar with the Gartner Hype Cycle, read my post entitled Overcoming the Hype: Making Your Supply Chain a Strategic Weapon. Rather than being discouraged about the future of big data, Sicular believes that disillusionment with the subject means that "big data technology is maturing." Like Rao, Sicular understands that the important thing is learning how to unlock the insights that are contained in large data sets. She writes:

"Framing a right question to express a game-changing idea is extremely challenging: first, selecting a question from multiple candidates; second, breaking it down to many sub-questions; and, third, answering even one of them reliably. It is hard. Formulating a right question is always hard, but with big data, it is an order of magnitude harder, because you are blazing the trail (not grazing on the green field)."

At Enterra Solutions we use Artificial Intelligence to help us frame these questions. Our Sense, Think/Learn, Act™ system powers a Hypothesis Engine™ that can propose and explore interesting potential relationships it discovers on its own, and test them much more rapidly than humans can potentially iterate. Our belief is that the current reliance on one-by-one human attempts to question an exponentially growing space of data is the main cause of this disillusionment. These kinds of technologies will help big data climb out of the trough of disillusionment as they continue to mature. In the meantime, Sicular reports, "According to the Gartner Hype Cycle, the next stop for big data is negative press." Not everyone agrees with Sicular that big data is headed into the trough of disillusionment. In fact, Patrick Campbell believes big data is moving from "fad to favor" in 2013. ["Big Data Matters—CIOs Taking Charge!" Enterprise Tech Central, 10 January 2013] Campbell cites a statement by Thomas H. Davenport, a Visiting Professor at Harvard Business School and a Senior Advisor to Deloitte Analytics, that he found enlightening.

“When SAP generates more money from [Business Intelligence] BI and analytics than from its transactional suite, a major transition has taken place. When IBM has spent close to $20 billion on analytics-related acquisitions, it's a permanently changed ball game.”

Campbell agrees with the other analysts cited above that what really matters is what you do with your data. "The price of Big Data and BI analytics—the ROI," he concludes, "all depends on how well you implement your strategies and have access to the tools appropriate for your 'Big Data.'" Ann Grackin, an analyst with ChainLink Research, notes that it is not surprising that big data is capturing a lot of headlines given the fact that so much data is being collected every second of every day. "The problem," she writes, "is that accumulating all this data takes space. And analyzing it takes software. ... The theory is that there are things to learn there — about customers, about markets, about innovation — that can mean bigger opportunities for us." ["Big Data," ChainLink Research, 17 April 2012] When considering how to deal with big data one CIO told Grackin:

"What I care about is source, size, security and sense — that is making sense of it, or analytics. Just because there is data all over the place, I am not sure of where it comes from and if it tells us anything useful about our customers. And size? That is how much money I need in the budget to deal with all the databases the business users want. And security. I don’t want users downloading stuff with malware. My main issue with all these is what's the point? Does all this data matter to us?"

Whether you prefer IBM's four "Vs" (Volume, Velocity, Variety, and Veracity) or the CIO's four "Ss" (Source, Size, Security and Sense), the goal is to make sense of large data sets while ensuring that the information being used is credible. It's important because, as Grackin writes, "The data seems to be piling up."

A database analyst named Tao Lin explained to Grackin, "What matters to users is a small fraction of that data, which is relevant to only him, or her." In some cases, decision makers are only interested in being alerted when something goes wrong. "Some call this exception management," writes Grackin, "but that is really what we are looking for." I agree that management by exception is important; but, it is only one use case for big data analytics. Edward Tufte, whom Grackin calls "the master of envisioning and displaying quantitative data," agrees with Tao that data relevancy is essential for obtaining useful insights. He told Grackin, "People are chasing huge databases, but there is truly only one bit that might be important to know, track, and chart." In other words, they agree with Rao that what is important is what you can do with the data (i.e., data management) not the size of the data set. Unfortunately, size does matter and the larger the data set the more difficult it is to manage and analyze.
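The exception-management pattern Tao Lin and Tufte are describing is simple enough to show in a few lines. Here is a minimal sketch, assuming a rolling mean-and-deviation rule of my own choosing (nothing Grackin or Tufte prescribes): watch the stream, stay silent while it looks normal, and surface only the one reading that matters.

```python
# Minimal management-by-exception sketch: ignore the flood of routine
# data and alert only when a reading strays far from recent history.
# The window size and threshold are illustrative assumptions.
import random
from collections import deque
from statistics import mean, stdev


def exception_monitor(readings, window=30, threshold=4.0):
    """Yield (index, value) for readings more than `threshold` standard
    deviations from the rolling mean of the previous `window` values."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value  # the one bit worth charting, per Tufte
        history.append(value)


# Usage: 1,000 routine readings with a single anomaly buried inside.
random.seed(7)
stream = [random.gauss(100, 2) for _ in range(1000)]
stream[700] = 150  # something goes wrong
print(list(exception_monitor(stream)))  # typically just [(700, 150)]
```

Grackin continues: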

"On the upside, we have noticed a very strong correlation between data management strategies, in general, and improved performance in business. Successful firms such as Amazon, Walmart, Dell, Apple, and many modest-sized organizations ... embrace the value of information as a source of wealth, and not just for what it can tell us about the future. These corporations also find ways to make data actionable. They do not have aimless data collection schemes. Rather, their data collection is application driven, and, therefore, pertinent to managing their business processes. These firms have better cash positions and seem to have been in control of their supply chain due to adherence to data standards and communications technologies ... which allows them to reduce their information cycle times. This, of course, contributes to the management of all that data. Conclusion: Big Data Is Big Business."

That is probably the best five-word description of the future of big data. In the final two segments of this series, I'll look at some thoughts on the future of big data offered by marketing technologist Scott Brinker. He agrees that there may be a big data bubble, but he claims that the future of big data is even bigger.

January 29, 2013

Are Our Innovation Models Wrong?

According to two prominent innovation gurus, companies have pursued innovation using flawed models in the past. Columbia Professor Bill Duggan asserts that "the vast majority of the methods used to try and come up with innovative ideas are based on a flawed understanding of how the brain works." ["99 Percent Of Innovation Methods Are Based On A Brain Model We Rejected A Decade Ago," by Max Nisen, Business Insider, 2 January 2013] And Gary Hamel insists, "We're never going to build a truly innovative company without a gene-replacement therapy." ["Gary Hamel On Innovating Innovation," by Steve Denning, Forbes, 4 December 2012] Duggan's assertion was made during an interview at the Columbia Business School. ["From Intuition to Creation," Ideas at Work, 28 December 2012] During that interview he stated:

"I was surprised to discover that 99 percent of innovation methods that people use today are based on a model of the brain that neuroscientists abandoned more than a decade ago. In essence, these innovation methods tell you to do some kind of research or analysis, and then you brainstorm to come up with your innovation idea. The theory of brainstorming is that you turn off your analytical left brain, turn on your intuitive right brain, and creative ideas pop out. But neuroscience now tells us that there is no right or left side of the brain when it comes to thinking. Creative ideas actually happen in the mind, as the whole brain takes in past elements, then selects and combines them — and that's how creative strategy works."

I remember a cartoon that showed a scientist writing at a blackboard. On the left side he had written all sorts of equations that looked very impressive. He had then drawn an arrow from those equations that pointed to a black box in the middle of the blackboard. Finally, he had drawn another arrow from the black box that pointed to a text box containing the words, "The magic happens." Duggan believes that brainstorming is akin to such black box thinking. He stated, "When it comes to the actual idea for your innovation, these methods leave it to the magic of the creative right side of the brain." I'll talk more about brainstorming later.

In Denning's interview with Hamel, Hamel stated, "A huge amount has been talked about and written on innovation over the last ten years or so. Most of us understand that innovation is enormously important. It's the only insurance against irrelevance. It's the only guarantee of long-term customer loyalty. It's the only strategy for out-performing a dismal economy. So people know that it's important. But people also sense that there's a huge gap between the rhetoric and the reality." That is basically the same message that Duggan is trying to get across. Like Duggan, Hamel doesn't believe that innovation is something that you can conjure up through magic. He states:

"Innovation isn't like something that we'd like to have and so we just go out and get it. Organizations were built to do things that are antithetical to innovation. Organizations were built around principles that deify conformance, control, alignment, discipline and efficiency. The principles that organizations have at their core are antithetical to innovation. It often feels like we're trying to get a dog to stand on its hind legs. You can do that, but you can't get a dog to do that for long. It's a DNA level problem. A dog has the DNA of a quadruped. Once you turn your back on the critter, and you put away the treats, the dog is back on all four legs. Any innovation effort that doesn't start by acknowledging that innovation itself is deeply counter-cultural. We're never going to build a truly innovative company without a gene-replacement therapy. Without that you're going fail. You're going to try something and then be disheartened when you discover that three months or six months later, the dog is still peeing on lamp-posts rather than doing the tango."

Hamel doesn't believe that companies can become innovative overnight. It takes time to create a corporate culture of innovation and to train people in innovation techniques. "You have to train people how to be business innovators," Hamel claims. "If you don't train them, the quality of the ideas that you get in an innovation marketplace is not likely to be high." If you want to have an innovative company, he says, you have to have "training, funding, and accountability for it at every level of the organization. Not one out of a hundred companies has done this." Denning notes that 1 in 100 is a pretty grim statistic. Hamel agrees. He claims the statistic is so bad because innovation takes systematic effort and most companies don't exert that kind of effort. One reason, he believes, is "that leaders still think of innovation in these mystical terms." Such notions, Hamel states, are hogwash. He claims to have "taught thousands of people how to innovate" and that "they have created billions of dollars of market value."

I suspect that Hamel would agree with Duggan that brainstorming is not a very useful tool for generating innovative ideas. It's too simplistic and doesn't really require a company to invest the kind of effort that Hamel believes is necessary. Duggan states:

"Brainstorming works fine when you don't need an innovation. People brainstorm mostly to solve problems they already know how to solve with their current expertise, at least as a group. When you brainstorm, you really throw out ideas from your personal experience — these come to mind fastest and strongest. If you have a problem that the total personal expertise of six people can solve, then brainstorming is very efficient. But if the solution actually lies outside their personal expertise, brainstorming is a trap — you toss out ideas and get conventional wisdom, not an innovation."

Duggan implies that corporate culture is critical if a company is going to be more innovative. "People should cultivate curiosity about how exactly things succeed," he states, "and cultivate 'presence of mind' where they deal calmly with problems and let their minds wander freely rather than look for quick answers because they feel stress." He continues:

"For firms it's harder, because they have myriad procedures in place already that can work against creative strategy. For example, in a typical planning cycle, does the firm start with the question 'Where would we like to innovate if we could?' If not, it's already too late, because everyone has already started down a road of goals, initiatives, timelines, and deadlines. Everyone gets busy; now the only 'creative' time they can spare is two-hour or two-day brainstorming. Instead, a company should do creative strategy well before the start of its planning cycle, so by the time it has to plan, it actually has an innovation to implement. But that requires a company to turn its whole cycle of procedures upside down. That's a tall order for any company."

Catherine Courage, a leader for a Silicon Valley product design group, agrees that corporate culture is critical to becoming more creative. In a TEDx Kyoto talk she offers three suggestions about how to go about changing your corporate culture to become more creative ["TEDx and Creativity: How To Transform Corporate Culture," by Penelope Rivas, volunteermatch.org, 4 January 2013]. The first suggestion deals with environment. Rivas writes:

"According to [Catherine], environment is the, 'foundation of creativity'. Many modern business environments appear plain and simple. You want to create spaces that are conducive to imagination, ideas and original thinking. Take a cue from a child's classroom, filled with a variety of colors, shapes and spaces. Companies like Google and Microsoft have already jumped on the bandwagon by creating work environments that are more colorful and open. How is your current corporate environment encouraging creativity?"

The second of Courage's recommendations deals with experimentation. Rivas writes:

"Experimenting is the key to innovation. For example, Thomas Edison made a thousand failed attempts before inventing the lightbulb. [Catherine] talks about the dangers of creating a culture where employees are afraid to fail. Instead, embrace failure by recognizing its role in success. Encourage your employees to take risks, think outside the box and of course, fail."

Courage's final recommendation deals with storytelling. Rivas explains:

"The art of storytelling can be incredibly influential in the business world. Presentations filled with dull bullet points, lack of emotion and no context can decrease employee engagement. The next time you give a presentation, think about the ways that you can incorporate storytelling. Effective storytelling can increase employee engagement by encouraging conversations which can lead to new, innovative ideas."


Although Courage seems to accept the left brain/right brain paradigm dismissed by Duggan, she agrees with him and Hamel on two points: First, traditional models of innovation need to be revised; and, second, innovative companies have an innovative culture. It's part of their DNA. That doesn't mean, however, that they don't have to continue working at it. Hunting is in a lion's DNA, but that doesn't make finding its next meal much easier.

January 28, 2013

The Benefits of Big Data

Rick Neil from Percepture directed me to a great infographic (shown below) that was created by Neolane (www.neolane.com). The infographic is entitled "Transforming Big Data into Actionable Insight." In my discussions about Big Data, I've persistently pointed out that having access to large data sets doesn't mean anything if you don't have a way of analyzing that data and presenting the results to decision makers in a user-friendly format (i.e., transforming Big Data into actionable insights displayed on an easy-to-use dashboard). The infographic provides an overview of what is encompassed by the term "Big Data" and points out that a lot of firms don't know what to do with it ("60% of marketers don't have or are unsure if their company has a big data strategy").

In previous posts, I've discussed the fact that individuals with the right mathematical skills are going to be in high demand by corporations and marketing firms (see, for example, a post entitled Future Jobs: The Times They are a Changin'). The infographic notes that there will be a shortage of 140,000 to 190,000 of such folks by 2018. I've also discussed the fact that privacy concerns are going to remain the elephant in the room whenever people discuss big data and how it can be used for targeted marketing. The infographic notes that a staggering 81% of marketers are "either somewhat or not very prepared to handle the new rules and regulations of marketing data governance." That includes privacy rules and regulations.

The most important part of the infographic, however, is its description of the benefits of big data. It breaks the benefits into five categories: relevancy, timeliness, context, consistency, and personalization. Companies that offer big data services must master all five of those areas if they are going to add value to the business models of the clients they serve.

[Infographic: "Transforming Big Data into Actionable Insight." Source: Neolane]

For Consumer Packaged Goods (CPG) manufacturers and retailers, I believe that targeted marketing is going to be the most important use of big data. Many analysts believe, however, that big data is going to touch almost every aspect of a business, which is why business leaders need to become familiar with big data and what it means for their company. McKinsey & Company analysts James Manyika, Michael Chui, Brad Brown, Jacques Bughin, Richard Dobbs, Charles Roxburgh, and Angela Hung Byers write, "Leaders in every sector will have to grapple with the implications of big data, not just a few data-oriented managers. The increasing volume and detail of information captured by enterprises, the rise of multimedia, social media, and the Internet of Things will fuel exponential growth in data for the foreseeable future." ["Big data: The next frontier for innovation, competition, and productivity," May 2011] They offer seven key insights about big data. The first, and most obvious, insight is that big data has "swept into every industry and business." The second insight is that "there are five broad ways in which using big data can create value." Those ways of using big data are:

"First, big data can unlock significant value by making information transparent and usable at much higher frequency. Second, as organizations create and store more transactional data in digital form, they can collect more accurate and detailed performance information on everything from product inventories to sick days, and therefore expose variability and boost performance. ... Third, big data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services. Fourth, sophisticated analytics can substantially improve decision-making. Finally, big data can be used to improve the development of the next generation of products and services. "

The third insight identified by the McKinsey analysts is that "the use of big data will become a key basis of competition and growth for individual firms." The fourth insight is that "the use of big data will underpin new waves of productivity growth and consumer surplus." The fifth insight is that not all sectors will benefit equally from big data analytics. The analysts explain:

"While the use of big data will matter across sectors, some sectors are set for greater gains. We compared the historical productivity of sectors in the United States with the potential of these sectors to capture value from big data (using an index that combines several quantitative metrics), and found that the opportunities and challenges vary from sector to sector. The computer and electronic products and information sectors, as well as finance and insurance, and government are poised to gain substantially from the use of big data."

I'm surprised that the utility sector wasn't mentioned. It appears that sector is well on its way to benefiting from big data as well. The sixth insight is one mentioned earlier in the infographic: "There will be a shortage of talent necessary for organizations to take advantage of big data. By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions." The final insight is that policies are going to have to catch up with technologies in order "to capture the full potential of big data." The authors assert, "Policies related to privacy, security, intellectual property, and even liability will need to be addressed in a big data world." Together the infographic and McKinsey's insights provide an excellent overview of big data and why there currently is so much hype about it.

January 25, 2013

Artificial Bee Intelligence

"Honey bees are fascinating creatures," writes Ben Coxworth. "They live harmoniously in large communities, divided into different castes, with some of the worker bees heading out on daily expeditions to gather nectar and pollen from flowers." ["Scientists hope to put artificial bee brains in flying robots," Gizmag, 2 October 2012] Those daily pollen-gathering expeditions have fascinated scientists for years. Bees have evolved the ability to make those trips to and from the hive as efficient as possible and, as Coxworth notes, "A study has suggested that the efficient method in which bees visit those flowers could inspire the improvement of human endeavors such as the building of faster computer networks." As a result, there has been a spate of news stories about how scientists are trying to recreate some of the bee's most beneficial talents. Coxworth describes one such effort. He writes: AI Bee Brain

"Scientists from the Universities of Sheffield and Sussex hope to build a computer model of the honey bee's brain, with the ultimate hope of using it to control tiny autonomous flying robots. The project is called Green Brain – a tip of the hat to IBM’s Blue Brain Project, the aim of which is to create a computer model of the human brain. The Green Brain team, however, aren't actually trying to recreate all of a bee's mental processes. Instead, they're focusing on the systems that control its vision and sense of smell. Also, unlike the Blue Brain scientists, they're not using supercomputers to create their model. In order to get the performance they'll need out of desktop PCs, they are using high-performance GPU (graphics processing unit) accelerators."

Coxworth asks a good question, "Why would anyone want a bee-brained flying robot?" It turns out there are, in fact, good reasons. He explains:

"In the same way that honey bees can sniff out and visually identify flowers, it is hoped that the autonomous robots could be used to trace odors or gases to their sources. Not only could this have applications in fields such as environmental monitoring, but it could also prove useful for things like search-and-rescue operations."

Coxworth also notes that if bee colonies continue to collapse, artificial bees may be needed to pollinate crops. Coxworth's Gizmag colleague, Dario Borghino, reports that researchers at Queen Mary University of London are also studying bees and how they can "fly from flower to flower and then come back to their hives expending the least amount of time and energy." They hope to apply their bee knowledge to the well-known "traveling salesman" problem. ["How bumblebees might get you faster overnight deliveries," 24 September 2012] Borghino notes that the traveling salesman problem is "employed by all courier companies in day-to-day operations to deliver and pick up goods using the least amount of time and fuel while still visiting all the required destinations; electronics manufacturers use it to create better and more powerful microchips; and geneticists use it to sequence DNA." He continues:

"Computationally, this problem is as hard as they come. The most direct, 'brute force' approach to finding a solution would be to try out every possible combination but, in practice, this is unfeasible: with just 20 cities on the itinerary, there would already be 60.8 quadrillion comparisons to carry out – impractical even for today's computers. ... If we want to do better, we need to look elsewhere. ... Whenever researchers are stumped by a problem, looking at nature's solutions is often a good option. ... Now, judging from the recently announced findings, it looks like individual bees might also be able to contribute something to the field. A group of researchers from the Queen Mary, University of London has recently shown that the common bumblebee can solve a problem analogous to the traveling salesman's by using a simple iterative approach that requires no real number-crunching – only a tiny bee's brain."

It turns out that bees use a combination of trial and error and selective memory to discover and remember the shortest path between hive and flower. Borghino writes, "The process is remarkably similar to the widely adopted ant colony optimization algorithms, in which the increased probability of updating the shortest route is substituted by a stronger pheromone trail." He concludes that when bee behavior is modeled appropriately it "could result in an algorithm that is more generalized, much more flexible and well-suited to situations in which the number of resources, their spatial configuration and their reward values are changing over time."
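The scale of the problem, and the appeal of the bees' shortcut, are easy to verify. A brute-force search over 20 cities must compare (20 − 1)!/2 ≈ 6.08 × 10^16 routes — Borghino's "60.8 quadrillion comparisons" — while the bee-style strategy is just a loop: try a random variation of the current route and remember it only if it is shorter. The sketch below is my generic illustration of that trial-and-error-plus-selective-memory idea, not the Queen Mary researchers' actual model.

```python
# Bee-style heuristic for the traveling salesman problem: random trial
# and error plus selective memory of the best route found so far.
import math
import random

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(20)]


def tour_length(order):
    """Total length of the closed route visiting cities in `order`."""
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))


# Brute force would need (20 - 1)!/2 route comparisons:
print(f"{math.factorial(19) // 2:.4g} routes to compare")  # ~6.082e16

best = list(range(20))
random.shuffle(best)
best_len = tour_length(best)

for _ in range(20000):                           # repeated "flights"
    trial = best[:]
    i, j = sorted(random.sample(range(20), 2))
    trial[i:j + 1] = reversed(trial[i:j + 1])    # vary a segment of the route
    trial_len = tour_length(trial)
    if trial_len < best_len:                     # selective memory:
        best, best_len = trial, trial_len        #   keep only improvements

print(f"shortest route found: {best_len:.3f}")
```

Swap the keep-only-improvements rule for a pheromone-weighted choice among candidate routes and you have, in essence, the ant colony optimization approach Borghino compares it to.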

Peter Rothman reports that "a full connectome map of the honey bee brain has been developed." As a result, "Artificial insect minds may be the first true Artificial General Intelligences (AGIs) available for commercial use." ["The Buzz of the Hive Mind: Artificial Bee Brains," hplusmagazine.com, 2 October 2012] Rothman continues:

"Researchers at the Institut fur Biologie-Neurobiologie Berlin, Mercury Computer Systems GmbH, SRI International, and the Zuse-Institut Berlin have developed a three-dimensional average-shape atlas of the Honeybee Brain. This atlas could allow the development of a full robotic insect capable of imitating the full social capabilities of an insect operating in its natural environment. This would open the door to entirely new ways to study insects using a robotic insect to enter and participate in activity in the hive while transmitting information back to human scientists in real-time. Robotic insects could also be used to help endangered biological insect colonies more efficiently locate food sources or offer enhanced protection against predators, disease, etc., for example to rescue specific bee colonies in danger of collapse. An important point to understand here is that the honey bee brain is a general purpose bio-information processor capable of complex spatial perception and recognition of visual and olfactory stimuli in open environments. The bee brain can perform real world pattern recognition and categorization in the presence of noise and interfering or competing signals, it provides goal based guidance and control of a winged flying body, and bees operate in groups demonstrating group goal based behaviors such as directed and optimal search by using social communication and 'swarm' organization. ... The widespread availability of bee based AGI will allow these problems to be solved using this same approach providing better solutions for numerous problems in routing, path planning, and resource allocation."

Harvard University researchers have been working on a robotic bee for the past five years. Evan Ackerman reports:

"Five years is a long time in the fast-paced world of robotics, but when you're trying to design a controllable flying robot that weighs less than one tenth of one gram from scratch, getting it to work properly is a process that often has to wait for technology to catch up to the concept. The RoboBee has been able to take off under its own power for years, but roboticists have only just figured out how to get it to both take off and go where they want it to. Or at least, they're getting very, very close." ["Harvard RoboBees Learn to Steer, Mostly," ieee spectrum, 8 October 2012]

Obviously, there is more than novelty and curiosity spurring the development of robotic bees powered by artificial brains. Anytime, however, that someone raises the possibility of releasing masses of autonomous robots into the real world, someone is going to raise the spectre of catastrophe. Perhaps the next big disaster film will be "The Swarm II: Robot Bees." If you've forgotten the movie entitled "The Swarm," you're probably not alone; it was that forgettable.

Personally, I'm not predicting that a swarm is coming or that artificial insects will represent a threat to mankind. I'll let others, like researchers at the Centre for the Study of Existential Risk (CSER), worry about such things (see my post entitled Artificial Intelligence: Is there Peril in Deep Learning?). If teams creating insect-size robots and insect AGI cooperate, don't be surprised if you one day see an interesting looking bug coming to save your life during a natural disaster.

January 24, 2013

Is U.S. Manufacturing on the Move? -- Part 2

"The sharp slowdown in U.S. manufacturing that began last spring appears to be over," writes Don Lee, "setting the stage for moderate expansion in the factory sector in coming months — with a little boost from companies bringing overseas production back to America." ["Manufacturers group forecasts moderate growth in coming months," Los Angeles Times, 10 January 2013] Lee's optimism stems from the results of a "quarterly survey by the Manufacturers Alliance for Productivity and Innovation, a trade group." He continues:

"The Manufacturers Alliance report found that 17% of companies with manufacturing operations abroad had brought some of that work back to the U.S. in the last two years. Such reshoring — or insourcing, as some people call the return of manufacturing from overseas — has been reported in isolated cases over the last few years, but there's been little in the way of comprehensive data about this phenomenon. The Manufacturers Alliance's survey sample was small — 42 large companies with operations abroad — but its findings nonetheless reveal what looks to be a continuing trend."

Like most other articles on the subject of reshoring, Lee details some of the reasons that companies are opting to return manufacturing to the U.S. They include rising wages in China and increased transportation costs. Lee also reports that supply chain risk management is playing an ever-larger role in such decisions. He explains:

"Apart from labor rates, U.S. companies moving production back also cited increasing shipping costs and, most notably, 'a desire to reduce supply chain uncertainty,' according to the Manufacturers Alliance study. Concerns about supply chain stability grew after Japan's massive earthquake and tsunami in March 2011 knocked out vital parts makers in auto, semiconductor and other electronics industries. Lower U.S. natural gas prices and other energy costs weren't mentioned in the reshoring."

Like the analysts cited in Part 1 of this series, Lee notes that workers shouldn't "count on reshoring to create a whole lot of domestic jobs or investment any time soon." He concludes, "Still, any little bit will help. American manufacturing employment has been trending down for decades amid productivity gains and growing foreign competition." As noted in the first segment of this series, a renaissance of American manufacturing is not guaranteed. In that post, I noted seven steps that Hal Sirkin, a senior partner at Boston Consulting Group, recommends America take to help foster manufacturing's revival. But Sirkin isn't the only analyst who has given the matter some thought. Analysts at The Brookings Institution have also been busy considering the challenge. Mike Cassidy reports, "Brookings has come out with three policy briefs that look not at what is, in terms of manufacturing, but what could be, if the right steps are taken." ["Brookings Institution has some big ideas about helping U.S. manufacturing grow," SiliconBeat, 14 January 2013]

In the first of these policy briefs, Bruce Katz and Peter Hamp state that they believe "a 'race to the shop' competition for advanced manufacturing should be initiated in order to expedite the transition toward a more innovative, productive, inclusive, and globally competitive American economy." ["Create a 'Race to the Shop' Competition for Advanced Manufacturing," January 2013] They go on to state:

"The competition would challenge U.S. states and metropolitan areas to align their policies and investments to meet the distinct labor demands of their primary advanced manufacturing sectors and clusters. Winning applicants would not only receive resources for planning and implementation, but also increased flexibility in the use of existing federal workforce development skills and training funds."

They believe a competition is required because "the United States still lacks a coherent, overarching manufacturing strategy." The competition would require states and metropolitan areas to assess seriously exactly what constitutes their assets and challenges with regard to manufacturing. In past posts, I've noted that not all locations possess the same assets when it comes to establishing industrial clusters. Without a true assessment of what is likely to work in a particular area, states and metropolitan areas are destined to be tossed about in a sea of uncertainty. Brookings' proposal is for "an annual $150 million Race to the Shop competition to reform and modernize federal investments in workforce education and skills training for advanced manufacturing in the United States." Although Brookings has traditionally been considered a liberal think tank associated most closely with the Democratic Party, this proposal should be welcomed by both sides of the political spectrum. It strengthens state and local efforts, seeks targeted programs aimed at reducing unemployment and welfare rolls, and fosters the recovery of the American middle class.

In a second policy paper, Robert D. Atkinson and Stephen Ezell recommend designating "20 institutions of higher education as 'U.S. Manufacturing Universities.'" ["Support the Designation of 20 'U.S. Manufacturing Universities'," January 2013] These federally-designated "Manufacturing Universities" would have to "revamp their engineering programs" by placing "particular emphasis on work that is relevant to manufacturing firms while providing engineering students with real-world work experience." In past posts, I have reported that manufacturers claim that new employees hired directly out of college face a skills gap (for example, see my post entitled Manufacturing and the Skills Gap). In those posts, I've recommended that the kind of public/private partnership being recommended by Brookings be established to address this challenge. I have also recommended that apprenticeships, like those that have traditionally supported German industry, be established. Atkinson and Ezell's recommendation that students be provided with real-world work experience would go a long way towards achieving that goal. In fact, they write that designated universities "would view doctoral training as akin to high-level apprenticeships ... and would not allow the conferral of a Ph.D. unless one has done some work in industry." Atkinson and Ezell conclude that "it is simply impossible to have a vibrant national economy without a globally competitive traded sector." They also recognize how important urban areas are for the future of the country. They note that "over 80 percent of manufacturing jobs and 95 percent of high-tech manufacturing jobs" are located in urban areas. Brookings' analysts believe that the designation of manufacturing universities would help revive manufacturing in urban areas that critically need employment opportunities.

In the final policy paper, Devashree Saha and Mark Muro recommend the U.S. Government build "a national network of advanced industries (AI) innovation hubs, expanding on the modest beginnings now being made through the Department of Energy's Energy Innovation Hubs program and the Department of Commerce's National Network for Manufacturing Innovation (NNMI) initiative." ["Create a Nationwide Network of Advanced Industries Innovation Hubs," January 2013] To learn more about the faltering start of the Energy Innovation Hubs program, read my post entitled Innovation Hubs and Regional Innovation Clusters. Saha and Muro concentrate on "advanced industries" because they claim such industries "punch well above their weight in building and expanding national and regional economic competitiveness." They therefore propose "that Congress authorize the build-out of a national network of advanced industries innovation hubs by funding at least five more Energy Innovation Hubs and supporting the creation ... of at least 20 institutes for advanced manufacturing innovation as proposed in the NNMI initiative." They conclude:

"Such centers will tackle the toughest problems with the biggest commercial pay-offs in technology and process development, technology deployment, and platform establishment. Because they will be regional and intensely collaborative, with strong private-sector participation, the hubs will produce substantial economic spillovers into the regional advanced industry clusters amid which they will be sited."

Taken together, Brookings' analysts believe these policy proposals will go a long way toward ensuring that the U.S. has a robust advanced industries manufacturing future. Each of these policy proposals comes with an initial price tag, but so do all good investments -- and good investments pay off in the end. As Mike Cassidy concludes, "You've got to spend money to make money, as they say. And it turns out manufacturing jobs are particularly good jobs that produce products that can be exported, thereby reducing the nation's trade deficit. Manufacturing also begets innovation as those who make things work both on ways to make better things — and on better ways to make them."

Another analyst who offers some recommendations about manufacturing in America is Stephen Hoover, CEO of Xerox's PARC. He believes "we have an opportunity in the US to create and own the future of manufacturing." He calls it "Manufacturing 2.0." ["How the U.S. Can Reinvent Manufacturing," Techonomy, 8 September 2012] He writes:

"To realize this vision, businesses must start exploring new manufacturing technologies and business models, and US government needs to begin developing coordinated policies to support R&D, public education, and further investment in this new approach to manufacturing. There is great enthusiasm about exciting new developments in manufacturing including 3D printing, robotics, and printed electronics. These are important technologies, but we believe they are elements of a larger, end-to-end change in manufacturing, representing a radical shift from traditional approaches."

By and large, Hoover agrees with the analysts at Brookings. However, he describes a broad vision for manufacturing 2.0. He writes:

"A whole new ecosystem is arising, which will include social design, social funding, flexible and distributed supply chains, and more. This shift will ripple through the industry and likely threaten today's vertically integrated, large-scale manufacturing industry—much as the PC revolution threatened the mainframe computer industry. These democratizing technologies are a tremendous fount of innovation opportunities. As with most disruptive changes, new ways to fund, conceive, design, and build products means we will see entirely new markets develop, with brand new types of jobs originating right here in the US."

Hoover believes that new computer technologies will fuel the revolution in manufacturing; especially, "advances in computational reasoning, decision-making, and control that are quickly reaching human skill levels." He continues:

"A similar advance will soon enable 'intelligent software assistants' to work with human designers to convert design concepts into functional designs that can be manufactured at low cost. These capabilities will empower all kinds of people to design products and leverage complex production value chains. These automated assistants will frontload the design process, so mistakes can be made in the software, rather than in production. We will understand the actual manufacturing process in advance, including what will be made, how components will fit together, and whether the parts will work together safely and correctly and are manufacturable at a reasonable cost. If we get the computational interfaces and reasoning right, there can arise a massive, distributed network of manufacturers able to work together to create a dynamic supply chain for complex products like vehicles, airplanes, and consumer electronics—without needing a central organizing entity. These can be manufactured with traditional production technologies or newer methods such as printed electronics and other additive manufacturing techniques, and then shipped directly to end customers through third-party distribution channels. Such a shift has the potential to dynamically connect all types of manufacturers across the US and other countries to create whole new opportunities, markets, and jobs."

Hoover seems much more optimistic about job creation than most other analysts, even though he doesn't claim that all new jobs will be created in new factories. He concludes, "After years of recession, high unemployment, and fear of America's innovation and scientific downfall, Manufacturing 2.0 is on the horizon." He sees that as a good thing -- a very good thing.

January 23, 2013

Is U.S. Manufacturing on the Move? -- Part 1

Financial Times columnist Sebastian Mallaby asserts that "manufacturers using 'big data' are setting the scene for a revival" in the United States. ["American industry is on the move," 8 January 2013] To back this assertion, he cites General Electric's Jeff Immelt, who "declared that outsourcing was 'mostly outdated as a business model'." As most people are aware, GE is spending over a billion dollars to construct new plants in the U.S. that will build water heaters and washing machines. Mallaby continues:

"President Barack Obama has trumpeted this wave of 'insourcing', while Hal Sirkin of the Boston Consulting Group foretells a US 'manufacturing renaissance'. Even as the news from Washington reeks of heedless brinkmanship, the news from the people who actually make stuff sounds refreshingly hopeful."

Mallaby knows that there is a difference between hope and reality. He asks, "How real is this renaissance?" Since the U.S. "has experienced a steady relative decline" in manufacturing starting in 1980, Mallaby writes, "It is tempting to dismiss it out of hand." He goes on to note, however, that things are beginning to change. For one, "in 2000 US wages were almost 22 times higher than China's. By 2015 that multiple will have declined to four." For another, "between 1996 and 2009 ... American manufacturers piled up [a] productivity gain of 69 per cent." Add to that the fact that "the US joined the North American Free Trade Agreement and the World Trade Organisation, and its continent-sized economy generates plenty of internal competition," as well as "a jolt from technology," and Mallaby sees a foundation for a U.S. manufacturing renaissance. He believes "the more important technological jolt comes under the heading of 'big data'."

He strengthens his arguments by citing a study written by Nick Bloom and John Van Reenen that claims "US companies were, on average, better managed than foreign rivals." He continues:

"A striking conclusion of their study is that US manufacturers continue to get better, particularly when it comes to capturing and analysing data on everything from customer behaviour to production-line efficiencies. And there is plenty of scope to improve further. A minority of survey respondents embraced most state-of-the-art management incentives and monitored performance against clear targets. But a quarter of respondents adopted fewer than half of these practices."

When all of these factors are considered together, Mallaby claims "the stage is at least half set for a US manufacturing revival." He believes the stage is only half set because obstacles remain, including "poor education [and] poor infrastructure."

A manufacturing renaissance might sound great, but Mallaby claims that a manufacturing revival doesn't necessarily equate to improved unemployment statistics. He is not alone in this assessment. A study from the McKinsey Global Institute concludes, "Manufacturing cannot be expected to create mass employment in advanced economies on the scale that it did decades ago." ["American manufacturing is coming back. Manufacturing jobs aren't," by Neil Irwin, Washington Post, 19 November 2012] The reason for this counterintuitive conclusion is that the use of robots is on the rise. As Irwin reports, "It is a story of robotics and other technologies improving at a remarkable rate, eliminating the need for factory floors crowded with workers doing manual labor." David Wessel agrees. "Manufacturing alone," he writes, "isn't going to put America back to work." ["The Factory Floor Has a Ceiling on Job Creation," Wall Street Journal, 12 January 2012] Wessel concludes:

"There are good reasons to cheer for domestic manufacturing. Expanding factories have beneficial side effects. 'If you get an auto-assembly plant, Wal-Mart follows,' says Ron Bloom, ... President Barack Obama's [former] manufacturing czar. 'If you get a Wal-Mart, an auto-assembly plant doesn't follow.' Modern factory jobs, many of which require more brainpower and computer know-how than muscle, often pay well and are secure. Research and development — the key to maintaining the U.S. edge in innovation — sometimes migrate abroad when production does, a good reason to strive to keep production at home. But manufacturing employment isn't going to grow nearly enough to return the U.S. to full employment. It isn't going to be the chief source of jobs for the next quarter-century. And, given the demands of the modern factory, it isn't going to be the ticket to the middle class for unskilled workers who haven't gone beyond high school. Pretending otherwise is foolish."

Mallaby agrees with Wessel that, even though a return of manufacturing doesn't mean a lot of new jobs, "a manufacturing turnaround is clearly desirable." I'm assuming he is also looking at the long tail of supporting jobs and services that manufacturing creates. He concludes:

"A US manufacturing renaissance is possible, not certain. But Americans are right to celebrate the early indicators – from Siemens, which has just begun shipping US-made turbines to Saudi Arabia; from Toyota, which exports US-made cars to 21 countries; and of course from that chief insourcer, GE's Mr Immelt."

Robert J. Bowman, managing editor of SupplyChainBrain, agrees that a manufacturing revival is possible but not certain. However, he believes that there are seven steps that can be taken to make it more of a certainty. ["U.S. Manufacturing Revival: Seven Steps to Locking It," 22 October 2012] The seven steps were discussed by Hal Sirkin, a senior partner at Boston Consulting Group, during the Supply Chain Council's 2012 Executive Summit in Indian Wells, Calif. They are:

"- Adjust tax policies to favor continued reshoring, including a reduction in corporate tax rates and the elimination of loopholes. (The latter is a favorite promise of politicians, but it never seems to happen, does it?) In addition, provide targeted tax credits for U.S. job creation, as well as a 'dollar-for-dollar' tax credit for the repatriation of funds - as long as all of the money is used to create jobs. (Good luck with that. Corporations used the last repatriation tax holiday to build up cash reserves, buy back stock, acquire other companies and lay off thousands of workers

"- 'Level the playing field with China' by treating it as a developed economy. Address China's subsidizing of state-owned companies, which has contributed to the hollowing out of U.S. industry. (Although a new breed of Chinese companies doesn't necessarily fit into this category.) Press for the enforcement of intellectual property protection. (This one might well be the most important factor of all.) Increase the use of dumping penalties in cases where China is under-pricing its exports.

"- Focus on building and keeping 'the world's best talent.' We've heard a lot during this presidential campaign season about the idea of stapling green cards to the diplomas of foreigners who study here, so that they can remain in the U.S. and hire Americans. For the longer term, create a network of vocational colleges, offering two years of liberal arts and two years of training in practical areas such as welding, plumbing and electrical work.

"- Rethink regulation, with an eye toward balancing the need for a clean environment and safe products with concerns over U.S. competitiveness. 'Many regulations on the books from the past are no longer needed,' claimed Sirkin. (Much easier said than done. Who wants to explain how relaxed regulation, in the interest of 'competitiveness,' led to the next outbreak of tainted food?)

"- Take a page from China's playbook, and promote the formation of industry clusters. These initiatives group manufacturers, their suppliers, training facilities and infrastructure within one geographic location, providing everything that's needed to produce and export key products. (It's worth noting, I suppose, that the clusters concept has been called 'modern-day snake oil' by at least one pundit.)

"- Focus on foreign manufacturers that want to produce in the U.S., or use the country as a platform for their global exports. Here's where the notion of the U.S. as a relatively low-cost market comes strongly into play.

"- Boost awareness of changes taking place in China's economy, and the resulting opportunities in the U.S. China, said Sirkin, 'should not be the default location. 2015 is different from 2010.' Companies should be encouraged to 'do the math' before they make the decision to offshore production."

Bowman believes that labor-intensive industries will continue to produce products offshore. He writes that "even Sirkin is only predicting that 20 to 30 percent of the goods produced in China will be shifting back to the U.S." He concludes:

"So doubts and qualifications abound. Reshoring won't take hold without a proactive response by government and the private sector. Still, Sirkin was adamant that change is in the air. 'We're at the beginning,' he proclaimed, 'of a new manufacturing renaissance in the U.S.'"

Bill McBeath of ChainLink Research agrees with Bowman that China isn't going away and insists that the decision to reshore manufacturing is a complex one. ["Reversal of the Offshoring Tide," 8 January 2013] He concludes, "A shift of manufacturing back to the US will certainly be welcome here. But it can't be just because 'it would be nice.' It has to make good business sense for each business that makes that decision." You can read his article to learn which considerations he believes need to be taken into account. Overall, pundits appear to be more optimistic than pessimistic about the revival of manufacturing in the United States. That's good news for the economy. Tomorrow we'll see what other pundits think about the future of American manufacturing and find out whether that optimism is widespread.

January 22, 2013

Supply Chain Risk Management Still not Receiving Enough Attention

Looking back on 2012, Adrian Gonzalez observes, "Some companies took proactive action to define and implement supply chain risk management practices, many more did not." ["Rethinking Supply Chain Risk Management," Logistics Viewpoints, 2 January 2013] Gonzalez isn't alone in his observation (see my post entitled Supply Chain Disruptions Are Growing More Serious but Risk Management isn't Keeping Pace). Gonzalez reports that when "Hurricane Sandy hit in October ... companies that had failed to 'walk the talk' on risk management experienced significant supply chain disruptions, and they were reminded once again why they can't afford to ignore this critical dimension of supply chain management anymore." Gonzalez may be correct that companies can't afford to ignore risk management, but Tim Burt reports that the priority it receives depends in large measure on how recently a major supply chain disruption occurred. The further such events fade into the past, the lower risk management falls on the priority list. ["Have we seen a watershed in risk management?," Procurement Leaders, 3 January 2013] He draws that conclusion from the results of a survey conducted last year, although he acknowledges, "Why this attention slipped is open to debate." He reports that John Walker, global CPO of the Swiss-based Buhler Group, speculates that the downward trend "could demonstrate that recent risk mitigation efforts are taking root and actions could be paying off." That would be good news.

Burt believes that events of 2011 and 2012 created a watershed for supply chain risk management and Walker agrees, at least for Chief Procurement Officers. He stated, "I believe that CPOs are again realising that in a global world – even in the absence of an event – they must be prepared. ... All stakeholders – investors, boards, customers and employees – are asking: how do we maintain and protect revenue? It's evident to me that we are now focused on managing risk and protecting revenue generation; I predict this trend will not go away." As I noted in my previous post referenced above, supply chain disruptions are more likely to increase in the years ahead than decrease. Burt agrees. He states, "Global events aren't going to go away." Gonzalez notes, "Although natural disasters like hurricanes and floods grab the headlines, the reality is that supply chains face a whole range of risks that are always present." He details a few:

  • "Supply shortage due to a quality problem, supplier bankruptcy, or other issue. A recent example is Ford Australia and General Motors Holden racing in to underwrite a supplier’s $6.5 million debt to prevent their own production lines from shutting down.
  • "The continued rise of trade protectionism, which is increasing the cost of imports and exports, as well as dampening demand for goods and limiting supply. In a speech last summer, the Director General of the World Trade Organization, Pascal Lamy, said that 'the accumulation of these [new] trade restrictions is now a matter of serious concern.' Last March, for example, the US, EU, and Japan filed a formal complaint with the WTO accusing China of keeping rare earth prices low for its domestic manufacturers and pressuring foreign firms to move their operations there.
  • "The impact of currency rates on supply chain costs and product demand. In 2011, for example, Sharp Corp. announced that it was localizing more of its solar-panel production outside Japan because the strong yen was making exports too expensive, especially compared to Chinese products. 'We need to change the way we manage our businesses so that foreign exchange movements won’t affect us as much,' said Sharp President Mikio Katayama in an interview.
  • "Disruptions caused by IT service failures or security breaches. This past November, for example, United Airlines flights were grounded for several hours due to a computer glitch. And last summer, Amazon.com’s EC2 service went down twice, affecting clients such as Instagram, Pinterest, and Netflix, and hackers broke into LinkedIn’s site and stole more than six million of its customers' passwords.
  • "Social media: Can what people say on Facebook, Twitter, YouTube, and blogs bring your supply chain operations to a halt, or even put your company out of business? You bet it can, as the 'Pink Slime' incident showed last year."

Gonzalez asserts that there is "a great foundation of knowledge and experience" in the area of risk management that companies can use to get started. He also indicates that there are some "new ideas and developments" that are starting to emerge. He offers four ideas to consider, starting with corporate culture. He writes:

"Make thinking about supply chain risk part of the corporate DNA. This was one of my key takeaways from an executive 'think tank' session I attended last summer on supply chain risk management. The goal is to incorporate risk in the decision-making process at all levels of the supply chain, just like cost is today. In other words, supply chain professionals need to get to the point where talking and thinking about risk is as common and instinctual as talking and thinking about cost and service. Unfortunately, at many companies today, risk rarely enters the conversation or analysis. Some of my other takeaways from the session were:

  • "Focus less on individual risks and more on the capabilities to deal with risks. Also, think about risk management as a program, not a project.
  • "Key metrics associated with risk management are Time-to-Recovery and Revenue-at-Risk. Outperforming the competition on these metrics creates a competitive advantage.
  • "You need to 'dollarize' risk in order to have meaningful conversations with Sales and Operations Planning (S&OP), Marketing, C-level executives, and other internal and external stakeholders."

If supply chain risk management is really in your corporate DNA, risk managers won't have to seek out S&OP or marketing team members or C-level executives to hold "meaningful conversations" because risk management conversations will be a part of every appropriate meeting. Gonzalez' next idea deals with training.

"Supply chain professionals need more training in quantitative risk concepts. In a thought-provoking HBR blog posting, 'Why Quants Should Manage Your Supply Chain Risk,' Carlos Alvarenga argues that 'anyone who claims to be managing supply chain risks without understanding subjects like real options, hedging, Value at Risk models, financial simulation, and so on, is more like a security guard than a real risk manager.' Simply put, supply chain professionals can learn a lot from the financial, insurance, and other industries where managing risk is a core focus and discipline."

Since more and more companies are insuring themselves against supply chain disruptions, they should be able to find help in this area from their insurer. Insurers would rather help prevent or mitigate disruptions than make big payouts. Gonzalez' next idea deals with social media.

"Leverage social media as a risk management tool. Social media provide you with more timely and insightful insights about emerging risks and events, enabling you to take corrective action sooner and thus prevent (or minimize the impact of) a supply chain disruption. For example, according to an October 2011 Wall Street Journal article, 'When Virginia's magnitude 5.8 earthquake hit [in August 2011], the first Twitter reports sent from people at the epicenter began almost instantly at 1:51 p.m.— and reached New York about 40 seconds ahead of the quake's first shock waves … The first terse tweets also outpaced the U.S. Geological Survey's conventional seismometers, which normally can take from two to 20 minutes to generate an alert.' The article also highlights how researchers and firms are mining Twitter messages 'to monitor political activity and employee morale, track outbreaks of flu and food poisoning, map fluctuations in moods around the world, predict box-office receipts for new movies, and get a jump on changes in the stock market.'"

In an earlier post, I cited one source who reported that a manufacturer realized that one of its suppliers was in serious financial trouble when workers in the area started to comment on social media sites that the supplier's parking lot was looking emptier every day. Gonzalez' final idea deals with supply chain mapping.

"Start by mapping your supply chain. Do you know where the manufacturing facilities of your suppliers (and their suppliers) are physically located? Which parts are manufactured at each location? Do you track the history and frequency of disruptions that occur at each facility and geographic region, due to either natural forces (hurricanes, floods, earthquakes, etc.) or other factors (labor strikes, power outages, quality issues, etc.)? The bad news is that few companies gather and track this information; the good news is that there are new supply chain mapping and risk management software solutions available that companies can use to facilitate the process."

For more information about this topic, read my post entitled Risk Management: Mapping Supply Chain Risks. Gonzalez concludes, "The bottom line is that supply chain management is about managing risks. And since risks are dynamic in nature, with new ones emerging all the time, companies must continuously study the landscape and determine which risks are worth addressing now and how." Noha Tohamy, research vice president with Gartner, recommends that companies use a proactive rather than a reactive approach to risk management. ["Supply-Chain Risk Management: An Essential Competency," SupplyChainBrain, 10 January 2013] Like Gonzalez, Tohamy believes that risk management must become part of a company's DNA. "In my experience," she says, "risk management has to be sponsored and understood by executives across the entire organization." She also believes that prevention is better than cure. The article concludes:

"In recent years, she has seen a greater awareness of the issue among supply-chain managers. 'I think we're getting better,' she says. 'Most of the companies I work with are starting to talk about how to make the supply chain more resilient.' To do that, they need to acquire a deep understanding of their products, critical components, and sourcing networks. They also need to engender better collaborative relationships with multiple tiers of suppliers. In the debate over stressing prevention of future disasters versus building resiliency to what actually happens, Tohamy leans toward the latter. In fact, she says, good risk-management can provide a means of boosting competitiveness and coping with emerging markets. 'The way I look at risk,' she says, 'is as just another opportunity out there.'"

As I've noted before, risk management processes are not cost-free, but the alternative to risk management (doing nothing) can end up costing a company a lot more.

January 21, 2013

Cognitive Computing

Researchers at the Cognitive Computing Research Group at the University of Memphis claim that cognitive computing, like the Roman god Janus, has two faces. In the case of cognitive computing, they claim there is a science face and an engineering face. "The science face fleshes out the global workspace theory of consciousness into a full cognitive model of how minds work. The engineering face of cognitive computing explores architectural designs for software information agents and cognitive robots that promise more flexible, more human-like intelligence within their domains." ["Cognitive Computing Research Group," University of Memphis] Frankly, the business world is more interested in the engineering face of cognitive computing (i.e., how artificial intelligence can help companies better understand the world in which they operate); however, you really can't have one face without the other. That's why commercial firms as well as academic institutions are pursuing cognitive computing.

Mark Smith, CEO & Executive Vice President of Research at Ventana Research, claims that IBM's "Watson blends existing and innovative technology into a new approach called cognitive computing." ["IBM Watson Advances a New Category of Cognitive Computing," Perspectives by Mark Smith, 11 December 2012] However, Roger Kay asserts, "Watson, the reigning Jeopardy champ, is smart, but it's still recognizably a computer." He believes that cognitive computing represents "something completely different." ["Cognitive Computing: When Computers Become Brains," Forbes, 9 December 2011] On some level, both Smith and Kay are correct. Smith writes, "At the simplest operational level [cognitive computing] is technology for asking natural language-based questions, getting answers and support appropriate action to be taken or provide information to make more informed decisions." Watson, he notes, "relies on massive processing power to yield probabilistic responses to user questions using sophisticated analytical algorithms." On the other hand, Kay writes, "Cognitive computing, as the new field is called, takes computing concepts to a whole new level." Cognitive computing goes beyond calculating probabilities to thinking. Smith continues:

"A cognitive system like Watson accesses structured and unstructured information within an associated knowledge base to return responses that are not simply data but contextualized information that can inform users' actions and guide their decisions. This is a gigantic leap beyond human decision-making using experience based on random sources from the industry and internal sets of reports and data. This innovative new approach to computing is designed to aid humans by working with natural language – English in the case of today's Watson."

Smith goes on to provide a brief primer about cognitive computing. He writes:

"For those of you who are not used to the word cognitive, the foundation of cognition is the sum of all the thinking processes that contrib­ute to gaining knowledge for problem-solving. In computational systems these process­es are modeled using hardware and software; machine-based cognition thus is a step toward imbuing an arti­fi­cial system with attributes we typically consider human: the abilities to think and learn. Watson builds on a foundation of evidence from preexisting decisions and knowledge sources that it can load for reference in future inquiries. The evidence-based reasoning that Watson employs to answer question is part of the big deal in its approach."

Smith notes that "Watson supports three types of engagement – ask, discover and decide – that take natural language questions, find facts to support a decision process, then gets probabilistic guidance on decisions." Eventually, learning systems, like Watson, go beyond guessing (i.e., selecting the most probable answer) to knowing. Since such systems ingest massive amounts of data, crunch it with enormous computing power, and can do that on a continuous basis, there are good reasons for people like Shweta Dubey to call cognitive computing a disruptive technology. ["Is Cognitive Computing the Next Disruptive Technology?, The Motley Fool, 28 December 2012] Dubey believes that Watson represents a breakthrough because IBM has figured out how to monetize its cognitive capabilities. Smith agrees cognitive computing technologies have a bright future in business. He writes:

"This goes beyond search and retrieval technology; machine learning and processing of questions using very large volumes of data, commonly referred to as big data, is the foundation on which Watson as a cognitive system operates. Most important is the continuous learning method and what I would call adaptive intelligence. While machine learning and pattern-based analytics are part of the cognitive system, the ability to process big data efficiently to provide a probabilistic set of recommendations is just the kind of innovation many industries need. ... IBM has a huge opportunity to bring innovation to business through the use of Watson, and has been experimenting with a number of deployments to test its potential. ... IBM has been working with organizations in healthcare and financial services, but believes Watson could be useful in just about every industry that must have what I call better situation intelligence that must accommodate current conditions and preexisting information to determine the best answer."

Dharmendra Modha, Manager of cognitive computing at IBM Almaden Research Center (who has been called IBM's "Brain Guy"), is one of the driving forces behind the company's efforts to create thinking machines. "For more than half a century," he writes, "computers have been little better than calculators with storage structures and programmable memory, a model that scientists have continually aimed to improve. Comparatively, the human brain—the world's most sophisticated computer—can perform complex tasks rapidly and accurately using the same amount of energy as a 20 watt light bulb in a space equivalent to a 2 liter soda bottle." ["Beyond machines," IBM] Creating a computer that is as efficient and even more effective than the human brain is the ultimate goal of cognitive computing. Modha calls cognitive computing "thought for the future." He continues:

"Making sense of real-time input flowing in at a dizzying rate is a Herculean task for today's computers, but would be natural for a brain-inspired system. Using advanced algorithms and silicon circuitry, cognitive computers learn through experiences, find correlations, create hypotheses, and remember—and learn from—the outcomes. For example, a cognitive computing system monitoring the world's water supply could contain a network of sensors and actuators that constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making."

Although there are dreams about global networks of things, the reality is that such networks are currently cost prohibitive. The cost/benefit analysis of cognitive computing networks will be a different matter. Companies that don't take advantage of cognitive computing will find themselves at a severe disadvantage in the years ahead. Modha reports that "IBM is combining principles from nanoscience, neuroscience and supercomputing as part of a multi-year cognitive computing initiative" being funded in part by the Defense Advanced Research Projects Agency (DARPA). The IBM site contains links to a number of videos that discuss the IBM project.

Will true cognitive computing ever be achieved? That remains an open question, and the answer depends on whether you are talking about Artificial General Intelligence (AGI) or more limited artificial intelligence applications. No one questions that cognitive computing applications for limited purposes are going to play an important role in the business environment in the years ahead. Frank Buytendijk ponders what business intelligence and business process management may look like in the future if cognitive computing becomes ubiquitous. ["Can Computers Think?" BeyeNETWORK, 27 December 2012] He writes:

"If computers can think, even be self-aware, and if datasets can have a certain individuality, computers might as well express their opinions. Their opinions, as unique individuals, may differ from the opinions of another data source. Managers need to think for themselves again and interpret the outcome of querying different sources, forming their particular picture of reality – not based on 'the numbers that speak for themselves' or on fact-based analysis, but based on synthesizing multiple points of view to construct a story."

I suspect that most people believe that cognitive computing systems will require them to think less, not more. If Buytendijk is correct, that won't necessarily be the case. On the subject of post-modern business process management, he writes:

"It would not be possible to define and document every single process that flows through our organization. After all, every instance of every process would be unique, the result of a specific interaction between you and a customer or any other stakeholder. What is needed is an understanding that different people have different requirements, and structure those in an ontological approach. In a postmodern world, we base our conversations on a meta-understanding. We understand that everyone has a different understanding. Of course, as we do today, we can automate those interactions as well. Once we have semantic interoperability between data sets, processes, systems and computers in the form of master data management, metadata management, and communication standards based on XML (in human terms: 'language'), systems can exchange viewpoints, negotiate, triangulate and form a common opinion. Most likely, given the multiple viewpoints, the outcome would be better than one provided by the traditional 'single version of the truth' approach."

If we have entered the post-modern era, Buytendijk concludes, "One thing is clear: Before we are able to embrace postmodernism in IT, we need to seriously re-architect our systems, tools, applications and methodologies." This shouldn't be change we fear but change we embrace. Cognitive computing opens a new world of possibilities for making life better.

January 18, 2013

The Problem with Focus Groups

At first glance, it appears that Gianfranco Zaccai, co-founder and president of the global design and innovation consultancy Continuum, is on a quest to rid the world of focus groups. "Think about it," he writes, "how many great ideas have you had sitting around a table? If you are like most people I know, not many." ["Why Focus Groups Kill Innovation, From The Designer Behind Swiffer," Fast Company Co.Design, 18 October 2012] He continues:

"Yet, time after time, companies looking for a winning idea gather a group of people around a table to ask them what they would like. Other times, companies may actually develop innovative ideas--but then their impulse is to convene a focus group to critique them and, more often than not, undermine them."

Zaccai insists that during his 40 years in the design field he has "never seen innovation come out of a focus group." He writes even more emphatically, "Let me put it more strongly: Focus groups kill innovation. That's both because of what they do and what they don't do." He explains:

"As Steve Jobs famously asserted, true innovation comes from recognizing an unmet need and designing a creative way to fill it. But focus groups can't identify those needs for the simple reason that most people don't know what they are missing until they experience it. A focus group can work in adding incremental improvements to an already existing product or service. But for truly game-changing ideas, they are more likely to cast doubt and skepticism upon them just because they are unfamiliar."

Zaccai's point is a good one. Truly innovative ideas often need time for their full potential to be realized. Even great innovators don't fully understand how their game-changing products are going to be used until they are put into the hands of consumers. For example, people have done some amazing things beyond playing video games with the technology used in Microsoft's Kinect system. I agree with Zaccai that a focus group probably wouldn't have come up with any of those ideas. He goes on to describe how a few famous products would have been stillborn had his company listened to focus groups. He writes:

"When Continuum pitched an idea to Reebok for a new basketball shoe that would use inflated air to better support the ankle, thereby reducing injuries, the brand manager for basketball shoes said he wasn't interested because he had never heard about a need for that from a focus group. When we proposed the idea to a high school basketball team, the response was even worse -- the players openly laughed at the concept. But when the team members actually used an early 'experiential model' of the shoe during practice, they were won over by how cool it was to have a shoe form-fitted to their feet. Over time, they were even more enthusiastic as they realized they could play more confidently without fear of injury. Like that, the Reebok Pump was born."

He reports that Herman Miller had a similar experience when it debuted its now-iconic mesh chair. "At the time, office chairs were made one way -- with lots of padding, and the more of it the better. A chair with a simple mesh backing looked ugly and uncomfortable. It was through experience using the chair that people realized how revolutionary it was, both in terms of comfort and style." Two things are clear from the anecdotes presented by Zaccai. First, the innovative ideas weren't generated by focus groups. Second, users were nevertheless essential in the process. In both cases, consumers were convinced of the value of a product after they had tried it. So it appears that "testers" of some kind do provide some benefit. It is a matter of timing. Zaccai admits as much. "All of this may sound easy," he writes, "and of course, it's not. So what do you do in place of the all-important focus group?" His first recommendation is: "Consider not just the act of using the product but the total experience around it." He explains:

"Most cleaning product companies, for example, look at the act of cleaning a floor. When Continuum developed the original idea for the Swiffer, we looked at the entire cleaning experience, including buying, using, washing, storing, and discarding the product. That extra research led to a truly game-changing product. Similarly, with the Reebok Pump, we looked not only at the experience of the athletes on the court but also at the mom buying basketball shoes for her son every few months because the shoes no longer fit, or the basketball player getting benched because he got injured from ill-fitting shoes."

David Kelley, founder of IDEO, agrees with Zaccai that good designers are great listeners and observers. Like Zaccai, he believes that designers need to observe consumers actually using products in their normal environment under actual conditions. Zaccai's next recommendation is to "go beyond the obvious to what cannot be seen." He explains:

"When we designed the Swiffer, we conducted a microscopic analysis of the dirt on the floor before and after cleaning and discovered that most of the problem was dust, and that dust is best removed without water. We found that most people spent extra time sweeping the floor before they mopped it. Then, they spent more time cleaning the mop head than they did cleaning the floor. The Swiffer combined sweeping and mopping into a single mess-free act, ending up with a cleaner floor overall."

Zaccai's third recommendation -- "test new products out in the field" -- is closely tied to his first recommendation. He explains:

"Just because an idea is a good one doesn't mean that people will immediately jump for joy the first time they hear about it. You need to test early, and you also need to test in context, directly with the people for whom it’s intended. That's what we did with the Reebok Pump and the basketball team, and it gave us different feedback that we could use to refine and improve the product."

The folks at IDEO call this the "deep dive." If you are unfamiliar with IDEO's "principles of design thinking," you might want to take a few minutes and watch the ABC Nightline piece called "The Deep Dive." Although the clip is about redesigning the humble grocery cart, it is the process, not the product, that is important.

Zaccai's final recommendation is to "invest in leaders who recognize the importance of calculated risks." He writes:

"The Reebok Pump wouldn't have happened if it weren't for the green light from Reebok's president, who recognized the possibility of a truly revolutionary concept, and then made the decision to follow through with development. At the end of the day, you can't make decisions based solely on dollars or because of what people are saying; you have to make decisions based on your gut about what you feel is the right thing to do."

My only caveat to the advice about using "your gut" instincts is that you need to be able to admit you were wrong if your gut lets you down. Too many leaders follow their gut instincts to ruin because they can't admit they made a mistake. In response to Zaccai's attack on focus groups, Paul Marsden claims that "Zaccai is attacking a straw man." ["Do Focus Groups Kill Innovation?" Brand Genetics Blog, 31 October 2012] Marsden continues:

"Sure there are some innovation consultancies out there that misuse consumer groups for polling and screening innovation ideas. But any agency worth commissioning will know that this is a pointless misapplication of the focus group, and belies a pitiful ignorance of qualitative research. Focus groups are not for mini-polling. Focus groups generate the understanding that acts as creative stimulus for people who understand innovation and understand consumer behaviour. The output of a focus group is input into the creative and analytical process, not an output. In other words, focus groups don't tell you what to think, they tell you what to think about. So, if you are simply taking at face value what consumers say in a focus group, you are missing the point. Likewise, if you are listening to consumers’ explanation for why they think or do things; an exercise based on the myth that people have some privileged understanding of their motivations. That’s not what focus groups are for. Consumers have genuine expertise … as consumers. It is in their capacity to stimulate innovation and improvement of ideas in experts and trained professionals that their value lies."

Zaccai admits, "Focus groups aren't useless. They can be insightful for fine-tuning something for the short term." In a follow-up article, he writes, "I believe that focus groups have a place in the process. In fact, in my career as a designer, I have seen focus groups used to great benefit -- but only when applied at the right time and in the right way. The trick is knowing how and when to use them." ["Focus Groups Are Dangerous. Know When To Use Them," Fast Company Co.Design, 9 January 2013] He goes on to discuss the three steps that he believes are involved in the innovation process. The first step involves engaging "with people in a one-on-one context." He explains:

"Rather than a focus group, we call this a 'contextual focus.' It's learning what people do in a particular context and the value that has in their life. The context may be their car, home, or job, and even in the life of significant others. In a sense, you could call this deconstructing the focus group. Rather than a group, you are using a focal point to better understand real people communicating valuable information in response to stimuli in their real lives. A focus point may also involve what cannot be seen but impacts people’s experience; that is, exploring the physics, chemistry, or economics of a problem; learning what dirt in a home really is and what removes it most effectively."

In other words, the first step involves the deep dive. Zaccai's second step in the innovation process involves coming "back to the table to make sense of what is uncovered in step one." He writes:

"This is the time to ideate and figure out how to resolve problems and address unconscious needs; conceptualize unexpected but meaningful innovation while still embedding it in the familiar. During this stage, we rapidly prototype a lot of different ideas and test them in a controlled environment, looking to fail quickly if they don't work, but learning from each failure. We call this the 'focus filter.'"

His final step in the innovation process involves taking the refined product "to a focus group."

"Once you get closer to the real thing and have a truly innovative product, then you can go to a traditional focus group to help you figure out how to place and position it. ... Focus groups are about fine-tuning for mass appeal -- about evolving the truly revolutionary ideas to the point where they will be embraced by the majority of consumers, while at the same time not losing the essential points of what made them innovative in the first place. For informing that evolution, focus groups serve a very useful and valuable purpose. Just don't expect them to be where those revolutionary ideas originate."

I agree with Zaccai that focus groups aren't the right mechanism for generating revolutionary ideas. I also agree with him and Marsden that focus groups have their place. Just ask the producers of Broadway shows or motion pictures about the importance of test audience reactions to their productions. More than one of them has changed a script because of test audience reactions. Focus groups are part of the listening and observing process that all good designers use.