
August 30, 2013

Targeted Marketing and Predictive Analytics, Part 2

In the first segment of this two-part series on predictive analytics, I discussed the potential of analytics for providing a better ROI for marketers as well as how to select the right databases to analyze and what to do with them once they were identified. McKinsey & Company partners Jonathan Gordon, Jesko Perrey, and Dennis Spillecke report:

Predictive Analytics"Some companies are already turning that Big Data promise into reality. Those that use Big Data and analytics effectively show productivity rates and profitability that are 5 – 6 percent higher than those of their peers. McKinsey analysis of more than 250 engagements over five years has revealed that companies that put data at the center of the marketing and sales decisions improve their marketing return on investment (MROI) by 15 – 20 percent. That adds up to $150 – $200 billion of additional value based on global annual marketing spend of an estimated $1 trillion." ["Big Data, Analytics And The Future Of Marketing And Sales," Forbes, 22 July 2013]

Andrew Gill, CEO of Kred, was first impressed with the potential of predictive analytics when he saw a presentation about how law enforcement organizations were using Big Data to predict potential criminal activity and then used those predictions to make arrests leading to convictions. It convinced him that "the marketing community can use cues from social big data and purchase history big data to predict future purchase patterns." ["Using Big Data to fight crime and predict what products consumers might purchase in the future," London Calling, 4 June 2013] He concluded:

"I believe that if individual brands start to harness the power of big social data (and that means becoming a social business), then they can start to pull ahead of their competition. Angela Ahrendts, CEO of Burberry, was quoted in a Capgemini consulting report recently as saying 'Consumer data will be the biggest differentiator in the next two to three years. Whoever unlocks the reams of data and uses it strategically will win.' Placing a bet on big data is not for the feint hearted. Those brands that will lead the big data race have already started though."

Alex Bulat offers "some practical ways of applying predictive personalization." ["Predictive Personalization as a Way to Increase Conversion," Template Monster Blog, April 2013] His first suggestion is to "provide relevant content." Targeted marketing (i.e., the implementation of predictive personalization) is all about providing the right offer to the right person at the right time in the right circumstance. In other words, it involves both content and context. Concerning content, Bulat writes:

"Website personalization simplifies the search of the content. This technology adequately identifies key characteristics of each visitor, and categorizes them based on predefined rules. At the same time, visitors feel that website is personalized and enjoy the benefits of 'noise reduction' (irrelevant information) and see only most interesting content."

Targeted marketing, of course, involves more than website content. It also includes targeting advertising. On that subject, Bulat writes:

"This technique means building up ads depending on customer's needs, based on the history of their interaction with the website. It sufficiently increases customer satisfaction and leads to an increase of conversions. While the history of user interaction with the site accumulates, this data can be used to develop unique, relevant offers in [the] future, as well as group visitors with similar interests."

Bulat believes that "all personalization techniques can be divided into two categories." Those categories are:

  • Rule-based and user segment personalization.
  • Personalization based on predictive analytical algorithms.

Concerning rule-based and user segment personalization, Bulat writes:

"This type [of personalization technique] is based on the rules (i.e., the practice of using history data, behavioral data and environmental data for creating unique proposals based on those predefined rules). Typical personalization rule takes the following form: 'If a visitor makes a follow-up, show the X offer.' One example of easily tracked and segmented client's characteristics is geographic location. If a customer visits the site selling cloths in New York, user will be offered personalization ads based on his IP address and will see coats and jackets, but if the IP address belongs to Las Vegas they will be offered sandals and slippers."

The location-based example that Bulat provides illustrates why context as well as content matters. However, some location-based "personalization" has created angst for the companies using it. As I noted in a previous post, Staples, the office supply company, became the poster child for this kind of personalization strategy when Jennifer Valentino-DeVries, Jeremy Singer-Vine, and Ashkan Soltani revealed that "the Staples Inc. website was displaying different prices to people after estimating their locations. More than that, Staples appeared to consider the person's distance from a rival brick-and-mortar store, either OfficeMax Inc. or Office Depot Inc. If rival stores were within 20 miles or so, Staples.com usually showed a discounted price." ["Websites Vary Prices, Deals Based on Users' Information," Wall Street Journal, 24 December 2012] The article noted that Staples wasn't the only culprit; other companies mentioned included Discover Financial Services, Rosetta Stone Inc., and Home Depot Inc. The revelation was generally met with consumer outrage. The reporters were quick to point out that "offering different prices to different people is legal, with a few exceptions for race-based discrimination and other sensitive situations." On the subject of personalization based on predictive analytical algorithms, Bulat writes:

"This presumes the use of mathematical systems to monitor visitor behavior to develop predictive models and deliver most relevant content for each visitor. In contrast to the targeting strategy based on rules, algorithmic targeting creates and connects larger, and potentially infinite number of computer-generated micro-segments all of which develop when the model learns."

Cognitive reasoning systems are being developed that take advantage of machine learning. As these systems mature, predictive analytics will undoubtedly play an ever-larger marketing role. Nevertheless, Bulat cautions, "In behavioral targeting there is no such option as 'set and forget'. All targeting efforts should ... be checked at regular intervals and periodically compared to the control group (which was not personalized) to verify the effectiveness of your efforts." Brian Kardon, Chief Marketing Officer for Lattice Engines, agrees that predictive analytics will become more important. "Right now," he writes, "virtually all of our marketing data is backward-looking. Clicks, Web visits, open rates, downloads, and tweets all happened in the past. What if we could take this data and use it to predict what customers were going to do next?" ["Predictive Analytics: The Power Behind Next-Gen Marketing," CMO.COM, 14 August 2013] He continues:

"It is not science-fiction. Right now, the marketing organizations at companies such as ADP, Dell, SunTrust, and Microsoft are doing just that. They are using a variety of statistical techniques to analyze current and historical data to make predictions about the future. It's called predictive analytics. ... Fields as diverse as baseball, insurance, national security, logistics, and (thank you, Nate Silver) presidential elections can now be predicted with stunning accuracy."

Kardon believes that "predictive analytics is gaining traction for three main reasons." The first reason is that so much data is being created. He writes:

"Simply put, until recently we didn’t have enough marketing data to confidently predict the future. The amount of data the world produces every two days is equal to all the data produced from the beginning of civilization up to 2003. Today, companies and individuals are spewing out massive amounts of information in social networks, on the Web, and in internal systems (such as CRM and purchase histories). The sheer volume presents an unprecedented opportunity for businesses to gain insights on current and future buying behavior."

The second reason that predictive analytics is gaining traction is that new technologies are being developed every day to take advantage of Big Data. Kardon explains:

"Advances in technology now allow us to cost-effectively capture, store, search, share, analyze, and visualize data. There have been giant technological advances in computer hardware–faster CPUs, cheaper memory, and massively parallel processing (MPP) architectures. New technologies (Hadoop, MapReduce, and text analytics) can process both structured and unstructured big data. Today, exploring big data and using predictive analytics is within reach of more organizations than ever before."

Kardon's final justification for why predictive analytics will find greater future use in marketing has to do with "the democratization of the math." He explains:

"Until recently, big data and predictive analytics were almost exclusively the domain of highly skilled data scientists. Today, software makes even the most exotic of techniques within sight–from simple linear and multivariate regression to classification and regression trees (CART), conditional mutual information algorithms, random forests, and neural networks. While the range of statistical techniques had widened, the availability of graduate students and software has made it more accessible to more organizations. You do not need a small army of PhDs, but you will need to have some familiarity with these methods."

Kardon concludes, "The next generation of marketing leaders will be those who effectively harness the power inherent in big data, and the early adapters are already embracing predictive analytics. If you were an early adapter of marketing automation, then I predict that you'll also be an early adapter of predictive analytics." In the future, don't be surprised when you're shown an offer for something you didn't even know you wanted, but, once you've seen it, fall in love with. That's the power of predictive analytics.

August 29, 2013

Targeted Marketing and Predictive Analytics, Part 1

Seth Gottlieb believes that using predictive analytics to target customers "is going to feel invasive" as it becomes more common. However, he continues, he is "hoping that predictive analytics will help marketers target their message to receptive customers who can genuinely benefit from the product or service. Maybe this science will even help companies discontinue programs that nobody would want." ["Predictive Analytics for Marketing," Content Here, 25 March 2013] There is always going to be the risk of a "creepiness factor" when targeted marketing is employed. But implemented tastefully and ethically, the creepiness factor can be muted. Alex Bulat reminds us that back in 2002, when the movie Minority Report was released, it contained a scene where passengers riding the subway viewed "large wall-sized screens" that showed each rider a different advertisement tailored to their preferences and tastes. "When the film was shot," he writes, "this type of advertising sounded quite sci-fi, the product of a distant future, but today, 11 years later, predictive personalized advertising is absolutely real." ["Predictive Personalization as a Way to Increase Conversion," Template Monster Blog, April 2013] Bulat provides this definition of predictive personalization:

Predictive Analytics"Predictive personalization is defined as the ability to predict customer behavior, needs or wants – and tailor offers and communications very precisely. Social data is one source of providing this predictive analysis, particularly social data that is structured. Predictive personalization is a much more recent means of personalization and can be used well to augment current personalization offerings … [and] discover highly relevant content requiring minimal effort to find. Both bet that people still value content and try to serve up good stuff for those moments in which they have nothing else to do. Both attempt to provide a machine-augmented curated media experience."

Bulat believes that "predictive personalization improves the quality of our lives"; but, he also understands that privacy concerns are going to temper any consumer enthusiasm for Big Data analytics. In the end, he believes the benefits will outweigh the concerns. He concludes, therefore, "Marketers who see the future in personalized ads should not fear. [Since] predictive advertising helps consumers save money, such ads will be called for and will be really effective." McKinsey & Company partners Jonathan Gordon, Jesko Perrey, and Dennis Spillecke assert, "Big Data is the biggest game-changing opportunity for marketing and sales since the Internet went mainstream almost 20 years ago." ["Big Data, Analytics And The Future Of Marketing And Sales," Forbes, 22 July 2013] They go on to note, however, that many marketers don't know how to make it happen.

Meta S. Brown, an analytics consultant, writer, and speaker, believes that the best place to start is choosing the right data sets to analyze. ["Selecting Big Data Sources for Predictive Analytics," SmartData Collective, 8 April 2013] She writes:

"The value of any dataset is determined by the quality of information you can extract from it. The key to value in big data is the detail. In other words, the value of big data is in the small stuff. ... The promise of big data is in the details. You want the data to give you the information you’d get if you observed each customer in person. You want to know what each person does. You want to know how each responds to a variety of things – products offered, pricing, presentation, and so on. You only realize value from data if you do something valuable with it."

So how do you go about selecting the right data sets? Brown suggests that you must first answer the question, "What do you want to accomplish?" She explains:

"You must know what kinds of action you have the option of taking. Can you offer new products, change the selection you offer, or must you work within the bounds of what you have now? Can you develop new ads, new offers? Now, imagine that you have the same goal, and the same options, in a face-to-face situation. What information would you want? Knowing that, you are ready to look for data sources that meet your needs."

When addressing the topic of where to look, Brown recommends staying close to home. "Start with the data you already own," she writes. "Your transaction records are a treasure chest of behavioral data." She continues:

"You know when each transaction takes place, what is purchased, at what price. If you have a loyalty program or house credit card, then you also know who was buying. Your own data is more valuable to you than anything you could buy, and it's already paid for. And this data is yours alone, giving you a unique information advantage over your competitors. If you do business online, get an understanding of the information collected in your web activity logs. These logs contain revealing details about shopping behavior, including details on the behavior of non-buyers. Only when you’ve thoroughly investigated the possibilities of your internal data sources should you look beyond your walls. Once you have a clear idea of what you want to know, and the limits of your own data, can you shop selectively, and shrewdly, for information that fills in the blanks."

Gordon, Perrey, and Spillecke agree with Brown that data is important, but they note, "Data on its own ... is nothing more than 1s and 0s." Their research shows that "companies that succeed today do three things well" with that data. The first activity involves analytics.

"1. Use analytics to identify valuable opportunities. Successful discovery requires building a data advantage by pulling in relevant data sets from both within and outside the company. Relying on mass analysis of those data, however, is often a recipe for failure. Analytics leaders take the time to develop 'destination thinking,' which is writing down in simple sentences the business problems they want to solve or questions they want answered. These need to go beyond broad goals such as 'increase wallet share' and get down to a level of specificity that is meaningful."

It's clear that many of the questions that Brown suggests should be asked when selecting data are also valuable when it comes to setting up the analytics for that data. The McKinsey partners assert, "Using data to specifically unlock new opportunities requires looking at data in a new way." Their second recommendation deals with a customer's path to purchase.

"2. Start with the consumer decision journey. Today's channel-surfing consumer is comfortable using an array of devices, tools, and technologies to fulfill a task. Understanding that decision journey is critical to identifying battlegrounds to either win new customers or keep existing ones from defecting to competitors."

They indicate that "marketing and sales leaders need to develop complete pictures of their customers so they can create messages and products that are relevant to them." For more on that topic, read my post entitled "The "Person" is the Most Important Part of Personalization Marketing." The McKinsey partners conclude, "Personalization can deliver five to eight times the ROI on marketing spend and lift sales 10 percent or more. Becoming ever more effective with this kind of targeting, we believe (and hope), will mean the death of spam." Their final recommendation is to keep your approach simple. They write:

"3. Keep it fast and simple. Data worldwide is growing 40 percent per year, a rate of growth that is daunting for any marketing and sales leader. Companies need to invest in an automated 'algorithmic marketing,' an approach that allows for the processing of vast amounts of data through a 'self-learning' process to create better and more relevant interactions with consumers."

Gordon, Perrey, and Spillecke call this "a pivot-point moment for marketing and sales leaders. Those who are able to drive above-market growth, though, are the ones who can effectively mine that gold." I'll finish the discussion about how to mine the gold in the next post.

August 28, 2013

Some Thoughts About Disruptive Innovations

Helge Tennø, a Digital Director at Dinamo AS in Oslo, Norway, recently penned an interesting blog post in which he connected Kevin Kelly's thinking about technology with Clayton Christensen's thinking about disruptive innovation. He writes, "By downplaying technology's role in shaping businesses and industries, companies make themselves vulnerable to emerging threats." ["Disruption and imagination," 180.360.720, 7 August 2013] Tennø notes that we need to think broadly about the term "technology." It refers to things "as big as the alphabet, science, communication and as small as shoes, chairs or Bluetooth." Most importantly, disruptive technologies have two overarching effects. They change "peoples' habits and behaviors" and they disrupt "business ideas and business models."

Tennø believes that disruption occurs "when a firm uses technology in a new way and manages to offer customers relevant value outside the current comfort zone and abilities of the established players." He points, for example, to Henry Ford's application of the production line to produce cars that his employees could afford. He writes that "more current examples are booksellers, airlines and travel, banks and finance, media, hotels, education, insurance and electrical providers." He writes:

"Disruption is a very hot topic today because we are at a time when technology is infiltrating industries core infrastructure and value propositions. The cloud / connected internet, mobile, participation, etc. proposes in a lot of cases such a radically different way of offering or thinking about an offering that disruption is often more than ripe – just waiting to get plucked by small companies, sometimes even unconsciously, digging at an industry from the bottom (which is the trademark of disruption – 'it's not the one you see that kills you').

    'The innovators who create products at "hackathons" aren't even trying to disrupt your business. You're just collateral damage.' – Larry Downes and Paul F. Nunes, Big Bang Disruption, HBR, March 2013

"Unfortunately, even if most companies today use the Internet, I suggest they are mostly doing it to replicate existing infrastructure (many company websites are digital copies of existing content or services already rendered by Customer Service) – companies are not imaginative enough to see how the technology enables them to think differently about their offering, relationship, market, business model etc.:

    'The only thing keeping most big companies from creating new categories is their lack of imagination – their inability to see beyond what they're selling today.' – Eddie Yoon and Linda Deeken, Why It Pays To Be A Category Creator, HBR, March 2013"

Management guru Gary Hamel sees things the same way that Tennø does. He claims companies get into that conundrum by boxing themselves in with the wrong question. He told Steve Denning:

"Both Amazon and Apple are classic examples of what I wrote about in the article with C.K. Prahalad in 1990 entitled, The Core Competence of the Corporation. That article argued that the building block for growth is not so much strategy as these deep capabilities that you build and that are very hard to replicate. ... A few years ago, nobody I knew in the idea industry was predicting that Amazon would be the most successful player in Cloud computing, or that Apple would move from computers to music devices to mobile phones to tablets. Yet it is completely logical if you have a competence view of the firm and you see how they continue to leverage that core competence in new ways. Apple and Amazon never made the mistake of getting stuck on the famous question, 'What business are we in?' Once you ask that question, you start asking: 'What assets do we have? And where do we leverage them next?' Once you frame the question that way, the game is almost over." ["Gary Hamel On Innovating Innovation," Forbes, 4 December 2012]

One of things I liked most about Tennø's article was the discussion of something he calls "the Opportunities Matrix." He notes that the Matrix "offers some ideas [about] where new opportunities arise as technology enables new human behavior or habits."

[Image: The Opportunities Matrix]

Too often people fail to see opportunities because they continue to view life from the same perspective. Creativity coaches teach techniques that force people to see challenges from different perspectives so that they can consider different approaches for solving them. The Opportunities Matrix does a similar thing. Each red hexagon in the matrix offers a new window (or perspective) through which businesses can consider how they can leverage their core competencies in new ways. It provides them the opportunity to be the disruptor rather than the disrupted. The Matrix helps you overcome the Innovator's Dilemma described by Christensen (that is, how to serve your core business while finding new markets and watching out for new entrants that could be lurking in your blind spot).

Tennø rhetorically asks, "Why is disruptive innovation important?" The simple answer is that disruptive innovation is going to prove fatal for one business or another; and that death is going to be sudden not lingering. Tennø concludes:

"Most companies thrive by incrementally upgrading their existing offering and keeping all competition at bay. This is true, generally speaking an estimate 70% of current return on shareholder value will most likely be the result of incremental innovation (efficiency or sustaining), but if we look at companies future income, and if we look at where they will most likely be disrupted, it is through the innovative use of technology – seeing new needs and new customer demands and working hard at accommodating it. This (a company’s future income) is why disruption – and imagination – might be the most important investments companies look into in these times of great technological opportunity. Now what they need are better, and faster tools for seeing, and acting, on those opportunities."

Sissel Waage bolsters Tennø's arguments. She writes, "Disruptive innovation is in vogue – for good reason. Businesses need to stay ahead in an environment of continual evolution and change." ["Disruptive innovation is key for a sustainable economy," The Guardian, 22 July 2013] She continues:

"We need to allow our brains to think the unthinkable – that is, suspending the pressures of what can be done today and shifting to the question of 'what if?'. ... Suspending judgment and innovating has occurred in countless home offices, garages and living rooms as it has in corporate meeting and board rooms. Yet, given the need for more (and more rapid, large-scale) innovation as we face climate change and other ecological issues, it seems apt to ask how we go about supporting innovation. ... What if there were a centre focused on disruptive innovation, and transforming major societal systems sustainably (with robustness and resilience in mind) particularly where IT, energy, food, agriculture and finance meet? What if prospective innovators came out of their day-to-day urban lives – into a forested or grassland landscape, with blended human and ecological health and resilience that is (mindfully) at the core? And what if the process that prospective innovators went through in such a new centre would submerge them within a space of systems analysis of intended and unintended consequences associated with their emerging new inventions and business ideas? ... What if we really took seriously the innovation needs that our societies face – such as, climate change, ecosystem impacts, biodiversity loss, inequitable access to education, information and capital, and many other issues – and crafted more unusual settings to foster, pressure test and launch those innovations? What if Einstein was right and the thinking (in this case, about innovation) that got us into this situation is not the same thinking that will get us out of it? What if a new way of thinking about built systems could help us innovate to a new climate-adaptive, resilient world? Why not innovate that new centre into being?"

I'm a big fan of "what if" thinking because it is another tool that forces us to change perspective and think differently, and ask good questions. I agree with Waage that disruptive innovations are essential if the global economy is going to grow and new jobs are going to be created.

August 27, 2013

Messing with Your Mind

"The era of robots learning and reacting the same way humans do may still be in the realm of science fiction," writes the editorial team at No Camels, "but a young Israeli researcher is making impressive advances in the way computers understand and interact with humans." ["Will Computers And Humans Make Decisions Together In The Future?," 23 April 2013] The article continues:

"Dr. Ya'akov (Kobi) Gal from Ben-Gurion University of the Negev‘s research focuses on designing 'intelligent agents' – algorithms that run on computers, robots and smart phones – and interact with other people as well as computers. The goal of his research is to enable computers to understand how humans make decisions and to use this information to foster cooperation between robots and humans in domains such as strategic negotiation and education. Basically, Gal is not trying to teach computers to think for humans, but rather, to think with them. Gal's thesis is that the future role for intelligent agents is not to replace human professionals, a fear that is often expressed, but rather 'share the problem solving' with humans in a way that best utilizes people's abilities."

Gal's research may be trying to "get into the heads" of individuals to see how they think and act, but his intelligent agents aren't biologically intrusive. Ongoing research elsewhere quite literally wants to get inside people's heads. Nick Bilton reports, "Scientists haven't yet found a way to mend a broken heart, but they're edging closer to manipulating memory and downloading instructions from a computer right into a brain." ["Computer-Brain Interfaces Making Big Leaps," New York Times, 4 August 2013] Bilton explains:

"Researchers from the Riken-M.I.T. Center for Neural Circuit Genetics at the Massachusetts Institute of Technology took us closer to this science-fiction world of brain tweaking ... when they said they were able to create a false memory in a mouse. The scientists reported in the journal Science that they caused mice to remember receiving an electrical shock in one location, when in reality they were zapped in a completely different place. The researchers weren’t able to create entirely new thoughts, but they applied good or bad feelings to memories that already existed. 'It wasn't so much writing a memory from scratch, it was basically connecting two different types of memories. We took a neutral memory, and we artificially updated that to make it a negative memory,' said Steve Ramirez, one of the M.I.T. neuroscientists on the project."

[Image: Memory manipulation in mice. Source: Riken]

If you think that creating fake memories doesn't sound very useful (or even sounds a bit creepy or sinister), you could be right. On the other hand, the research could lead to some important treatments for people with mental disorders. "The ability to learn and remember is a vital part of any animal's ability to survive," explains Brian Dodson. "In humans, memory also plays a major role in our perception of what it is to be human. A human is not just a survival machine, but also reads, plans, plays golf, interacts with others, and generally behaves in a manner consistent with curiosity and a need to learn." ["Inception: Artificial memories implanted in mice," Gizmag, 1 August 2013] He continues:

"Forgetting where we put the keys is a standard part of the human condition, but in the last few decades our knowledge of more serious memory disorders has grown rapidly. These range from Alzheimer's disease, where the abilities to make new memories and to place one's self in time are seriously disrupted, to Post-Traumatic Stress Disorder, in which a memory of a particularly unpleasant experience cannot be suppressed. Such disorders are a powerful force driving research into discovering how healthy memory processes function so that we can diagnose and treat dysfunctional memory function."

Mind manipulation has been a staple of Hollywood movies for years. Total Recall, Eternal Sunshine of the Spotless Mind, The Matrix, and Inception all featured a form of mind manipulation in their plots. Bilton writes:

"In the movie, 'Eternal Sunshine of the Spotless Mind,' a character played by Jim Carrey uses a service that erases memories to wipe his brain of his former girlfriend, played by Kate Winslet. But it seems the movie's screenwriter, Charlie Kaufman, was selling science short. 'The one thing that the movie "Eternal Sunshine of the Spotless Mind" gets wrong, is that they are erasing an entire memory,' said Mr. Ramirez of M.I.T. 'I think we can do better, while keeping the image of Kate Winslet, we can get rid of the sad part of that memory.' Hollywood and science-fiction writers, of course, have had fun with memory manipulation over the years. In the film 'Total Recall,' which is based on a short story by Philip K. Dick, a character played by Arnold Schwarzenegger receives a memory implant of a fake vacation to Mars. In 'The Matrix,' characters can download new skills like languages or fighting techniques to their mind, much like downloading a file to a computer. Far-fetched? Perhaps, and we're not yet fighting our robot overlords as the humans were in 'The Matrix,' but researchers really are exploring ways to upload new information to the brain."

In the movie Inception, Leonardo DiCaprio plays a skilled corporate spy who uses technology to steal business secrets from the minds of executives while they sleep. None of these plotlines exactly follow the course being pursued by the Riken/MIT researchers. Dodson explains:

"The MIT team genetically engineered the hippocampal cells of a new strain of mouse so that the cells would form a light-sensitive protein called a channelrhodopsin (ChR) that activates neurons when stimulated by light. This involved engineering the mice to add a gene for the synthesis of ChR, but that gene was also modified so that ChR would only be produced when a gene necessary for memory formation was activated. In short, only neurons actively involved in forming memories could later be activated by light. Initial work using the genetically engineered mice focused on determining what neurons in the hippocampus are associated with forming a new, specific memory. There were at least two schools of thought on how memory engrams were stored – locally or globally. They discovered that a memory is stored locally, and can be triggered by optically activating a single neuron. 'We wanted to artificially activate a memory without the usual required sensory experience, which provides experimental evidence that even ephemeral phenomena, such as personal memories, reside in the physical machinery of the brain,' says lead author Steve Ramirez. The new results came from a chain of behavioral experiments. The researchers identified the set of brain cells that were active only when a mouse was learning about a new environment. The genes activated in those cells were then coupled with the light-sensitive ChR. These mice were then exposed to a safe environment in a first box, during which time the neurons which were actively forming memories were labelled with ChR, so they could later be triggered by light pulses. Next the mice were placed in a different chamber. While pulsing the optically active neurons to activate the memory of the first box, the mice were given mild foot shocks. Mice are particularly annoyed by such shocks, so this created a negative association.
When the mice were returned to the first box, in which they had only pleasant experiences, they clearly displayed fear/anxiety behaviors. The fear had falsely become associated with the safe environment. The false fear memory itself could be reactivated at will in any environment by triggering the neurons associated with that false memory."

It really doesn't take much imagination to see how development of brain-writing techniques could be used for either good or evil purposes. Imagine being able to learn new subjects, like a new language, while you sleep. In the wrong hands, the technology could be used to create ideological zealots capable of carrying out terrible acts. As Bilton reports, "Writing to the brain could allow us to interact with our computers, or other human beings, just by thinking about it." He concludes:

"But some researchers don't appear to be worried about that sort of thing. In his book, 'Beyond Boundaries: The New Neuroscience of Connecting Brains with Machines — and How It Will Change Our Lives,' Dr. Nicolelis said he believes it is possible that humans will be able to communicate wirelessly without words or sound, where brain waves are transmitted over the Internet. 'I think this is the real frontier of human communication in the future. We already can get our monkeys, and even humans, to move devices just by thinking,' he said. 'Once you can write to the brain, I can imagine the same type of logic working for communication where your thoughts and a message will be communicated to another human being and they will be able to understand it.' It looks like mending that broken heart, through manipulation of our memories, might be here closer than we think."

Transhumanists believe that connecting human and artificial intelligence is inevitable and isn't something that should be feared. Clearly, this research has the potential to help sufferers of mental conditions. The jury is still out on how it could be used by people with evil designs.

August 26, 2013

Preparing for Supply Chain Disruptions

"After tsunamis, protests, wildfires, and riots — to name just a few recent major disruptions — few managers can be unaware of companies' vulnerability to the vagaries of politics and extreme weather," writes Mary Driscoll. She editorially adds, "You'd think." ["Research: Why Companies Keep Getting Blind-Sided by Risk," Harvard Business Review Blog Network, 18 July 2013] "Yet," she reports, "three quarters of the 195 large companies surveyed recently by APQC got hit by an unexpected major supply chain disruption in the last 24 months." Driscoll's question is: Why are these disruptions unexpected in light of recent history? In many cases, she notes, "C-suite executives had to get involved in the fix-it process for a sustained period of time." She finds it even more puzzling that these disruptions were "unforeseen" considering that "these are the same senior executives and middle managers that have supposedly been embracing formal enterprise risk management (ERM) for some time." Which raises another question: "Why did these systems fail so spectacularly?" The quick answer is: Shortsightedness. Driscoll explains:

"Part of the problem stems from the familiar gap between the talk and the walk. Survey findings indicate that most organizations' leaders did indeed express concern about the impact of political turmoil, natural disasters, or extreme weather. But the findings also show that the people at the front lines of the business were hamstrung by a lack of visibility into risk. Nearly half said they lacked the resources needed to adequately assess business continuity programs at supplier sites. Many relied on the suppliers filling out perfunctory, unreliable checklists. It's likely that the push to protect profits during the recession made matters even more difficult for supply chain operators. Seventy percent of the respondents to the APQC survey say their organizations pruned their lists of suppliers over the past five years, with the intent to reduce costs. Moreover, nearly three-quarters (74%) of the companies over the period added suppliers physically distant from their facilities, with 63% acknowledging that their suppliers are located in areas of the world known for high-impact natural disasters, extreme-weather events or political turmoil. It appears the urge to source in low-cost regions clouded the cost-versus-risk calculus for some."

In several previous posts on the subject of supply chain risk management, I've noted that the "push for profit" can have a negative impact on supply chain resiliency. For example, in one previous post, I wrote:

"Time and again the issue of balance between 'lean' and 'resilient' supply chains is raised by risk management experts. In his doctoral dissertation entitled Supply Chain Resilience: Development of a Conceptual Framework, an Assessment Tool and an Implementation Process, Timothy J. Pettit, included a graphic that, in a very general way, illustrates why balance is the key.

[Figure: Pettit's "Zone of Balanced Resilience" graphic]

"Pettit's point is that resiliency does come with a price that can erode profits. A lack of resiliency, however, can also affect profits and even expose a company to total failure. Hence, finding what Pettit calls the 'Zone of Balanced Resilience' is essential. In his abstract, Pettit writes, 'The business environment is always changing and change creates risk. Managing the risk of the uncertain future is a challenge that requires resilience – the ability to survive, adapt and grow in the face of turbulent change. ... Findings suggest that supply chain resilience can be assessed in terms of two dimensions: vulnerabilities and capabilities.'"
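Pettit's trade-off can be illustrated with a toy cost model (my own simplified sketch, not Pettit's actual framework): resilience investment is a direct cost, while the expected loss from disruptions falls as investment rises, so total cost bottoms out somewhere in the middle.

```python
import math

# Toy illustration of a "zone of balanced resilience" (a hypothetical
# sketch, not Pettit's model): total cost combines resilience spending
# with expected disruption losses, which shrink as spending grows.

def expected_disruption_loss(investment, base_loss=100.0, decay=0.05):
    """Hypothetical expected loss from disruptions at a given investment level."""
    return base_loss * math.exp(-decay * investment)

def total_cost(investment):
    return investment + expected_disruption_loss(investment)

# Scan spending levels; the minimum marks the balanced zone.
best = min(range(201), key=total_cost)
print(f"balanced investment: {best}, total cost: {total_cost(best):.1f}")
```

Under-investing leaves the firm exposed to the full expected loss, while over-investing pays for resilience it never recoups; the minimum sits between the two extremes.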

Analysts at the Strategic Sourceror state the problem this way:

"As a business attempts to become more efficient, it may make its supply chain increasingly vulnerable to risk. Businesses use strategies such as outsourcing, supplier consolidation and low cost sourcing to improve efficiency, but these practices can add risk and a supply chain is only as strong as its weakest link, [David Oxland and Richard Kettle from] Supply Management stated. Risk analysis in strategic sourcing is crucial, and failure to identify and minimize risks can lead to profit loss." ["Supply chain risk management important to business success," 30 April 2013]

Driscoll calls this the triple whammy: lengthened supply chains, pruned supplier lists, and doing business in risky areas. The Strategic Sourceror article goes on to state that most companies enter outsourcing arrangements with their eyes wide shut. "Ninety percent of firms fail to perform a risk assessment before outsourcing, Supply Management found." Catherine Bolgar agrees with Driscoll that companies really have few excuses for being blind-sided by risks. "Keeping tabs on supply-chain risks sometimes seems like removing weeds in your garden," she writes, "every time you get one area under control, a new risk pops up." ["Emerging Risks," Supply Chain Risk Insights, 8 April 2013] Bolgar continues:

"Some companies still take a reactionary approach to supply-chain disruptions; more mature companies take a proactive approach. The best companies look farther out, toward emerging risks, for full resilience, says Nick Wildgoose, global supply chain product leader at Zurich Global Corporate, based in London. 'You're looking at future threats — or future opportunities. If you can cope better as an organization, you can perform better.' The more effective organizations carry out analysis with a long horizon."

Let's be clear, even with great analysis, no company can make itself immune to risks and supply chain disruptions. The best companies can do is make themselves more resilient. Last year Dr. David Simchi-Levi, a professor at MIT and founder of the consulting firm OPSrules, introduced something he calls the Risk Exposure Index. Simchi-Levi's methodology helps companies calculate the financial impact of supply chain disruptions as well as the estimated time to recovery or TTR. ["Risk Exposure Index Starting to Gain Traction, Change Supply Chain Thinking, David Simchi-Levi Says," Supply Chain Digest, 24 April 2013] Simchi-Levi told Supply Chain Digest editor-in-chief Dan Gilmore that "the effort to collect information on TTR across the supply chain changes a company's approach to risk management. First, such companies realize they don't have this data, and when they do collect the information there are usually some surprises. Second, the approach then often spurs companies to find ways to reduce TTR, and thus the financial impact."
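The article doesn't spell out Simchi-Levi's formula, but the core idea — rank each supply chain node by the financial impact of losing it over its time to recovery — can be sketched as follows (supplier names and figures are invented for illustration, and this is a simplification of the actual Risk Exposure Index):

```python
# Hypothetical sketch of a TTR-based risk exposure ranking. All names
# and numbers are made up; this illustrates the general idea rather
# than Simchi-Levi's actual methodology.

suppliers = [
    # (name, lost profit per week if the node fails, time to recovery in weeks)
    ("resin supplier",   120_000, 8),
    ("chip fabricator",  300_000, 26),
    ("packaging plant",   40_000, 2),
]

def exposure(lost_profit_per_week, ttr_weeks):
    """Financial impact of a disruption at this node over its recovery time."""
    return lost_profit_per_week * ttr_weeks

# Rank nodes from highest to lowest exposure.
ranked = sorted(suppliers, key=lambda s: exposure(s[1], s[2]), reverse=True)
for name, impact, ttr in ranked:
    print(f"{name:16s} exposure = ${exposure(impact, ttr):,}")
```

Even this crude ranking shows why collecting TTR data produces surprises: a low-spend supplier with a long recovery time can dominate the exposure list.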

Lloyd's, the world's largest specialty insurance market, notes, "As production networks and supply chains become increasingly globalised, localised incidents can have a number of major effects on every level of business at home and abroad – from manufacture to distribution to sales." ["Building Supply Chain Resilience," 8 May 2013] Tom Teixeira, a partner in the global solutions consulting group at Willis, told Lloyd's, "What people forget is the supply chain now is a global network and that’s why there really is a need to get into quite a lot of analytics to understand what the pinch points are." In other words, doing their homework should be an essential task for any company that relies on a global supply chain. Driscoll agrees. She writes:

"Supply chain disruption risks often [get] painted as ... operations-level risks and for that reason never [make] it onto the list of 15 or so major strategic/enterprise risks assessed and managed by the Chief Risk Officer's formal ERM process. Many ERM assessments focus on risks related to competitive strategy or the customer experience. The result is that too many boards don't think to ask about — and are not briefed on — the risks of, say, sourcing key components in risky regions of the world. They wind up blind, therefore, to many crucial strategic risks. 'The important thing is to figure out what might be a severe disruption and to do this you have to look down into the different tiers of supply. People at the top need to ask: "What might be out there that we are not currently aware of,"' says Dr. Paul Walker, an expert in ERM at St. John's University in New York."

There are no silver bullet solutions for assessing and mitigating supply chain risk. I like Bolgar's weed analogy: "Every time you get one area under control, a new risk pops up." Vigilance, analysis, and awareness are the keys to addressing the challenge.

August 22, 2013

Nasdaq Technical Glitch Highlights System Vulnerabilities

Trading today came to a sudden halt on the Nasdaq exchange as the result of a technical glitch. Jacob Bunge, Kaitlyn Kiernan, and Tomi Kilgore reported that the "unexplained technical issue paralyz[ed] the market for thousands of securities and rais[ed] new questions about the robustness of U.S. trading systems following a series of high-profile glitches." ["Nasdaq Market Halts Trading," Wall Street Journal, 22 August 2013] They continue:

"The outage saw a large chunk of the U.S. stock market effectively come to a standstill at midday, freezing prices in stocks, exchange-traded funds and options listed on Nasdaq and prompting other trading venues to stop trading those securities. Dark pools and other electronic trading platforms were also forced to suspend trading in Nasdaq-listed stocks, since there were no publicly quoted prices on those securities, traders said. Traders said there was confusion about what stocks were affected, and that phones were lighting up across trading desks as investors tried to figure out what was happening."

The halt to trading lasted over three hours and Nasdaq shares dropped more than 3 percent. Bunge, Kiernan, and Kilgore concluded their article by noting that this "problem is the latest in a string of technology-related mishaps affecting exchanges and brokers as markets over the past two decades have migrated to electronic systems." As the following Bain & Company graphic shows, the financial services sector is enormous in terms of the amount of data it involves.

[Figure: Bain & Company graphic on data volumes in the financial services sector]

Since the global economy took a nosedive in 2009, there has been a lot of chatter about organizations that are "too big to fail" without dire consequences. If ever there is a sector that is too big to fail without such consequences, the financial services sector is the poster child. Nowadays it is impossible to isolate the effects of financial failure. Regulators should use this latest wake-up call to force financial services organizations to conduct a vulnerability assessment and take measures to close vulnerability gaps. Enterra Solutions has developed a methodology well suited to such a task — its patented Enterprise Resilience Management Methodology (ERMM)™.

Today’s typical business organization operates in an environment of extreme complexity and enterprise stress. This is certainly true in the financial services sector. Generally, companies continually face: ongoing demands of new requirements, competition, and operational performance; compliance pressures (e.g., regulations, directives, and policies); security threats (e.g., corporate espionage, cyber-intrusions, internal criminal activity, and natural disasters); and other business issues associated with investors, industry partners, and all levels of domestic and international government organizations. Meeting these demands requires organizational systems that provide them with a high degree of visibility, insight, control, and responsiveness. These systems must also provide real-time information about external events and about internal processes; the ability to effectively intervene in those events and processes to minimize negative impacts; efficiently marshal information from any point in an organization and direct it to any other point; and redirect and adapt an organization's resources as needed when a threat arises or an opportunity emerges.

Most organizational systems fall short of this ideal and typically only provide static solutions to dynamic challenges. Enterra’s ERM Methodology takes a holistic approach so that an organization can identify and protect its most valuable assets. In addition, enterprises’ existing legacy environments do not readily interface, nor is their data easy to integrate (from a technical or security standpoint). These systems are often built around outdated policies that are continuously re-written and updated with the expectation that the technology will be able to immediately exploit them − nothing could be further from reality.

The ERM Methodology diagnoses the security, compliance and performance requirements, and risks of organizations and determines how to make them resilient to those risks. Enterra’s approach to accomplishing the goals of a strategic risk management program follows a four-step phased approach. The first phase involves the initial assessment. During this phase, assets or nodes within an organization or a network that are critical for competitiveness and sustainability are identified along with the critical processes and functions that enable the critical assets. Additionally, the security, compliance, and performance requirements that apply to each of the processes and functions are analyzed, along with business opportunities and prioritized business objectives. The analysis is performed at a level of detail such that the rules and processes may be later codified as needed into an automated SaaS solution.

The next phase is a design and build phase. During this phase, a design is developed for migrating information and converting business policies into rules sets and workflow logic that operate across systems and functional organization. The third stage involves solution delivery and the final phase is the operational phase. When fully implemented, the Enterprise Resilience Management Methodology integrates performance optimization, compliance, and security into a truly seamless and enduring solution that is embedded in advanced cloud service delivery.

Quantum Computing and Machine Learning

"The brain performs its canonical task — learning — by tweaking its myriad connections according to a secret set of rules," writes Natalie Wolchover. "To unlock these secrets, scientists 30 years ago began developing computer models that try to replicate the learning process. Now, a growing number of experiments are revealing that these models behave strikingly similar to actual brains when performing certain tasks." ["As Machines Get Smarter, Evidence Grows That They Learn Like Us," Scientific American, 24 July 2013] How well are machines learning? The answer to that question really depends upon what you are asking them to learn. Dominic Basulto reports, "Not only are machines rapidly catching up to — and exceeding — humans in terms of raw computing power, they are also starting to do things that we used to consider inherently human. They can feel emotions like regret. They can daydream." ["Humans Are the World's Best Pattern-Recognition Machines, But for How Long?" Big Think, 24 July 2013]

Such reports might give the impression that super computers, powered by artificial general intelligence, are about to make humans obsolete (or at least an inferior species to machines). Iain Thomson, however, reports that, in spite of the dramatic advances being made by computer scientists, current artificial general intelligence systems are about "as smart as a somewhat-challenged four-year-old child." ["IQ test: 'Artificial intelligence system as smart as a four year-old'," The Register, 16 July 2013] He explains:

"Researchers at the University of Illinois at Chicago have applied an IQ test to MIT's ConceptNet 4 artificial intelligence system, and determined it's about as smart as a somewhat-challenged four-year-old child. The team used the Wechsler Preschool and Primary Scale of Intelligence Test on the system and found it performed reasonably well on vocabulary and recognizing similarities, but scored very poorly on comprehension. 'ConceptNet 4 did dramatically worse than average on comprehension - the "why" questions,' said Robert Sloan, professor and head of computer science at UIC, and lead author on the study. 'If a child had scores that varied this much, it might be a symptom that something was wrong.' ConceptNet 4 has now been replaced with a smarter AI system, ConceptNet 5, but Sloan said its predecessor's performance highlighted one of the fundamental problems with generating true artificial intelligence. Such systems have great difficulty generating what humans call common sense, since that all-too-rare capacity requires not only an extensive amount of factual knowledge, but also subjective facts we learn in life."

George Dvorsky sees the UIC research as little more than a publicity stunt. The fact that computers lack common sense, he insists, "is exactly why the AI is not nearly as smart as a 4-year old. It's just a glorified calculator at this point — crunching numbers, running scripts, and making probability assessments." ["No, we didn't just create an AI that’s as smart as a 4-year old," io9, 16 July 2013] He continues:

"What it's not doing are all those things that make a 4-year-old so brilliant: living in an environment and learning from experience. What's more, the AI is not embodied, nor does it have the biological inclinations that drive human tendencies. It's also important to remember that a four-year-old's brain is in full-on developmental mode; it's a work in progress that's being forged by experience. Intelligence is not something that's constructed, it's something that develops over time. Sometimes I get the feeling that AI developers simply want to create an end-product AI and say, 'voila, here's an intelligent entity right out of the box.' But that's not how intelligence comes about, and that's not how it works — at least not in the human sense of the term."

Okay, so computers aren't about to take over the world. Nevertheless Basulto believes that as computing power increases and machines become more adept at pattern recognition, their ability to learn will increase rapidly. He explains:

"The future of intelligence is in making our patterns better, our heuristics stronger. In his article for Medium, Kevin Ashton refers to this as 'selective attention' — focusing on what really matters so that poor selections are removed before they ever hit the conscious brain. While some — like Gary Marcus of The New Yorker or Colin McGinn in the New York Review of Books, may be skeptical of [Ray] Kurzweil's Pattern Recognition Theory of Mind, they also have to grudgingly admit that Kurzweil is a genius. And, if all goes according to plan, Kurzweil really will be able to create a mind that goes beyond just recognizing a lot of words. One thing is clear — being able to recognize patterns is what gave humans their evolutionary edge over animals. How we refine, shape and improve our pattern recognition is the key to how much longer we’ll have the evolutionary edge over machines."

Wolchover reports that one promising algorithm is "used by a computer model called the Boltzmann machine, invented by Geoffrey Hinton and Terry Sejnowski in 1983." She writes that it "appears particularly promising as a simple theoretical explanation of a number of brain processes, including development, memory formation, object and sound recognition, and the sleep-wake cycle." Sue Becker, a professor of psychology, neuroscience, and behavior at McMaster University in Hamilton, told Wolchover, "It's the best possibility we really have for understanding the brain at present. I don’t know of a model that explains a wider range of phenomena in terms of learning and the structure of the brain." Wolchover notes that "the Boltzmann machine bears the name of 19th century Austrian physicist Ludwig Boltzmann, who developed the branch of physics dealing with large numbers of particles, known as statistical mechanics. Boltzmann discovered an equation giving the probability of a gas of molecules having a particular energy when it reaches equilibrium. Replace molecules with neurons, and the Boltzmann machine, as it fires, converges on exactly the same equation."
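The correspondence Wolchover describes can be seen in a minimal Boltzmann machine: each unit switches on with a probability given by a sigmoid of its net input, so over many updates the network samples states with probability proportional to exp(−energy), the same form as Boltzmann's equilibrium distribution. A small pure-Python sketch with arbitrary, untrained weights (the values below are illustrative, not a model of anything):

```python
import math
import random

random.seed(0)

# Symmetric weights with zero self-connections, plus per-unit biases.
W = [[0.0, 1.5, -1.0],
     [1.5, 0.0, 0.5],
     [-1.0, 0.5, 0.0]]
bias = [0.1, -0.2, 0.0]
state = [0, 1, 0]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def energy(s):
    """Energy of a state; low-energy states are sampled more often."""
    e = -sum(bias[i] * s[i] for i in range(len(s)))
    e -= 0.5 * sum(W[i][j] * s[i] * s[j]
                   for i in range(len(s)) for j in range(len(s)))
    return e

def gibbs_step(s):
    """Stochastically update one randomly chosen unit given the others."""
    i = random.randrange(len(s))
    net = bias[i] + sum(W[i][j] * s[j] for j in range(len(s)))
    s[i] = 1 if random.random() < sigmoid(net) else 0
    return s

# Run the sampler; the chain converges toward the Boltzmann distribution.
for _ in range(1000):
    gibbs_step(state)
print("sampled state:", state, "energy:", energy(state))
```

Training a real Boltzmann machine adjusts `W` and `bias` so that the sampled distribution matches the data distribution, but the sampling dynamic above is the part that mirrors statistical mechanics.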

Devin Powell reports that a new algorithm is being added to the machine learning kit. "In a series of papers posted online this month on the arXiv preprint server," he writes, "Seth Lloyd of the Massachusetts Institute of Technology in Cambridge and his collaborators have put a quantum twist on AI. The team developed a quantum version of 'machine learning', a type of AI in which programs can learn from previous experience to become progressively better at finding patterns in data. Machine learning is popular in applications ranging from e-mail spam filters to online-shopping suggestions. The team's invention would take advantage of quantum computations to speed up machine-learning tasks exponentially." ["Quantum boost for artificial intelligence," Nature, 26 July 2013] As stated, a quantum computer is necessary to take advantage of the algorithm. Powell explains:

"At the heart of the scheme is a simpler algorithm that Lloyd and his colleagues developed in 2009 as a way of quickly solving systems of linear equations, each of which is a mathematical statement, such as x + y = 4. Conventional computers produce a solution through tedious number crunching, which becomes prohibitively difficult as the amount of data (and thus the number of equations) grows. A quantum computer can cheat by compressing the information and performing calculations on select features extracted from the data and mapped onto quantum bits, or qubits. Quantum machine learning takes the results of algebraic manipulations and puts them to good use. Data can be split into groups — a task that is at the core of handwriting- and speech-recognition software — or can be searched for patterns. Massive amounts of information could therefore be manipulated with a relatively small number of qubits. 'We could map the whole Universe — all of the information that has existed since the Big Bang — onto 300 qubits,' Lloyd says. Such quantum AI techniques could dramatically speed up tasks such as image recognition for comparing photos on the web or for enabling cars to drive themselves — fields in which companies such as Google have invested considerable resources. (One of Lloyd's collaborators, Masoud Mohseni, is in fact a Google researcher based in Venice, California.) 'It's really interesting to see that there are new ways to use quantum computers coming up, after focusing mostly on factoring and quantum searches,' says Stefanie Barz at the University of Vienna, who recently demonstrated quantum equation-solving in action. Her team used a simple quantum computer that had two qubits to work out a high-school-level maths problem: a system consisting of two equations. Another group, led by Jian Pan at the University of Science and Technology of China in Hefei, did the same using four qubits. Putting quantum machine learning into practice will be more difficult.
Lloyd estimates that a dozen qubits would be needed for a small-scale demonstration."
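For scale, the "high-school-level maths problem" the two-qubit demo tackled — a system of two linear equations — is trivial classically; the quantum algorithm's appeal is that its advantage grows with the size of the system. A classical baseline using Cramer's rule (the example coefficients are my own, echoing the x + y = 4 form quoted above):

```python
# Classical baseline for the two-equation systems mentioned in the
# article: solve a*x + b*y = e and c*x + d*y = f by Cramer's rule.

def solve_2x2(a, b, e, c, d, f):
    det = a * d - b * c
    if det == 0:
        raise ValueError("system is singular")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# e.g. x + y = 4 and x - y = 2  ->  x = 3, y = 1
print(solve_2x2(1, 1, 4, 1, -1, 2))
```

Number crunching like this scales polynomially in the number of equations on a conventional computer; the 2009 quantum algorithm Lloyd's team builds on is interesting precisely because, for suitable inputs, it sidesteps that cost.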

Powell's comment about Google's "considerable" investment in machine learning may be a reference to its recent purchase of a D-Wave quantum computer. As I noted in a post entitled Quantum Computing: Is the Future Here?, Google's primary interest in quantum computing is advancing research into machine learning. As breakthroughs continue to be made in the area of quantum computing, machine learning should advance as well.

August 21, 2013

Supply Chain Management and the Future

"Supply chain management, as a practice in commercial operations," writes Lora Cecere, "is now thirty years old." ["Why?" Supply Chain Shaman, 19 June 2013] As she looks back over the past three decades, she finds the following:

  • "We have made Improvements in Productivity. Due to improvements in connectivity, 90% of industries have made improvements in productivity (revenue/employee). The chemical and consumer electronics industries have made the most progress.

  • "Balance Remains an Issue. Companies are stalled on improving customer service and forecast accuracy.

  • "Complexity has Grown. It comes in many flavors – increase in inventory, changes in sales policies, new product lines – all add to the complexity. Supply chains have not morphed to manage the complexity at the same cost, quality and level of customer service.

  • "Cycle Management is Stalled. The only industry that has made progress in inventory management is consumer electronics."

Fifteen years ago, Andrew Cox and Richard Lamming also noted that some aspects of the supply chain (like purchasing) had changed very little over the decades even though manufacturing had seen a number of dramatic changes. ["Managing supply in the firm of the future," European Journal of Purchasing & Supply Management, June 1997] The abstract for their article noted:

"A fundamental, imminent change is taking place in the way firms approach the management of relationships with other firms and within themselves, and that this challenge is being faced by managers with one hand tied behind the back—there appears to be a lack of conceptual understanding. Building this understanding into practice requires the dismantling of traditional and existing perspectives about managing supply."

Daniel Dumke reports that in Cox's and Lamming's paper, "four key themes emerged in the conceptual development of the management of 'supply chains'." ["Managing supply in the firm of the future," Supply Chain Risk Management, 27 August 2012] Those themes were: It is necessary to take a total supply chain view; the chain is an unsatisfactory metaphor: the firm is part of a network; the firm concentrates on its core competencies and outsources everything else; and the firm is an unsatisfactory unit of analysis. Dumke goes on to provide key insights he was able to draw about the first theme (i.e., it is necessary to take a total supply chain view). They were:

  • "'The consumer is perceived to be at the end of a supply chain—a series of value-adding events and activities that leads to the provision of a desirable—valuable—product or service.'

  • "'Supply chain management is […] a process of realignment of activities, from each firm's point of view, in order to reduce value losses, so that the output from the total chain satisfies the consumer and results in the success of all parties to the chain.'

  • "'Within the [value] stream are barriers—interfaces between companies. The management imperative is thus to design those interfaces with minimum impediment, to allow value to flow.'"

Fifteen years on, analysts like Cecere believe that an even broader view of the supply chain needs to be taken — one that encompasses a company's supplier's supplier to its customer's customer. Analysts with this broader view of the supply chain also recommend that the best way to break down barriers between companies is to foster collaboration.

Next, Dumke identifies the key insights from the second theme (i.e., the chain is an unsatisfactory metaphor: the firm is part of a network). They were:

  • "'The so-called "chains" often contain looped relationships (where the customer is also a supplier to the supplier), lateral links (where the supplier is a supplier to both the customer and another supplier), dependencies (where the performance of one supplier is intrinsically linked to that of another) and other non-linear facets which deny the convenience of thinking in simple terms.'

  • "'Adopting a network perspective can lead to perceiving supplier relationships as indistinguishable from customer relationships.'

  • "'The contribution of the network metaphor to understanding the matter of managing value comes from its method of grappling with complexity.'"

As noted above, supply chains have grown even more complex over the past decade and a half, and it is no longer necessary to point out to most people that supply chains are really networks. Although the metaphor of a chain has survived, analysts have tried to stress the importance of supply chain management by talking about value chains rather than supply chains. For example, Cecere writes:

"My definition [of supply chain management] is wide. In my mind, the processes of supply chain cross over and overlap at the ends of the supply chain. Yes, I believe that supply chain overlays on top of the sales and marketing organizations and the procurement function. I feel that these organizations have defined their roles too narrowly. Too few are focused on the role of the organization in driving value networks to improve value-based outcomes. Most lack the understanding and they are just not incented to build value networks. The focus has to be market to market. I believe that the supply chain team can make a difference, but it requires reschooling. It is about much more than trucks and sheds. Or pumps and valves. Or contracts and negotiations. In my mind, it is the end-to-end process: from the customer's customer to the supplier's supplier." ["Yes, I Am a Zealot. I Still Believe. Do You?" Supply Chain Shaman, 28 September 2012]

Key insights that Dumke drew from Cox and Lamming's paper about their third theme (i.e., the firm concentrates on its core competencies and outsources everything else) are:

  • "'Managers have, for some time, been encouraged to view their firms as a combination of "core" competencies – those which it is deemed essential to own in order to compete in a market – and, by process of elimination, "non-core" competencies.'

  • "'[…] the issue of whether to "make or buy" is not straightforward; it is always a problematic issue for the firm which may be resolved either through vertical integration or through outsourcing. The key strategic decision for the firm is to decide what the boundaries should be between the two extremes of internal or external contract.'"

I daresay that the jury remains out on this topic. The debate about what is a "core competency" and what can be outsourced remains active.

Finally, Dumke draws insights from Cox and Lamming's paper on their final theme (i.e., the firm is an unsatisfactory unit of analysis: the flow of value takes place in a loosely aligned array of assets and competencies over which no one commercial organization has ultimate control). They are:

  • "'The further away from the core competencies of the firm, the less there is a need for medium asset specific skills to be vertically integrated, and thus the more support may be expected for outsourcing the activity.'

  • "'[…] firms are best viewed as a "nexus of contracts". The importance of this interpretation is that it forces us to see firms not as fixed entities, existing as objects within a static market structure, but as potentially fluid and flexible constructs whose internal structures and external boundaries may change as circumstances dictate and opportunities require.'"

Although Cox and Lamming may have been correct that the firm is an unsatisfactory unit of analysis, today most analysts discuss the importance of corporate alignment. For example, Cecere writes:

"Companies have focused on vertical processes. There is a need for cross-functional alignment through horizontal processes. Companies with strong horizontal processes of revenue management, new product launch, Sales and Operations Planning (S&OP), Supplier Development and Corporate Social Responsibility have higher performance. ... By and large, organizations are not aligned to drive cross-functional performance. Based on recent research, we find that companies that have invested in Supply Chain Centers of Excellence ... rate themselves higher on cross-functional alignment. The presence of these centers is relatively new. Without them, the gaps are large."

In other words, even though there are numerous parts of the supply chain that may need to be measured, the disparate parts of a corporation must be aligned to achieve maximum performance and profitability. Rich Sherman and Bob Sabath, discipline experts in supply chain management with Trissential, believe that, in order to make quicker progress in supply chain management, rules must be broken. ["Breaking the Rules of Supply-Chain Management," SupplyChainBrain, 25 July 2013] The article explains:

"Sherman says innovation isn't possible without a certain amount of rule-breaking. 'It’s all about doing things differently – creating new and disruptive technology, processes and ways of doing business.' ... How can companies determine which rules need breaking? 'If it's something that everybody does, and people can move from company to company and plug in how it's done, often it's worthwhile to sit back and say, "Do we ever have to do that?"' says Sabath. Every procedure should be subjected to a rigorous cost-benefit analysis, he adds."

Cecere agrees that each company must look at its circumstances, its value chain, and its practices and determine how policies, procedures, processes, and practices should be adapted to fit. "I strongly feel that we have evolving practices," she writes. "I have seen too many companies adopt practices that were not a good fit because they were recommended as best practices." The bottom line is that the topic of supply chain management is large and there is plenty of room under the tent for adaptation and experimentation.

August 20, 2013

Additive Manufacturing and the Future of the Supply Chain

Sweeping statements are rarely true. That's why a headline that declared "Today’s complex global supply chains are poised to be dismantled" caught my eye. The summary of the article, which was written by Paul Brody, states, "Thanks to the growth of 3D printing, intelligent robots, and open-source hardware, tomorrow's supply chains will be faster, smaller, cheaper, and local." [Gigaom, 21 July 2013] There is certainly a kernel of truth in that statement, but I'm certain that not all global supply chains are going to be dismantled. The question really is: How disruptive is additive manufacturing going to be to supply chains? Noted MIT professor Yossi Sheffi writes, "The additive manufacturing revolution is underway, and product supply chains lie directly in its path of creative destruction. Which ones, if any, will survive?" ["Does 3D Printing Doom the Supply Chain?" Supply Chain @ MIT, 18 July 2013] Brody continues:

"Supply chains today are big, complex and global. Keeping them humming is an enormous challenge. But does it have to be that way? We think the world is entering the era of small, simple and local supply chains, powered by a new generation of manufacturing technologies such as 3D printing, intelligent assembly robotics and open-source hardware – also known as the Software Defined Supply Chain."

Clearly, all of those advances are going to affect supply chains as well as manufacturing. Professor Sheffi provides a glimpse of how some of the changes could play out. He writes:

"Some supply chains will become obsolete as a result of this flexibility. For example, 3D printers in auto repair shops and retail outlets could make certain auto components on site, eliminating the need for these items to be delivered by suppliers. Many expedited shipments will not be necessary as the technology matures. When a production line goes down, for instance, the part needed to fix the problem might have to be shipped from a faraway supplier using expensive same-day delivery services. Simply printing the part in situ avoids this costly transportation option. Scenarios like these do not augur well for express delivery companies. But the news is not all bad because alternative business opportunities will open up. Delivering the raw materials that feed 3D printers is such a possibility."

Unlike many forecasts about what the future holds, Brody claims that his predictions are going to become reality in the near term. "The 3D printing revolution is not a decade or more away," he explains, "it's going to start showing up in mass production within the next five years. Despite skepticism, research demonstrates 3D manufacturing improvements combined with the expiration of key patents will lead to a 79 percent reduction in average cost to print objects in five years, and a total of nearly 90 percent over the next 10 years." Brody's mention of patents raises the real fly in the ointment when it comes to additive manufacturing. Michael Weinberg, Senior Staff Attorney and Innovation Evangelist at Public Knowledge, explains:

"3D Printing has all the makings of a great disruptive technology. ... It also raises some interesting legal issues. As we have seen from the rise of the internet, the ability to easily create and share goes hand-in-hand with the ability to copy and distribute. ... Copyright is historically used to protect creatively conceived works that serve no functional purpose. That means that while many objects that come out of a 3D Printer — the sculptures and decorative baubles — will be protected by copyright, many more will not. As a result, copying those useful objects will not infringe on anyone's copyright. ... That does not mean that there is no way to protect these useful objects. Patent gives protection to many of the useful articles that are beyond the scope of copyright. ... While copyright protects creative expression the moment that it is fixed, someone with a patentable idea needs to make an affirmative decision to apply for a patent. That takes both time and money, and requires a showing of novelty and usefulness. ... If 3D Printing does gain wide adoption, the real secret will be to consider intellectual property concerns with an open mind and to ask a few simple questions. Is this really a new problem? Can the existing intellectual property regime cope with this problem? If not, what is the specific shortcoming? What are the wider effects of addressing that shortcoming? These questions should help us focus on what is truly new about 3D Printing, and what is just the status quo wrapped up in a fancy new technology."
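Incidentally, the cost forecast Brody cites (a 79 percent reduction over five years, nearly 90 percent over ten) can be translated into implied compound annual rates of decline. The percentages come from the quote above; the conversion itself is just standard compounding arithmetic, sketched here for illustration:

```python
def implied_annual_decline(total_reduction: float, years: int) -> float:
    """Compound annual rate of decline implied by a total reduction over `years`."""
    remaining = 1.0 - total_reduction        # fraction of the cost that is left
    return 1.0 - remaining ** (1.0 / years)  # per-year decline that compounds to it

# Brody's forecast figures, from the quote above.
five_year = implied_annual_decline(0.79, 5)    # roughly 27% per year
ten_year = implied_annual_decline(0.90, 10)    # roughly 21% per year on average

print(f"{five_year:.1%} per year over 5 years; {ten_year:.1%} per year over 10")
```

The lower ten-year average suggests the forecast assumes the steepest cost declines come early, consistent with Brody's point that expiring patents are a near-term driver.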

The editors at Bloomberg report, "3-D printing is already having a demonstrable effect on the economy." ["How 3-D Printing Could Disrupt the Economy of the Future," 14 May 2013] They point out that additive manufacturing has historically been used to produce prototypes; but, last year, "28.3 percent of the $2.2 billion global 3-D printing market was tied to the production of parts for final products rather than prototypes." They agree with most other pundits that additive manufacturing represents a disruptive technology. They conclude, "Disruption can be dangerous and scary. It can also lead to wondrous new businesses and ways of life. Perhaps more importantly, it's inevitable -- so get in front of it while you can." That's really the same message that Brody and Sheffi are trying to get across. Perhaps the biggest change that additive manufacturing will introduce is mass customization. While that may sound a bit oxymoronic, what it really means is that the average consumer will have access to affordable customized products. Sheffi explains:

"Customization offers another example of how the technology will close some doors and open others in the supply chain domain. 3D printing makes it much easier to tailor products to customer needs, even down to the individual level. By tweaking the computerized blueprint and maybe altering the mix of materials, manufacturers can produce a limitless number of design variations. This newfound versatility is likely to trigger a dramatic increase in the number of product SKUs, which adds complexity and hence cost to supply chains. The proliferation of SKUs will pose a major challenge for companies. On the other hand, 3D printers are smaller and more compact than traditional manufacturing installations, and require fewer and less skilled operators. As a result, they can be located closer to consumer locations. This close proximity to markets, coupled with the short lead times made possible by 3D technology, shortens supply chains and reduces the need for large inventories. Service levels can be improved since additive manufacturing is ideally suited to just-in-time operations. These are only the possibilities that we can imagine in this early stage of the technology’s evolution."

Ken Cottrill, a Global Communications Consultant at MIT's Center for Transportation & Logistics, believes that hype over additive manufacturing could be creating a false dawn. "3D printing is being hailed as a breakthrough technology that will revolutionize manufacturing and supply chain management," he writes. "This may be the case, but we should avoid repeating the mistake of relying on hype to judge its value." ["3D Printing: Let’s Not Manufacture False Dawns," Supply Chain @ MIT, 23 May 2013] He notes there are still a lot of questions that need to be answered about additive manufacturing. In addition to intellectual property rights concerns mentioned earlier, he indicates that questions remain about subjects such as quality, life cycles, and trust. He concludes, "Posing questions like these does not discredit a potentially paradigm-shifting technology; it helps us to take a step back and evaluate its evolutionary track dispassionately." Professor Sheffi probably agrees with his colleague; but, he still believes that it is important to envision what could be possible. He concludes:

"Imagine global networks of additive manufacturing machines that are attuned to local markets and can be reconfigured in real time as demand patterns change. Such a network would take supply chain agility to new levels. Or distribution centers that store and supply product blueprints rather than physical products, located 'in the cloud' or in server farms. Of course the world can be altered further if home-based 3D printing becomes the norm. In this world, every home is equipped with a printer capable of making most of the products it needs. Supply chains that support the flow of products and parts to consumers will vanish, to be replaced by supply chains of raw material. It's a compelling vision, but a long way off. Even assuming that consumers want to become micro manufacturing centers, the technology is many years away from such mass market applications. Meantime, 3D printing is a disruptive technology that will destroy many traditional manufacturing models. But reports that the concept of a supply chain will die at the hands of additive manufacturing are exaggerated."

We may well be at the dawn of a new age of manufacturing. Nevertheless, it is too soon to draw sweeping conclusions about how this new age will affect supply chains.

August 19, 2013

The Next Billion Consumers

Bain & Company analysts Wlademir Gomes, Louis Lim, Robert Schaus, and David Cooper report, "The global marketplace is minting a new set of consumers that's bigger than the current shopping base of the US and Europe combined." ["Getting ready to profit from the 'next billion' consumers," Insights, 14 September 2012] Who are these billion new consumers and where do you find them? They explain:

"They're younger. They're literate. They've got increasing access to the Internet. They're in Ukraine. They're in the Philippines. They're in Algeria. They're in China and India. They're the 30-year-old Brazilian woman living with her mother in a favela, who owns a TV and a cell phone with prepaid service but has never traveled by air. They're the Russian retiree who opts for cheaper Western durable goods over new domestic brands. They're the 23-year-old Indonesian woman with low brand loyalty who buys cosmetics in small amounts but buys them frequently, and sees a TV as her next big purchase. Meet the 1.2 billion people who will move out of subsistence poverty by the year 2020. They're the world’s newest consumers, those living in households where annual disposable incomes will surpass $5,000 for the first time. It will be their initial experience with discretionary income and they'll have distinct ideas about how they want to spend it. These new consumers are already starting to develop tastes and demonstrate preferences in some categories."

Robert J. Bowman, managing editor of SupplyChainBrain, reports that there are even more emerging market consumers ready to follow the lead of the next billion. "According to Ernst & Young," he writes, "the global middle class is set to burgeon from its current level of 1.8 billion to nearly 5 billion by the year 2030." ["Tomorrow's Global Consumer: Smack Dab in the Middle," 16 January 2012] Bowman indicates that the Ernst & Young report "locates what it calls 'the next three billion' members of the middle class in India, Brazil, Indonesia, Turkey, Eastern Europe and even parts of Africa." Like the Bain analysts, the Ernst & Young analysts see the majority of these new consumers as younger than today's consumers. "One key driver is the high percentage of young people in those developing countries, says Maria Pinelli, global vice-chair of strategic growth markets for Ernst & Young. As they enter consumption age, these citizens will be demanding good, steady paychecks, which will translate into disposable income."

Billions of new consumers represent a spectacular opportunity for many businesses. The Bain analysts warn, however, "consumer products companies that don't act quickly enough will risk losing out to faster global or local competitors." They claim that even for companies that sell products that "consumers won't develop a taste for ... for years ... now may be the time to set the stage for attracting them when they're ready." To attract them, however, businesses must understand them as well as the circumstances in which they live. Only big data analytics will be powerful enough to provide the kinds of insights necessary to sell to consumers living in specific neighborhoods in urban environments in emerging markets. Simply knowing macrotrends will not be good enough.

Macrotrends do provide a good place to begin understanding the next billion or so consumers. An earlier Bain report, for example, concluded that the "new middle class will be considerably poorer than today's middle class in the advanced economies. In China, for example, peak income will average about $18,000 per year in current dollars — more like a giant Poland than another US." ["The next billion consumers," by Karen Harris, Austin Kim, and Andrew Schwedel, 9 September 2011] Among their conclusions about what this means for business, they wrote:

• "This is a large market but at a much lower price point for many purchases. Due to the new consumers' relatively lower incomes, the overall basket of goods and services will differ from what consumers in advanced economies purchase.

• "Companies will need to target emerging markets with a different cost structure. Expect price points to remain at a lower level rather than assuming migration upwards across all products.

• "Marketers will have a transient opportunity to impact the tastes of those moving into the middle class."

Most everyone knows that China and India (two of the world's most populous countries) will see their share of new consumers enter the market. But the latest Bain report claims to have "found two important insights that can help companies as they pursue the next billion consumers." They are:

"First, the next billion opportunity extends beyond China and India. While China and India still will be the major developing markets, an army of about 350 million new consumers will come from more than 50 other countries, everywhere from Peru to Nigeria to Uzbekistan. That's a population as big as the US. Companies not yet on the path to leadership in China and India can consider going straight to these markets. Not only are the consumer populations on the verge of expansion, but there also are attractive opportunities to acquire local players. Second, our research found that in more product categories than anticipated, consumers behaved in a similar fashion across very disparate countries. ... That isn't the case for all categories, however, so companies need to look at their portfolio and determine which categories can be rolled out with the same strategies and which require different strategies from country to country. Understanding how shopping baskets differ across countries and income brackets can help prioritize when and how to reach the next billion."

The report goes on to stress the importance of data analysis because "each country is unique when it comes to the profile of its emerging consumer class." In addition, analysis needs to include "three dimensions: country, category and a company's own capabilities." The analysts point out, for example, that life expectancy could play a major role when companies are deciding where to invest as well as the types of products they might want to consider offering. They explain:

"Lifetime purchasing per individual among members of the next billion in Mexico and Ukraine is expected to be twice as long as it is for their counterparts in Indonesia and Vietnam — and more than three times longer than that of Nigeria. While most consumer goods firms are aware that their core market is, on average, getting older or younger, few fully quantify this by determining how many years of remaining spending there are for the average consumer. More spending years are certainly better."
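The "remaining spending years" idea in the quote above is easy to quantify: lifetime purchasing per consumer is roughly annual discretionary spend multiplied by expected remaining spending years. A minimal sketch of that calculation follows; all figures below are invented for illustration and are not Bain's data:

```python
def lifetime_purchasing(annual_spend: float, current_age: float,
                        life_expectancy: float) -> float:
    """Expected remaining spending years times annual discretionary spend."""
    spending_years = max(life_expectancy - current_age, 0)
    return annual_spend * spending_years

# Two hypothetical markets with the same annual spend ($5,000, the
# threshold the Bain analysts use) but different age profiles.
market_a = lifetime_purchasing(annual_spend=5000, current_age=30, life_expectancy=75)
market_b = lifetime_purchasing(annual_spend=5000, current_age=45, life_expectancy=68)

print(market_a)  # 225000
print(market_b)  # 115000
```

Even with identical incomes, the younger market is worth nearly twice as much per consumer over a lifetime, which is exactly why Bain argues that "more spending years are certainly better."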

The Bain analysts also report that the next billion consumers are going to be more technically savvy and literate than today's emerging market consumers. "There's no question," they write, "that consumer goods companies will have unprecedented access to the next billion as Internet penetration is booming and adult literacy rates are accelerating in most places, making it easier to establish a brand or win a loyal following than it has ever been in a developing market. But in prioritizing countries, companies are carefully evaluating how access to mind share (and thus commercial opportunity) varies, market by market." They maintain that understanding similarities and differences is important if companies are going to enter emerging markets successfully. They offer four recommendations: 1) Get in Early; 2) Look Ahead; 3) Earn Your Premium; and 4) Know What's Different.

Ernst & Young's Maria Pinelli agrees that a good understanding of local conditions is essential. "It's local production, with an intimate knowledge of consumer needs, that will have the upper hand," she told Bowman. The Ernst & Young report also emphasized that companies should be open to new opportunities. They may find that current offerings really aren't a good fit with emerging market preferences, tastes, or incomes. As a result, "a concerted effort to serve specific emerging markets is just as likely to lead to an entirely new product or service, boosting a manufacturer's total revenues."

Clearly, most analysts see the greatest opportunities in Asian and Latin American markets. But, as I noted in a post entitled "The African Continent: Emerging Markets Full of Potential and Challenges," African countries shouldn't be overlooked. A Boston Consulting Group report concluded, "A new consumer class is emerging across Africa — one with increasing purchasing power and a hunger for products and services that once seemed unattainable." ["Marketing to the Emerging Consuming Class of Africa," SupplyChainBrain, 30 January 2013] The message here is simple: Companies that want to attract their share of emerging market consumers need to start their efforts quickly. Although 2020 or 2030 may seem to provide a long lead-time until these new consumers emerge, the analysts cited above agree that waiting to move is a bad strategy.