
May 31, 2013

A Thought Probe Series on Tomorrow's Population, Big Data, and Personalized Predictive Analytics: Part 6, Getting Personal

In the final segment of this series on tomorrow's population, big data, and personalized predictive analytics, I want to get personal. The series has primarily focused on cities because that is where the majority of the world's population lives. We must remember, however, that the world's cities are occupied by millions of individuals. Too often we lump all city dwellers together and treat them as a homogeneous group (e.g., we say, "He's a New Yorker"). New York City residents know, however, that the city is a melting pot of cultures and individuals. Understanding these differences is at the very heart of what makes a city smart -- and only big data analytics can provide that understanding.

Big data can be empowering and transformative. Individuals, corporations, and governments all over the globe are generating zettabytes of data every year as they connect to networks using their computers and cell phones. People all over the world can search for and purchase consumer products; make dinner reservations at their favorite restaurants; perform research; conduct banking transactions across country boundaries; interact socially with their friends; perform activities associated with their careers; and deepen the interactions of their lives. And, as recent events have demonstrated, newly connected individuals in the developing world can also transform countries and drive social change using mobile telecommunications and social media. As these developing countries grow and millions more people come online, the next large marketplace in the global economy is being created for consumer goods and services that cater to the needs and desires of these newly connected consumers (who are estimated to number more than 2 billion). ["Capturing the world’s emerging middle class," David Court and Laxman Narasimhan, McKinsey Quarterly, July 2010]

However, big data presents challenges for today's networks and computing systems. Major challenges include: data comes from everywhere, is mostly unstructured, is in many shapes and forms, and is far too large to move. Historically, data has come from traditional structured sources, such as corporate and governmental computer systems. Today it increasingly comes from unstructured sources -- the Internet, mobile devices, social media interactions, GPS location information, weather models, RFID, and transportation and logistics scans -- that do not reside neatly within the tables and columns of traditional uniform databases and computer systems. What this means is that big data is too "big" and too "unstructured" to be leveraged by most organizations today.

Even if the data could be centralized, today's computing systems still have difficulty making sense of it -- that is, understanding and learning from the interactions between both related and seemingly unrelated data elements. Using current analytic techniques, most decision-making frameworks struggle to process and understand such volumes of data and then instigate actions that foster desired outcomes within timely decision cycles.

That is why it is essential in today's business and government environments to employ technologies that process and analyze big data in a way much like the human mind senses its environment and processes data. For example, an individual assesses the risks of crossing a street when a car is approaching. The mind processes variables like car speed, distance, obstacles, personal motor skills, and so forth, before making the decision to cross or wait. Like human thought processes, cognitive reasoning ingests and transforms data into prioritized information; creates rich referential connections between data elements; enables understanding and learning; and is then presented as actionable intelligence (within relevant timeframes). This kind of cognitive reasoning can be used by businesses and governments to take on some of today's most vexing challenges. Many of these challenges are cross-industry and cross-discipline in nature. They require complex simultaneous analysis on many levels that can model real, or open, world considerations. Some of these interdisciplinary challenges that apply to the global Consumer Products/Food and Retail Industries include, amongst many others, how to:

  • Feed and provide water to a hungry and thirsty planet;
  • Efficiently develop and deliver energy to a highly consumptive population without increasing the carbon footprint;
  • Motivate consumer-centric outcomes with differentiated insights to maximize value for the consumer, Consumer Packaged Goods companies, and retailers alike;
  • Personalize recommendations for global consumers about the products they purchase;
  • Develop and deliver new drugs to cure disease and increase quality of life;
  • Manage the global supply chain efficiently and with fewer risks; and,
  • More accurately predict weather and its impacts on the environment.

As noted in previous posts, each of the world's 50 largest cities is unique. Some have deep historical roots as large cities and others have only recently joined the list as a result of massive building efforts in places like China (see the attached map -- based on a map from Free World Maps). As the global population continues to grow, new "greenfield" cities are likely to emerge. Regardless of whether a city is labeled "brownfield" (i.e., older) or "greenfield" (i.e., newer), some of the challenges it faces are universal (e.g., infrastructure, sanitation, education, healthcare, food security, and so forth). Meeting those challenges, however, can differ significantly. That is where big data analytics plays its most important role.

[Map: The world's 50 largest cities]

Making smart decisions about urban growth and lifestyles is important. I agree with Parag Khanna, Director of the Hybrid Reality Institute and a leading geo-strategist, that cities will play a leading role in world affairs. ["Cities, Not Countries, Will Once Again be Key to World Order," The National, 26 March 2013] He argues, "Urban corridors are a force multiplier, a source of great strength." Such corridors can exist within a country as well as between countries. These corridors exist because they link the largest number of people and, therefore, provide the most opportunities (and challenges) associated with life's endeavors. At the international level, most of the traffic along these corridors involves the flow of information, goods, and services. Ensuring that these flows are optimized is going to require cooperation and technology. Khanna believes there are seven activities that cities must embrace if they are going to provide a good quality of life for their residents. They are:
  • First, use technology to empower the population. ...
  • Second, use scenario planning to forecast diverse possibilities and strategies for a turbulent world. ...
  • Third, complement urban master planning with economic master-planning. This means investing in the vocational training systems that prepare the labour force for rapidly shifting supply chains.
  • Fourth, use data and social media as a tool of governance to more efficiently deliver public services and manage traffic.
  • Fifth, constantly upgrade infrastructure to meet sustainability standards.
  • Sixth, expand the economic footprint through investing in special economic zones in neighbouring countries.
  • Seventh, and finally, think of all residents of increasingly multinational/ethnic cities not as citizens versus non-citizens, but as stakeholders.

For the remainder of this post, I want to concentrate on three of those activities: Using technology to empower people and companies; improving "rapidly shifting supply chains"; and upgrading infrastructure.

Using technology to empower people and companies

Since most of the new consumers that form the global middle class are found in cities, companies want to connect with them. Because business, like politics, is local, global companies need to act like local enterprises, regardless of their size, if they want to succeed in this new business landscape. That's because cities are so diverse. Each neighborhood has its own character, lifestyle, and preferences. From one street to the next, the culture can change dramatically. If companies want to get in front of the money, they need to understand neighborhood differences and tailor their offerings to meet local preferences. Although some of those offerings will be made available in local brick-and-mortar stores, tomorrow's business landscape is going to be dominated by mobile devices. This makes the digital path-to-purchase a critical element of any company's business strategy. The most successful companies will find a way to seamlessly weave together multi-channel sales opportunities.

In order to sell a product or service, however, manufacturers need to ensure that consumers know about it. That's where technology can empower consumers, manufacturers, and retailers. In mega-cities, the ability to target consumers will be the sine qua non of successful business strategies. For a more in-depth discussion of targeted marketing, read my posts entitled Personalization and Targeted Marketing, Part 1 and Part 2. Even areas that are supposedly "off the grid" have been penetrated by mobile phone technology. The so-called "bottom billion" who live in these areas still require products and services. There are profits to be made selling to the bottom billion, if the products and services can be tailored to their circumstances. Big data analytics can help companies and governments better understand the requirements of people living in these "off the grid" areas so that they are not destined to live forever in poverty and squalor.

Improving "rapidly shifting supply chains"

At Enterra Solutions, we believe that companies must obtain full visibility into their supply chains -- from the issuance of a purchase order (PO), continuing through production milestones, transportation (i.e., ocean shipping, rail/truck), to warehouse delivery, and ultimately shipping to a customer. In order to achieve this, they need to implement what we call Global Network Synchronization. For a company, Global Network Synchronization refers to the ability to understand the complex interactions and dependencies within its supply chain. If a supply chain can be synchronized, it can quickly adjust to disruptive events (e.g., production delays, raw material shortages, weather, port delays, labor disputes, and other events). Global Network Synchronization can also reduce systemic risk by looking for supply chain risk exposures and pre-planning to mitigate those exposures. A company's global sourcing strategy requires real-time supply chain visibility and understanding the perturbative effects of any delays, so that action can be taken to mitigate any negative consequences. Understanding the perturbative effects of a supply chain disruption is a complex multi-threaded analysis that must take into account the entire range of critical supply chain issues and risks.
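
Enterra has not published the internals of Global Network Synchronization, so the sketch below is only a rough illustration of the kind of multi-threaded analysis described above: it propagates a hypothetical delay through a simple supply chain dependency graph and shows how much of the disruption each downstream node absorbs. All node names, buffers, and numbers are invented for the example.

```python
# Hypothetical sketch: propagate a disruption through a supply chain
# dependency graph and report the residual delay at each affected node.
from collections import defaultdict, deque

# Each edge (upstream -> downstream) carries a buffer: days of slack stock
# that can absorb an upstream delay. All names and numbers are illustrative.
edges = {
    ("resin_supplier", "bottle_plant"): 2,
    ("bottle_plant", "filling_plant"): 1,
    ("filling_plant", "regional_dc"): 3,
    ("regional_dc", "retailer"): 0,
}

downstream = defaultdict(list)
for (src, dst), buffer_days in edges.items():
    downstream[src].append((dst, buffer_days))

def propagate_delay(origin, delay_days):
    """Return the net delay (in days) each node experiences after buffers absorb slack."""
    impact = {origin: delay_days}
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for nxt, buffer_days in downstream[node]:
            remaining = max(impact[node] - buffer_days, 0)
            if remaining > impact.get(nxt, 0):
                impact[nxt] = remaining
                queue.append(nxt)
    return impact

# A 5-day supplier delay is partly absorbed by buffers before reaching the shelf.
print(propagate_delay("resin_supplier", 5))
# {'resin_supplier': 5, 'bottle_plant': 3, 'filling_plant': 2}
```

In a real synchronization system the graph, lead times, and buffers would be fed by live purchase order, production, and shipment data rather than hard-coded values.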

Upgrading infrastructure

Cities are not going to get smarter if their infrastructure remains dumb and outdated. The single most important infrastructure a city needs is a good electrical grid. Without a stable and reliable source of electricity, cities can't attract businesses, create new jobs, or become an "always on" hub of activity. The next most important infrastructure requirement is a good water and sanitation system. Without such a system, the population is likely to remain unhealthy and exposed to disease. With water forecast to be in short supply in the future, having a state-of-the-art water system could mean survival for some cities. For more on this topic, read my posts entitled Water, Water, Every Where -- What Can We do About It? -- Part 1, Part 2, and Part 3.

Transportation infrastructure is essential for economic growth. Goods cannot move on dirt roads during severe storms. Container-laden ships cannot offload at ports whose harbors are not deep enough or whose docks can't handle the containers. Some analysts believe, however, that the greatest infrastructure shortfall in the developing world is found in the so-called "last mile" of distribution. For example, Andrew Youn and Nicholas Fusso write, "Today’s greatest need is not for scientists and engineers to create new tools. The real need is for better distribution of solutions that already work." ["Distribution, the Key to Unlocking the Development Toolbox," Next Billion, 25 April 2013] Although developing countries suffer from significant infrastructure shortfalls, even countries like the United States have infrastructure issues (see my post entitled Infrastructure, the Supply Chain, and Economic Growth). You cannot discuss transportation-related infrastructure issues in isolation. The logistics world has always been intermodal, but the complexity of orchestrating intermodal shipments is increasing. Only technology can deal with this complexity, which leads to the final infrastructure that is becoming essential for economic growth -- an information grid.

Analysts are predicting that we will shortly have an Internet of Things that will primarily involve machine-to-machine communication. Smart buildings, smart grids, smart robots, smart cars, and every other "smart" machine will be communicating on the Internet of Things ensuring that systems are functioning efficiently and effectively. Technology has always made progress easier and improved the quality of life for millions of people. I don't see this trend ending any time soon.

May 30, 2013

Making Everyone an Innovation Champion

Drew Boyd, a professor of marketing and innovation at the University of Cincinnati, believes that if your organization has an innovation champion you should kill him or her. Not literally, of course. He means that if you have a position with innovation in the title, that position should be done away with. ["Kill Your Innovation Champion," Psychology Today, 13 May 2013] He believes that such positions stifle rather than foster innovation. "Assigning a champion lets everyone off the hook," he writes. "Why innovate when we have our 'champion' to do it?" He continues:

"A study by the Association of Innovation Managers found that when companies assign innovation champions and establish separate funding, it threatens the R&D and the commercial departments. 'This kind of sponsorship opens the door for subtle forms of sabotage if the established business units believe that the innovation funding is inhibiting their ability to accomplish short-term objectives and take care of current customers. Without involvement, the commercial arm of an organization can also claim no responsibility for success or be blamed for failure.'"

A number of organizations have established a position titled something like "Chief Innovation Officer," but Boyd claims that people don't stay in those positions very long. He explains:

"If you won't kill your champion, no worry - they will go away on their own. The study also looked at what puts innovation managers at risk. Of the 15 innovation champions in the study, 10 left their organizations and became consultants, 4 joined smaller or start-up companies, and 1 retired. None returned to a Fortune 500 company."

Frankly, there is nothing wrong with the concept of innovation champions per se. Good ideas need people to push them from concept to reality. Rather than having a single "champion," most pundits recommend empowering everyone with the opportunity to become a champion. Ron Ashkenas, a managing partner of Schaffer Consulting, believes "the reality is that unless they're in research or product development, most people in organizations don't think of themselves as innovators." ["Innovation Is Everyone's Job," HBR Blog Network, 6 December 2011] He claims that employees are often discouraged from being innovative. "Many managers discourage their people from inventing new ways of doing things," he writes, "pushing them instead to follow procedures and stay within established guidelines." Another contributing factor, he asserts, is that too often employees think of innovation only in terms of "new products, services, or technology." That perception, he writes, "is problematic." He continues:

"Apple's success has been fueled not only by new products, but also by innovative approaches to packaging, retail sales, customer access, and partner agreements. Similarly, Toyota's growth has come as much from innovations in manufacturing, inventory control, and management systems as it has from new automobiles."

For Ashkenas, the bottom line is:

"Great organizations don't depend on a small number of exclusive people to come up with innovations. Instead they create a culture in which every employee is encouraged and empowered to innovate — whether it's in processes, products, or services. This leads not only to new customer offerings but also better margins, stickier customer relationships, and stronger partnerships with other firms. Moreover it leverages the brains and talents of thousands of people, any one of whom might generate either an incremental innovation or a breakthrough idea."

Tim Huebsch agrees with Ashkenas. "We often look right past our best sources of innovation," he states, "our own employees and everyone else around them. It isn't that we need to find someone else to do it but instead it needs to be part of everyone's job. Each and every person has ideas of how we can do things better. Some ideas are small and won’t have a huge impact on the organization themselves, but by finding ways to encourage, support and implement these small ideas builds a foundation that will foster innovative thought throughout an organization. As employees see their voices are heard and continue to be encouraged to come forward with ideas, it is amazing to see the energy, creativity and passion that is so easily ignited." ["Innovation — It isn’t someone else’s job; It is everyone’s job," Leadership and Community, 22 April 2013] Admittedly, it is easier to assert that innovation is everyone's responsibility than it is to create the environment, culture, and structure that make it a reality.

Ron Thomas, a human resources officer, believes that the first step in making everyone an innovation champion is cultivating better listening skills in executives. ["Best Ideas? They Can Come From Anyone – IF You’re Willing to Listen," TLNT, 5 November 2012] He's correct. If employees know that their ideas are going to fall on deaf ears, there is no reason to speak up. Thomas believes there are three dynamics that foster an innovative environment. They are:

  • Having an organization that creates an environment where everyone brings value and feels comfortable in being heard;
  • Having employees who believe that their voices can be heard;
  • Having leadership that will acknowledge and give credit to their employees.

Personally, I agree with Thomas that employees deserve to be recognized for their contributions. Boyd, however, doesn't agree. He believes that giving credit for good ideas is a bad idea. He bluntly writes, "Don't give credit for good ideas." He explains why:

"Tanya Menon from the University of Chicago describes the paradox of an external idea being viewed as 'tempting' while the exact same idea, coming from an internal source, is considered 'tainted.'

'In a business era that celebrates anything creative, novel, or that demonstrates leadership, "borrowing" or "copying" knowledge from internal colleagues is often not a career-enhancing strategy. Employees may rightly fear that acknowledging the superiority of an internal rival's ideas would display deference and undermine their own status. By contrast, the act of incorporating ideas from outside firms is not seen as merely copying, but rather as vigilance, benchmarking, and stealing the thunder of a competitor. An external threat inflames fears about group survival, but does not elicit direct and personal threats to one's competence or organizational status. As a result, learning from an outside competitor can be much less psychologically painful than learning from a colleague who is a direct rival for promotions and other rewards.'"

It seems to me that Boyd and Menon are trying to throw the baby out with the bathwater. There are actually two problems being discussed. The first is jealousy. Only a change in culture can correct that problem. The second problem is failing to give employees the same credit for their good ideas that executives give to ideas that come from outside the organization. It's the opposite of the "not invented here" paradigm that has traditionally plagued large companies. The solution to the second problem is, in fact, to acknowledge and give credit to employees, just as Thomas suggests.

Perry Rotella, a senior vice president and chief information officer at Verisk Analytics, accepts the notion that innovation is everyone's job but goes a step further. He believes that employees should be encouraged to become innovation leaders (i.e., innovation champions). ["If Innovation Is Everyone's Job, Why Not Be a Leader?" Forbes, 11 June 2012] That will only happen, he insists, if the right environment is created. That environment, he writes, is one that encourages risk-taking, gives people the time and tools to experiment, and encourages collaboration.

Supporting innovation champions with more than words and back slaps is important. "In order for innovation to flourish in your organization," writes Chuck Ferry, "your innovation champions must be supported through properly structured responsibilities, goals and resources. Otherwise, they will leave to pursue other opportunities, taking their energy and ideas with them." ["Cultivating Innovation Champions," Innovation Management.se, 30 January 2013] Although Ferry doesn't dismiss the notion that innovation is everyone's job, he clearly doesn't believe that everyone can be an innovation champion. He believes that innovation champions share unique characteristics that make them leaders. He cites a book written by Gerard J. Tellis, entitled Unrelenting Innovation. Tellis identifies four characteristics shared by true innovation champions: having a vision for the future mass market; being a maverick and dissenter; having the conviction to persist against heavy odds; and having a willingness to take extraordinary risks to bring their idea to fruition (including the ability to instill this trait in their teams).

Whether everyone in your organization is capable of becoming an innovation champion isn't really the point. Leaders will emerge if the right culture and environment exist. While it is important to recognize that good ideas can come from anywhere, those ideas will fall on barren ground if there isn't an innovation champion to push them through to the end. It doesn't require a title or special position to become an innovation champion; but it does require a bit of courage.

May 29, 2013

Business and Big Data

"There's a ton of information out there," Steven Rosenbush and Michael Totty write. "And businesses are figuring out how to put it to work." ["How Big Data Is Changing the Whole Equation for Business," Wall Street Journal, 8 March 2013] They note that the definition of big data is "squishy" but believe that the only thing businesses need to know is that they have "more information than they used to, it comes from many more different sources than before, and they can get it almost as soon as it's generated." Regardless of how you define or label this data, "businesses in a slew of industries are putting it front and center in more and more parts of their operations." Rosenbush and Totty continue:

"They're gathering huge amounts of information, often meshing traditional measures like sales with things like comments on social-media sites and location information from mobile devices. And they're scrutinizing it to figure out how to improve their products, cut costs and keep customers coming back. Shippers are using sensors on trucks to find ways to speed up deliveries. Manufacturers can trawl through thousands of forum posts to figure out if customers will like a new feature on their product. Hiring managers study how candidates answer questions to see if they'd be a good match."

In an opinion piece published in the Financial Times, Doug Laney, Hung LeHong, and Anne Lapkin, research analysts at Gartner, write, "Big data is one of the most hyped terms on the market today"; but, they also agree that big data "means big money for some." ["What Big Data Means for Business," 6 May 2013] Daniel Burrus, a technology futurist, believes, "We're starting to see that any company's competitive advantage is increasingly determined by the quality of the data they have and how they're using that data to make real-time decisions." ["Competitive Advantage Is Increasingly Determined By Your Data," Huffington Post, 8 May 2013]

The term "big data" has become so ubiquitous it won't be going away anytime soon -- despite protestations that its definition is unclear. One such protest comes from Ted Underwood, an Associate Professor of English at the University of Illinois, Urbana-Champaign, insists, "[The] conversation about 'big data' has become a hilarious game of buzzword bingo." ["Against (talking about) 'big data'," The Stone and the Shell, 10 May 2013] He continues, "The discussion is incoherent, but human beings like discussion, and are reluctant to abandon a lively one just because it makes no sense. One popular way to save this conversation is to propose that the 'big' in 'big data' may be a purely relative term. It’s 'whatever is big for you'." He's correct that "big" is a relative term and that it has little meaning for businesses beyond the fact that, whatever the size of the databases they must work with, it's the value of the insights gained from the data that is really important. Laney, LeHong, and Lapkin offer Gartner's definition of "big data": "Big data is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making." That definition might not advance the discussion too much further down the road, but it does make the point that advanced information processing (i.e., new technologies) are required to make sense of it.

They then discuss "three categories of business opportunities" in which "big data can unlock new business value in a wide variety of ways." Those areas of opportunity are: "Making better-informed decisions, discovering hidden insights, and automating business processes."

Better-informed Decisions

The types of decisions that can be enhanced by data analysis are endless. Laney, LeHong, and Lapkin mention, for example, "prices, promotions, staffing levels, or investments." Tim Kastelle, a senior lecturer at the University of Queensland Business School, asserts that the ubiquity of data and analysis means, "We’re all in the knowledge business now." [Innovation for Growth, 19 February 2013] If he's correct (and he probably is), then businesses must master the basics of how to leverage the data they use. Rosenbush and Totty relate how Caesars Entertainment Corporation uses big data analysis to help its employees make better decisions about healthcare treatments and how Catalyst IT Services, a Baltimore-based technology outsourcing company, uses big data analysis to make hiring decisions. Laney, LeHong, and Lapkin remind us that it is not just humans who are making better decisions through data analysis. They explain:

"[When Wal-Mart] wanted to help its website shoppers find what they were looking for more quickly. It developed a machine learning semantic search capability using clickstream data from its 45m monthly online shoppers combined with product and category-related popularity scores generated from text mining social media streams. Wal-Mart's resultant 'Polaris' search engine yielded a 10 per cent to 15 per cent increase in online shoppers completing a purchase (or around a billion dollars in incremental sales)."

Taylor Provost predicts, "Using the totality of an enterprise's data to make forward-looking business decisions, develop new products, and improve marketing efficiency will be so common that there won’t be a name for it." ["Prepping for the Big Data Future," CFO, 3 December 2012] There are hurdles, however, that are likely to be encountered along the road to this business-as-usual dream. Provost explains:

"Using Big Data to make better business decisions requires that all the data a company collects and manages be integrated, not locked away in silos. Part of that problem is technological — the integration is poor because the tools are outdated — and part is political. Data is power. It can sometimes determine compensation; it's frequently used to lobby for resources and position. People are not always willing to share."

Even if the cultural data-sharing challenge can be overcome, integrating disparate data is not a trivial task.

Hidden Insights

Laney, LeHong, and Lapkin write:

"Big data analysis can also be used to discover opportunities that are obvious only by looking at large sets of detailed data. Many organisations are mining vast pools of data to discover hidden insights that were previously unavailable to them - often in the development of new or enhanced products."

Obviously discovering hidden insights is closely linked to making business decisions. They discuss how Climate Corp uses big data analytics to determine weather-related crop risks. Rosenbush and Totty describe how UPS combined "GPS information and data from fuel-efficiency sensors installed on more than 46,000 vehicles" to gain insights. The results were impressive. They report, "UPS in 2011 reduced fuel consumption by 8.4 million gallons and cut 85 million miles off its routes."

Automating Business Processes

Laney, LeHong, and Lapkin write, "Finally, new technology can be used to leverage big data in real time, allowing analysis to be built into processes so that automated decision making can occur." The example they discuss is how McDonald's now uses highspeed image analytics to ensure that its buns are baked uniformly. My company, Enterra Solutions, provides an automated business solution for a problem that has plagued manufacturers for year: retailer compliance. Enterra’s Retailer Compliance Module enables an organization to proactively sense and adapt to changing retail requirements, reducing potential future penalties and improving deduction recovery. The system monitors and imports retailer compliance requirements, it applies analytics to identify gaps between the retailer's requirements and an organization's capability to comply. Not only does this address potential compliance issues long before orders are received, it provides a central communications hub and automates the collection of information to speed the research of penalties/shortages to help expedite disputes and exemptions. And by eliminating the time-consuming task of having to manually monitor retailer and carrier sites for requirement changes, it allows manufacturers to reduce their compliance department costs overall.

"Going forward," writes Burrus, "the type, quality, and relevance of the data will become far more important than the quantity of data, so being very good at managing these will create new ways to differentiate as well as find innovative approaches to creating and maintaining competitive advantage." He continues:

"So with all of this data coming in, it's clear that competitive advantage is going to be created by your use of data and by your ability to make sure you're getting good data. After all, bad data yields bad decisions. You want to be able to draw the right conclusions from your data, as that's what provides new opportunities, better solutions to problems, and new competitive advantage. ... If you want to solve seemingly impossible problems and find new competitive advantage, you have to look at the type and quality of the data you're generating and how you're using it. When your data can empower your people and your machines to make better decisions faster, you'll have increased competitive advantage."

No matter how you define big data, the analysis that is now being done on very large datasets is changing the business landscape forever.

May 28, 2013

Privacy: The 600-pound Gorilla in the Big Data Room

"[The] sophisticated world of Big Data analysis has arrived," writes Andrew Carswell, "a realm where both criminal indiscretions and consumer appetite will be targeted by a myriad of data analysts, or 'geek squads', whose sole purpose is to connect the dots - to cross reference your data, paint a picture of who you are, determine what you want and plan how best to serve you." ["Big Data analysis allows businesses and governments to mine your personal details," The Daily Telegraph, 13 April 2013] You have to admit that the thought of a room full of "geeks" tracking your every move and analyzing your life sounds a bit creepy and the debate about privacy is only going to heat up as more sophisticated analytic techniques are developed. Privacy is the 600-pound gorilla in the Big Data room. Carswell continues:

"This so-called Big Data, which has emerged as the boardroom buzz word for 2013, could be in the form of your bank transactions, your phone calls and texts, the treasure trove of personal details on your Twitter and Facebook account, your Google searches, the petitions you may have signed, the purchases you have made, the information captured by websites and electronic sensors. It is a world which is set to revolutionise the way governments provide services; a world which allows businesses to build intimate relationships with customers; but a world which will ignite an intense debate on the issue of citizen privacy."

As a businessman and technologist, I'm a big fan of big data and the potential it holds. The analysis of large data sets can make the world a better and, I admit, a more profitable place. We all know that there have been (and will be) mistakes made in the handling of personal data. Most businesses, however, don't really want to know the kind of personal information that most people would feel is too invasive. As Ruud Wanck, an early adopter who founded an internet company in 1994, told Robert Heeg, "On the web, the vast majority of business models are based on pseudonymous data; we don’t need to know who is who. And we don’t even want to, because it's incredibly expensive to save all that data." ["From Mad Men to Math Men," RW Connect, 8 May 2013] That's why some pundits, like MIT Professor Alex "Sandy" Pentland, believe that the commercial world and consumers will eventually work out a mutually agreeable arrangement in which consumers have control over their personal information. For more on Pentland's views, read my post entitled Big Data and Big Brother. John Carroll, Senior Director in Ipsos MediaCT and Chairman of the Media Research Group, agrees with Pentland. He writes:

"The battleground for data will be held between the vendor and the consumer. People will wise up very quickly to flip the relationship around. 'If you want my personal data, it will cost you'. Or, 'I will give you this bit of my data, but not all of it'. The word 'engagement' rears its head here. If consumers have a relationship with brands and trust them, then there will probably be a healthy two-way 'big data' relationship." ["I predict a big data riot," MediaWeek, 27 February 2013]

Regardless, Wanck agrees that, for those involved with big data, today's biggest challenge is privacy. In the heated discussions that are sure to come, Wanck insists, "We should differentiate between anonymous data and more specific personal data." He explained to Heeg, "Initial political interest in privacy created a very black-and-white, almost populist discussion. The only distinction made was the one between generic data, like how many people visit a website, and personal identifiable information." Heeg continues:

"In between the two, he argues, is a third category that is used predominantly by marketers and media companies. 'This is data that determines how people search for information on the web collectively.' As a company, GroupM says it is not interested in personal data. Media companies, after all, are built on their ability to reach large groups of people who have a similar profile in a short space of time. Wanck calls it 'pseudonymous' data, because it is personal but contains a large element of anonymity. 'Of course you could argue that, if I were to really look for someone with certain identifiable qualities and connect all types of data, I could possibly trace the data back to a certain individual. But why would we? It is not the business we're in. The essence of advertising is that it's more efficient to reach a large target group through marketing and communication in one go. That is more cost-efficient than to have a call-centre approach [for] individuals.'"

Jon Neiditz, a lawyer concerned with privacy issues, doesn't believe current privacy laws are sufficient. ["Big Data Will Turn Privacy Upside Down," ID Experts, 7 May 2013] In his article, he lists six "major concerns regarding the application of current privacy law to big data," and insists "the protection of privacy must have what are called in information security 'compensating controls'." Two such controls involve information security and data company accountability. Derrick Harris agrees with Neiditz that data security is going to play a critical role in the big data arena. He believes security is essential because, unlike Wanck, he insists that there is no such thing as pseudonymous or anonymous data. ["If there’s no such thing as anonymous data, does privacy just mean security?" DAM Foundation, 3 April 2013] He bases his conclusion on an MIT research paper that concluded:

"To extract the complete location information for a single person from an 'anonymized' data set of more than a million people, all you would need to do is place him or her within a couple of hundred yards of a cellphone transmitter, sometime over the course of an hour, four times in one year. A few Twitter posts would probably provide all the information you needed, if they contained specific information about the person's whereabouts."

While I'm sure a lot of people would simply like to see the collection of data halted altogether, Adam D. Thierer, a public policy analyst, believes that would be a bad idea. ["My Senate Testimony on Privacy, Data Collection & Do Not Track," The Technology Liberation Front, 24 April 2013] Testifying before a Senate committee Thierer made three primary points. They were:

  1. First, no matter how well-intentioned, restrictions on data collection could negatively impact the competitiveness of America’s digital economy, as well as consumer choice.

  2. Second, it is unwise to place too much faith in any single, silver-bullet solution to privacy, including 'Do Not Track,' because such schemes are easily evaded or defeated and often fail to live up to their billing.

  3. Finally, with those two points in mind, we should look to alternative and less costly approaches to protecting privacy that rely on education, empowerment, and targeted enforcement of existing laws. Serious and lasting long-term privacy protection requires a layered, multifaceted approach incorporating many solutions.

Although privacy concerns will continue to be raised whenever big data collection and analysis is discussed, Diane Mehta reports that the intensity of the debate may decrease in the years ahead. The reason might surprise you. It won't necessarily be because better solutions will be implemented but because expectations will have diminished. Mehta reports that even though Millennials assert they are not happy that their privacy is being invaded, their actions speak louder than their words. ["New Survey Suggests Millennials Have No Idea What Privacy Means," Forbes, 26 April 2013] She writes:

"They’re happy to give away their online privacy but say they're not. A new study by the USC Annenberg Center for the Digital Future and Bovitz Inc. suggests that Millennials (ages 18-34, aka 'digital natives') are completely confused. Seventy percent of Millennials say no one should have access to their data or online behavior. Yet 25% will trade it away for more relevant advertising, 56% will share their location for coupons or deals, and 51% say they'll share information with companies if they get something in return. In response to the survey, Jeffrey I. Cole, director of the USC Annenberg Center for the Digital Future, declares online privacy dead. 'This demonstrates a major shift in online behavior — there's no going back,' he says."

Those results also strengthen arguments used by Pentland and Carroll that we are in the midst of a period in which consumers and companies are wrestling to see who controls personal data and for what purposes. Elaine B. Coleman, managing director of media and emerging technologies for Bovitz, told Mehta, "[Millennials] perceive social media as an exchange or an economy of ideas, where sharing involves participating in smart ways." Mehta is quick to point out that Coleman works in an industry that relies on data collection and analysis and fears it might skew her opinions. She concludes:

"The larger issue is Internet privacy of course—and how much the discount-searching online shoppers really know about it. Do they understand exactly what data brokers are after—salaries, hobbies, pregnancies, retail transactions? They'll take the information whether Millennials want to give it away or not. As long as marketers can justify that people want to give their privacy away, they can go ahead and push for it."

According to Steve Lohr, companies realize that they can't leave the future of data collection and analysis to chance with the hope that things work out in their favor. "Corporate executives and privacy experts agree," he writes, "that the best way forward combines new rules and technology tools." ["Big Data Is Opening Doors, but Maybe Too Many," New York Times, 23 March 2013] Lohr concludes his column by discussing the work that Professor Pentland is doing at MIT. He writes:

"Dr. Pentland, an academic adviser to the World Economic Forum's initiatives on Big Data and personal data, agrees that limitations on data collection still make sense, as long as they are flexible and not a 'sledgehammer that risks damaging the public good.' He is leading a group at the M.I.T. Media Lab that is at the forefront of a number of personal data and privacy programs and real-world experiments. He espouses what he calls 'a new deal on data' with three basic tenets: you have the right to possess your data, to control how it is used, and to destroy or distribute it as you see fit. ... His M.I.T. group is developing tools for controlling, storing and auditing flows of personal data. ... Dr. Pentland's group is also collaborating with law experts, like Scott L. David of the University of Washington, to develop innovative contract rules for handling and exchanging data that insures privacy and security and minimizes risk. ... 'Like anything new,' Dr. Pentland says, 'people make up just-so stories about Big Data, privacy and data sharing,' often based on their existing beliefs and personal bias. 'We're trying to test and learn,' he says."

His team's work will surely be watched closely by commercial, government, and private stakeholders. The world of big data is predicted to continue its rapid growth and, I suspect, the privacy gorilla living in that world will grow right along with it.

May 27, 2013

Memorial Day 2013

Although Memorial Day was first celebrated to remember the war dead from the U.S. Civil War, it was eventually celebrated to remember those lost in all wars. It was meant to be a day of somber reflection and reconciliation. Unfortunately, each year there are more men and women to remember. Michael N. Castle once stated, "These fallen heroes represent the character of a nation who has a long history of patriotism and honor - and a nation who has fought many battles to keep our country free from threats of terror." Allen West isn't happy that the meaning of the day is being lost. He writes, "While there are towns and cities still planning Memorial Day parades, many have not held a parade in decades. Some think the day is for honoring anyone who has died, not just those fallen in service to our country."

On this day we certainly should honor and remember those who gave their lives in service to their country; but I don't think it is a bad thing to honor and remember all those who have passed before us. Centuries ago John Milton wrote, "They also serve who only stand and wait." To Milton, who had gone blind, this meant that everyone has a place in this world and, through whatever honest labor they perform, they help build the society in which they live. That service also deserves recognition. This Memorial Day is particularly poignant in light of the terrorist attacks during the Boston Marathon and the recent devastating tornado in Oklahoma that took so many lives.

Perhaps the best way we can honor the dead this Memorial Day is by contributing to help the living whose lives have been thrown into chaos. One of the organizations that is first to the scene of a disaster to offer help is the American Red Cross. If you want to donate to the Red Cross, you can do so by clicking on this link.

As sobering as it is to remember our dead, it would be foolish to deny that Memorial Day also marks the unofficial beginning of summer. That means many of you will be celebrating the day with family and friends with cookouts and activities. There is nothing wrong with that either. Those who have gone before us expect us to pursue life, liberty, and happiness. Have a safe and memorable holiday.

May 24, 2013

A Thought Probe Series on Tomorrow's Population, Big Data, and Personalized Predictive Analytics: Part 5, Beyond Smart Cities

I borrow the tagline for this post from Dr. Tim Campbell, author of a book entitled Beyond Smart Cities. Explaining the premise of his book, Campbell writes, "To really achieve smart cities — that is to create the conditions of continuous learning and innovation — this book argues that there is a need to understand what is below the surface and to examine the mechanisms which affect the way cities learn and then connect together." [Beyond Smart Cities] Campbell argues that the same technologies that will make cities smart will also help make the planet smarter -- something, he claims, the world desperately needs. He explains:

"Over the last few decades, nations states and bumped up against real limits in their ability to solve problems in global goods. The failures on climate change and trade in Copenhagen and Durban and Doha, respectively are only the most visible of many such issues, such as environmental sustainability, peak water, and renewable energy use. All have failed to gain much traction among national governments and international organizations. At best, progress has been slow and disappointing. Yet we cannot and should not surrender our ability to tackle global goods issues until all options are given a hearing. One unexpected source of new ideas is coming from below, from cities."

He claims his book reveals "a flowering of city-to-city exchange[s] in policy and practice that may hold promise for some global issues and many local issues of global significance." He continues:

"Cities can never solve globally significant problems by themselves, much less pay for the costs. But they are beginning to act like self-sponsored laboratories of invention. At the same time, they are establishing a new edge in best practice that can lead to more readily adoptable standards by nation states. Perhaps more important, cities are forging a system of learning and exchange that stretches right around the world, north and south, rich and poor. As national policy makers debate intractable problems of global goods, the solutions for some problems might be popping up where we least expect them."

I believe that Campbell is correct in his assessment that solving many of the world's most pressing challenges begins in the city. As researchers at MIT note, "In the future, cities will account for nearly 90% of global population growth, 80% of wealth creation, and 60% of total energy consumption. Developing better strategies for the creation of new cities is, therefore, a global imperative." [City Science website] To peel the onion one more layer, if cities are important in the effort to solve many of the world's challenges, then the people living in and governing those cities are even more important. Andrea Di Maio, a vice president and distinguished analyst in Gartner Research, claims that too much attention is being given to technologies and processes and not enough attention is being given to the people involved in smart city initiatives. ["Technology Is Almost Irrelevant for Smart Cities To Succeed," Gartner, 10 August 2012] "Technology," he writes, "is mostly irrelevant unless policy-makers, city managers, heads of department and city CIOs get the fundamentals right." He must have realized shortly after writing that sentence that it was too constrained in the list of people that matter. He continued:

"What really matters is how different sectors (not just government) cooperate and, how they can exchange meaningful information. Of course there is technology involved, but that's not enough to make cities smart. Cooperation requires solid governance and a roadmap that is respectful of (1) the different – and potential diverging – business objectives and timeframes of different stakeholders involved and (2) the inevitable resource constraints that affect most urban areas. Actually many people still associate smart cities to environmental sustainability and carbon footprint reduction, but the truth is that the main challenge going forward is financial sustainability and the ability to deal with an increasingly turbulent and uncertain future."

The MIT researchers quoted above mainly discuss building new infrastructure to handle the swelling number of urban residents expected in the future. As I pointed out in Part 4 of this series, smart city initiatives must deal with existing infrastructure as well as new construction. Di Maio agrees that "greenfield" projects aren't going to be sufficient to make cities smarter. He notes, "The vast majority of cities that need to become smart are developed in a brownfield environment. So there are major constraints as far as the physical infrastructure, the availability of budgets and how flexibly or cooperatively they can be used, the entrenched governance processes and the different ways in which city governments provide different services." He is spot on with that assessment.

Because smart city initiatives must be applicable to the real world in which we live, their implementation is going to require cooperation and partnerships amongst an array of stakeholders. It's going to be messy, but necessary. Di Maio explains some of the challenges:

"In some cities most public services are entirely operated by the city government: there are government-owned energy companies, transportation companies, water management companies, telcos, or they are even part of the city government itself. In other cases, city government is just a payer or a supervisor of services provided by external service providers. In either case, the roadmap to make – say – transportation smarter in conjunction with public safety is very different. If the city government owns both, it can ideally establish a common program or a common enterprise architecture or a common interoperability framework that both domains (transportation and public safety) will apply when cooperating on their smart objectives. If the city owns only one of the services and outsources the other, the cooperation must be built through a more careful negotiation process, where vendor management and governance play a much greater role. So, while technologies that can be applied to capture, process, exchange information, to control sensors and actuators, to analyze and visualize performances are pretty much the same around the world, the roles that city governments play in each and every one of the domains that must cooperate to make the city smart vary a lot. Early focus on how these roles can contribute to help or hinder smart city objectives is far more important than looking at technologies and vendors."

Concerning the changing urban landscape, analysts at Pike Research state, "The social, economic, environmental, and engineering challenges of this transformation will shape the 21st century." ["Smart Cities"] They continue:

"While there are many innovative pilot projects and small-scale developments that are looking at the smart city from a holistic perspective, there are no examples yet of a smart city that supports hundreds of thousands, never mind millions, of people. The smart city offers a coherent vision for bringing together innovative solutions that address the issues facing the modern city, but there are many challenges still to be faced. If the smart city is to truly become a blueprint for urban development, then a number of technical, financial, and political hurdles will need to be met."

For those interested in this topic, Pike Research offers a downloadable report about smart cities. The Smart Cities project, whose goal "is to create an innovation network between governments and academic partners leading to excellence in the domain of the development and take-up of e-services," also offers a number of downloadable publications on its website. All smart city initiatives rely on the collection and analysis of big data. Rather than seeing this as a "big brother" effort, MIT's Professor Alex "Sandy" Pentland believes that a win-win situation can be established that benefits individuals as well as their communities. As you will see from the following video, he also explains why targeted marketing has such a promising future and why that future is intimately connected to smart city initiatives.

I share the sense of optimism displayed by Professor Pentland. Big data analytics have the potential to change individual lives as well as cities and the planet. And if Professor Pentland is correct, and I believe he is, most of us will be voluntarily and cooperatively involved in making our cities and the world better places in which to live.

If this series of discussions about smart cities has convinced you of one thing, it should be that the collection and analysis of big data is critical. It should also be apparent that no one company is capable of addressing all of the challenges or providing all of the insights that will be necessary to achieve smart neighborhoods, smart cities, smart regions, smart nations, and/or a smart planet. That is why I'm suggesting that an open-architecture framework be created that can integrate information from diverse databases, apply big data analytics and artificial intelligence reasoning, and develop reference ontologies for the myriad of activities that combine to help explain the complex structure of neighborhoods, cities, regions, states, and, eventually, the planet.

I'm not suggesting that such a framework will be easy to create in the short-term. I suspect that it will take decades to develop properly so that actionable insights can be drawn from the repository of knowledge that will be created. As the knowledge repository is populated, I believe this framework can be built in such a way that it provides incremental benefits to neighborhoods, cities, regions, and nations. Because the knowledge repository will be linked to ontologies and artificial intelligence technologies, the system itself will provide new insights, discover new relationships, and research and test promising hypotheses that could significantly improve how people live and interact.
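
The framework proposed above is a long-term vision rather than an existing system, but a very small sketch can make the underlying idea concrete: store facts from separate city data sources as subject-predicate-object triples, merge them into one repository, and answer a query that spans sources. Every source, entity, and predicate below is hypothetical.

```python
# Minimal sketch of a triple-store-style knowledge repository that merges facts
# from separate (hypothetical) city data sources and answers a cross-source query.

transit_source = [
    ("line_4", "type", "subway_line"),
    ("line_4", "serves", "riverside_district"),
]
health_source = [
    ("clinic_9", "type", "public_clinic"),
    ("clinic_9", "located_in", "riverside_district"),
]

repository = transit_source + health_source   # integrate the two sources

def query(pattern):
    """Match triples against a (subject, predicate, object) pattern; None is a wildcard."""
    return [
        triple for triple in repository
        if all(p is None or p == v for p, v in zip(pattern, triple))
    ]

# What, across all sources, touches the riverside district?
for subject, predicate, obj in query((None, None, "riverside_district")):
    print(subject, predicate, obj)
# line_4 serves riverside_district
# clinic_9 located_in riverside_district
```

A real implementation would use a standard ontology language and reasoner rather than Python tuples, but the integration principle -- shared identifiers and relationships across otherwise siloed sources -- is the same.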

There is much we don't know about cities, and the data needed to help us know more is not readily available in some cases. For example, author Robert Neuwirth recently discussed on NPR's TED Radio Hour why squatters and slums are going to be critical to the planet's future in the decades ahead. Neuwirth believes these slums represent the "cities of the future" because by 2050 one out of every three people living on the planet will be living in them. Neuwirth notes that, while it may appear to outsiders that slums are lawless areas filled with poverty and crime, there actually are rules and organizations to be found within these "cities." Understanding the relationships in these areas (as well as their relationship to more formal urban networks) won't be easy because, by their very nature, most of them are "off the grid." Only the kind of framework I suggest above can help provide the kind of bottom-up insights that will help make the lives of those billions of people better. Improve their lives and you improve the world.

May 23, 2013

Marketing, Math, and Magic

Jason Stewart, who leads content marketing efforts at Demandbase, writes, "B2B marketers ... continually blanket the web with their offers and messages in the hopes of magically getting impressions in front of their target audiences." ["Why untargeted ads are a terrible investment," iMedia Connection, 8 May 2013] Obviously the magic doesn't occur as often as marketers would like. As the headline for Stewart's article proclaims, he believes that untargeted ads are one reason the magic doesn't happen. Although he grudgingly admits that untargeted ads "might work for massive consumer e-tailers because everyone (conceivably) is a potential customer," he's certain that the approach is a bad one when it comes to B2B marketing. He continues:

"The odds of business marketers hitting the very specific set of accounts they can actually sell to is very low. For some time now, B2B marketers have lacked the tools to effectively target, deliver, and measure online advertising. This deficiency in technology has led to a 10-year hangover, due in part to 'untargeted' advertising and strict straight-line measurements of success. ... I could go on for pages about the sheer amount of dollars wasted on untargeted ads, the loss of sales productivity following up on unqualified leads, or even the challenge of designing a one-size-fits-all web experience to cater to the behavioral preferences of an audience that will likely never fill out a form, let alone buy anything. These efforts are resulting in a bad hangover that rips across multiple business channels, from finance, to sales, to marketing. The challenge is that most companies don't have visibility into their traffic and therefore cannot begin to fix what they cannot measure."

Stewart believes that the cure for this hangover can be found in a complementary concoction of big data analytics and "real-time bidding (RTB), which gives marketers the ability to buy a single ad impression in an auction-like style across a variety of web properties through multiple advertising exchanges." Without big data analytics, the RTB scheme couldn't work because market segments couldn't be identified. Using big data analytics, companies can pinpoint "whether an audience is comprised of businesses or consumers, and then [by] identifying the context of subject matter, marketers can close in on advertising to the companies that are actually likely to buy -- a true target audience." It doesn't take much imagination to realize that this is a better approach for consumer marketing as well as business marketing. It's also why you are now hearing, "Mad Men are becoming Math Men." ["Mad Men are becoming Math Men," by Macala Wright, PSFK, 28 April 2013, and "From Mad Men to Math Men," by Robert Heeg, RW Connect, 8 May 2013] Wright explains:
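To illustrate the mechanics Stewart describes, here is a minimal sketch, in Python, of the kind of bid decision an RTB buyer might make once analytics has identified the target audience; the account list, topic tags, and bid values are hypothetical and purely illustrative.

```python
# A minimal sketch (hypothetical accounts, topics, and bid values) of a
# targeted RTB bid decision: only spend on impressions that analytics can
# tie to an account the business can actually sell to.

TARGET_ACCOUNTS = {"acme-manufacturing.example", "globex.example"}

def score_impression(impression):
    """Return a CPM bid in dollars, or None to sit out the auction."""
    # Is the visitor associated with a business we can actually sell to?
    if impression.get("resolved_company_domain") not in TARGET_ACCOUNTS:
        return None  # untargeted impression: spend nothing on it
    # Is the page context relevant to what we sell?
    relevance = 1.0 if "supply chain" in impression.get("page_topics", []) else 0.4
    base_bid = 2.50  # illustrative base CPM in dollars
    return round(base_bid * relevance, 2)

print(score_impression({"resolved_company_domain": "acme-manufacturing.example",
                        "page_topics": ["supply chain", "logistics"]}))  # 2.5
print(score_impression({"resolved_company_domain": "unknown.example"}))  # None
```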

"Mad Men are becoming Math Men with programmatic, transparent, scientific processes. ... Programmatic Marketing is the key phrase and practice for marketers in 2013. Programmatic marketing is a brand's use of their consumer data to measure and tailor messages to incite action – most notably in their digital display advertisements."

Wright borrows the term "programmatic marketing" from Dax Hamman, who notes, "The digital landscape has changed dramatically even in the last few years. Media fragmentation and personalized media are the new normal." ["Why you can't ignore programmatic marketing," Econsultancy, 9 January 2013] He continues:

"Marketers are collecting and mining mountains of data about those users — from the pages they visit to the items they search for and the social graphs they share with — and using it to tailor the delivery of relevant and unique messages. We call this practice 'programmatic marketing,' and it's still relatively new. But it's also clearly the future of digital marketing. Those who fail to catch on now will be left behind."

Hamman calls "data-driven targeting" an "incredible success." He concludes, "When properly analyzed, data tells you what your users want, how they're behaving, and the best ways to reach them — essential factors to effective marketing." Wright adds, "Programmatic marketing enables brands to optimize their media spends and eliminate spend waste by automating ad targeting through leveraging that data and predictive analytics. The practice of programmatic marketing is directly related to the contextual relevance of the ad content to the target audience's behaviors, needs, geographic locations and possibly other AIO [activities, interests, opinions] variables." According to an article published in eMarketer, however, most marketers have yet to master big data opportunities. ["What Do Marketers Want From Big Data?" 10 April 2013] In a survey conducted by the CMO Council and SAS, 61 percent of the respondents indicated that big data is "part obstacle/part opportunity, but we have a long way to go." The article concluded:

"There is no question that however marketers implement Big Data, whether at the operations or outreach level, or both, its role will only get bigger. Nearly six out of 10 analytics professionals from around the world surveyed by Lavastorm Analytics in February 2013 said that their company would be increasing investment in analytics."

Commenting on the eMarketer article, Jonathan Houston wrote, "Granted there are a lot of things outside of the people's control when it comes to gearing up for Big Data analysis. Very often fighting for the budget and the momentum to make infrastructure changes is a technical battle where the person tasked with marketing is out of their depth. Risk needs to be assessed; budgets allocated; technical infrastructure analysed and upgraded." ["What is the big deal about Big Data in marketing?" memeburn, 8 May 2013] Houston concludes:

"The greatest draw card that this does do for marketing is that it categorically changes its position in the business. This makes marketing scientific. This gives marketing a basis on which to make complex decisions that have a true measurable effect on the bottom line. Business takes that seriously, which in turn means that this will make business take marketing seriously. And that's important."

Geoff Livingston indicates that complexity is one reason that marketers are having a difficult time getting their arms around big data and targeted marketing. "Marketing today remains a great challenge," he writes, "in large part because of the consistently changing technology and media landscape." ["7 Daunting Challenges Facing Marketers," 8 May 2013] Adopting technology and automation is the first of the seven challenges he identifies for marketers. He asserts, "Balancing human intelligence, strategy, and frankly, likeability with the precision of data and analysis gleaned from Big Data is an ongoing challenge." The second challenge he identifies is data integration. Not only are there all sorts of structured and unstructured data that need to be integrated, but there are also corporate data silos to be overcome. "People still think in single silos within their own domain," he writes, "and [they] are not stretching to create better results for their organizations by teaming with other communicators." His third challenge for marketers involves "rapidly evolving media." He calls this "a huge issue." He explains:

"Today, media evolves so quickly that volatility is part of the game. What worked last year, won't this year, and the same goes for 2013. Look no further than the decrease Facebook has suffered in tactical viability for some types of business. Marketers need to move away from channel specific strategies, and adapt a truly liquid approach to communications. Meaning, deliver a complete content and engagement effort to serve stakeholders wherever they are, and how they like to receive information in that specific medium. Further, businesses should adapt an attitude of constant experimentation."

His next two challenges involve the "omnipresent Internet" and the "visual revolution" that have taken place as a result of mobile technologies. In this kind of environment, getting attention is becoming much more difficult. Livingston believes that businesses must create "specific experiences" for their target audiences. He believes that in the future most leads are likely to come from "online content and other forms of inbound marketing." Nurturing skills that make inbound marketing more effective is his sixth challenge. His final challenge is that marketers get "stuck" in one way of doing things and, thus, "are often limited in conversation to their tactical area of expertise."

Livingston is not sure that solutions to all of these challenges currently exist. Certainly big data analytics and targeted marketing help, but they are only part of the solution if marketers are going to make magic for their clients in the years ahead.

May 22, 2013

Sales & Operations Planning Must be Inclusive

Back in 2011, John Westerveld penned a two-part blog post that discussed "a new way to think about S&OP." ["Thoughts from Kinexions – A new way to think about S&OP, Part 1 and Part 2," The 21st Century Supply Chain, 21 and 24 October 2011] The old way of thinking about S&OP, he explained, involved a team that used some kind of demand planning system to create forecasts, which were then bounced off a traditional ERP system to see if the forecasts were doable. One of the problems with this method, he noted, was the assumption "that you will only need to evaluate a single demand plan." Anyone familiar with forecasts knows that they are seldom accurate enough to survive in the real world. Westerveld explained in Part 2 of his post that an S&OP process that could match the vagaries of the real world required six different functionalities. They are:

  1. You need a system that allows you to create 'what-if' scenarios instantly, and provides the ability to collaborate with these scenarios.
  2. You need to have both collaborative demand planning and complete supply planning in the same tool. The supply planning tool needs to accurately emulate the planning done by your ERP system.
  3. You need to be able to drive the supply planning system from the demand plan.
  4. Further, you need to be able to configure which forecast stream (or combination of streams) forms the demand plan (and therefore drives the supply planning process). This combined with #1 will allow you to evaluate and compare different demand planning scenarios.
  5. You need excellent reporting tools that allow you to understand supply issues, their cause, and potential resolutions.
  6. You need excellent reporting tools that allow you to present the recommended S&OP plan, the issues, and alternative resolutions to the executive team.
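To make functions 1 and 4 more concrete, here is a minimal sketch, in Python with purely illustrative numbers, of composing a demand plan from a configurable blend of forecast streams and comparing two "what-if" scenarios against a fixed supply capacity.

```python
# A minimal sketch (illustrative numbers only) of functions 1 and 4 above:
# compose a demand plan from a configurable blend of forecast streams, then
# compare "what-if" scenarios against a fixed monthly supply capacity.

FORECAST_STREAMS = {
    "statistical":       [100, 110, 120, 130],  # units per month
    "sales_input":       [120, 125, 115, 140],
    "customer_forecast": [ 90, 130, 125, 150],
}

def demand_plan(weights):
    """Blend the forecast streams by weight into a single demand plan."""
    months = len(next(iter(FORECAST_STREAMS.values())))
    return [round(sum(weights.get(name, 0) * stream[m]
                      for name, stream in FORECAST_STREAMS.items()), 1)
            for m in range(months)]

def evaluate(plan, capacity_per_month=125):
    """Flag the months where a scenario's demand exceeds supply capacity."""
    return [(month + 1, demand, demand > capacity_per_month)
            for month, demand in enumerate(plan)]

# Two what-if scenarios that differ only in how much weight sales input gets.
baseline   = demand_plan({"statistical": 0.6, "sales_input": 0.2, "customer_forecast": 0.2})
aggressive = demand_plan({"statistical": 0.2, "sales_input": 0.5, "customer_forecast": 0.3})
print(evaluate(baseline))
print(evaluate(aggressive))
```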

There are a few words in those six functions I would like to highlight; namely: collaborate, collaborative, and the repeated use of the word "supply." Although the word "supply" isn't found in "Sales & Operations Planning," the supply chain clearly plays an essential role in ensuring that any plan (or plans) can be implemented. That's why I was surprised when I read a headline declaring "The 'S' in S&OP Can Stand for Supply Chain, Too." [Becky Partida, IndustryWeek, 18 April 2013] It seemed odd to me that business executives needed to be reminded that their supply chains undergird their businesses. Partida, who is a supply chain management research specialist with APQC, notes, "Sales and operations planning (S&OP) has the potential to promote visibility within the enterprise and foster collaboration among business functions. However, the functions involved in the S&OP effort can vary from organization to organization." Notice that the word "collaboration" is once again used when discussing S&OP. She reports that analysts at her company asked a number of organizations to tell them what functions they included in their S&OP processes. The results are shown in the following graph.

[Figure 1: Functions included in organizations' S&OP processes, from the APQC survey reported by Partida]

One thing struck me immediately upon viewing the chart: Some companies appear to be conducting "sales and operations" planning without involving "sales and marketing," "manufacturing," and/or "logistics." I'm not sure how that's done. You would have thought that all those categories would have been 100%! Part of the answer may be that some companies simply don't have an S&OP process, which means nobody is involved. Malory Davies, editor of Supply Chain Standard, reports, "The concept of Sales and Operations Planning is hardly new, but a surprisingly large number of companies, 72 per cent, have only started to make use of it in the past five years, according to a survey by consultants Bearing Point." ["All going to plan?," 30 April 2013] Davies goes on to report, "The survey, which focused on process industries in western Europe, also found some significant weaknesses in how S&OP is implemented, notably the supporting IT systems and a lack of S&OP integration to risk management to the finance function as well as to supply chain partners." One of the recommendations made by Bearing Point was improving "collaboration across the company."

By now you should get the idea that S&OP processes need to be much more collaborative than they have been in the past. Peter Balbus, Managing Director of Pragmaxis, LLC, defines S&OP "as a set of integrated corporate-wide planning processes that enable senior management to strategically direct operations with the intent to achieve superior levels of performance on a sustained, long-term basis." ["How S&OP is Changing the Face of Advanced Supply Chain Management," Dustin Mattison's Blog, 5 May 2013] He went on to explain:

"It means integrating customer-focused sales plans, historic trend data and predictive analytics with supply chain management to enhance customer fulfillment capabilities, drive efficiency, improve resiliency and maintain agility to respond rapidly to changes in customer, market and supply dynamics. Successful S&OP processes align operations with the corporate business strategy. As companies have become more global and face rising complexity, volatility and uncertainty, the importance of S&OP is increasing. Especially in those industries that are well-served and where we see growth is flattening, S&OP becomes critical to competitive survival."

I appreciate the fact that Balbus mentioned corporate alignment. You simply can't align an organization if some parts of the organization are excluded from the S&OP process. That's why I insist that S&OP must be inclusive. Rich Sherman, a Supply Chain Discipline Expert at Trissential, insists, "Collaboration is the key ... to unlocking the hidden wealth in supply chain operations. For that, S&OP is among the most important best practices and processes a company can implement." ["Sales & Operations Is Only the Tip of the Iceberg," SupplyChainBrain, 13 March 2013]

Balbus also reinforces a point that Davies made earlier. He told Mattison, "The overwhelming majority of companies today have ongoing S&OP initiatives, significantly fewer have fully implemented these planning processes and even fewer are reporting successful outcomes among those that have." I suppose another way of stating that would be, "Companies know S&OP is important but they don't have a clue about how to implement it." Balbus notes that one market survey indicated that "90% of companies responding believe that a strong S&OP process improves supply chain agility and efficiency. All well and good. But only 13% of these same companies report having effectively tied S&OP planning to execution activities!" Balbus concluded his interview with Mattison by sharing a few thoughts about how companies can better implement S&OP processes.

"The key to any successful S&OP initiative is the ability of the organization to align demand, supply, inventory and financial plans easily and seamlessly against the overall business strategy. This requires a technology layer to enable demand translation -- that is, the modeling of changing product mix, and the visualization of equivalent units. This technology layer then becomes the system of record tying together the multiple S&OP plans. However, this alone is not sufficient to sense and define a response to buy- and sell-side market changes. All too often, plans are built on enterprise data – and not external market data. In addition, the process definition is inside-out (where enterprise data is used to predict future market shifts) rather than outside-in (where market data used to sense and shape responses based on market shifts). The most effective S&OP processes use both buy- and sell-side market data to bi-directionally align the organization from market-to-market. As the company shifts from inside-out to outside-in, data models must to be redefined and the technologies re-architected to reflect this profound change in orientation. These are fundamentally different data models. Another common barrier to achieving S&OP excellence is that companies are not sufficiently deliberate in their statement of goals, definition of governance practices, or the definition and alignment of key performance metrics. While companies universally state that they want to improve their S&OP processes and want to be agile, they often struggle to define what this means specifically for their company to make it 'actionable'."

Supply chain analyst Lora Cecere has been touting, for some time, the notion that organizational strategies need to be more outside-in rather than inside-out. If you haven't done so, you should join her Supply Chain Insights Community. Outside-in demand visibility often involves big data. Nari Viswanathan, Vice President of Product Management at Steelwedge, reports that companies are collecting more product, supply, demand, and finance data but "are not actually leveraging this data in their S&OP processes, thereby leaving 'blind spots' in their decision making processes around critical supply/demand trade-offs." ["Big Data Is Becoming a Big Deal for Agile S&OP," SupplyChainBrain, 13 March 2013] The biggest blind spots of all, however, can come from within an organization if planning isn't an inclusive activity. Partida concludes:

"Because S&OP affects the entire supply chain, organizations should give serious thought to involving representatives from the supply chain functions in this process. Organizations should consider whether any additional costs incurred in the planning process would be offset by increased efficiency or costs savings generated within the procurement, manufacturing, or logistics functions. It may be that involving the supply chain in S&OP is worth the investment."

I might have put it another way, "Is excluding parts of your organization from the S&OP process worth the cost?" The simple answer is, "No."

May 21, 2013

Supply Chains: Growing Risk, Less Preparedness

In spite of the numerous supply chain disruptions that have occurred over the past several years, results from the 2013 Aon Global Risk Management Survey point to "a significant decline in risk readiness among many of the survey respondents. On average, reported readiness for the top 10 risks dropped a material 7 percent (from 66 to 59 percent) from the 2011 survey and reported loss of income increased 14 percent." ["Worrying trend emerges from 2013 Aon Global Risk Management Survey," PR Newswire, 22 April 2013] Not all industry sectors were found in retreat: "Of the 28 industries defined in the report, only three industries – pharmaceutical and biotechnology, non-aviation transportation manufacturing and agribusiness – reported the same or improved levels of readiness this year." A decline in risk readiness is a bit surprising since risk awareness has never been higher. This was underscored by the fact that the number of respondents to the 2013 survey was 47 percent higher than the number who responded to the 2011 survey.

Stephen Cross, chairman of Aon Global Risk Consulting, stated, "One possible explanation of the decline in risk readiness could be that the prolonged economic recovery has strained organizations' resources, thus hampering the abilities to mitigate many of these risks. Our survey revealed that, despite diverse geographies, companies across the globe shared surprisingly similar views on the risks we are facing today – whether or not they feel prepared." Another possible explanation for the decline in risk readiness might be the result of increased risk awareness. Executives are no longer living in blissful ignorance when it comes to risks and, therefore, their sense of readiness may have declined.

Below is the Top Ten list of risks identified by respondents.

Risk Description | 2013 Rank | Projected 2016 Rank
Economic slowdown/slow recovery | 1 | 1
Regulatory/legislative changes | 2 | 2
Increasing competition | 3 | 3
Damage to reputation/brand | 4 | 8
Failure to attract or retain top talent | 5 | 5
Failure to innovate/meet customer needs | 6 | 4
Business interruption | 7 | 11
Commodity price risk | 8 | 7
Cash flow/liquidity risk | 9 | 10
Political risk/uncertainties | 10 | 6

Judging from the top three risks (which aren't predicted to change over the next few years), respondents seem convinced that economic recovery is going to be sluggish even as competition increases. Kenneth Rapoza notes that those risks "are largely uninsurable," which may account for their high priority. ["What Keeps Business Leaders Up At Night?" Forbes, 23 April 2013] Cross told Rapoza, "The [insurance] industry globally should wake up and think that these are the risks and concerns their clients worry about most." Thinking about it and being able to do something about it are two different things. As Rapoza asserts, "It's one thing to insure an oil spill or lost cargo deep in the Pacific Ocean. But how does a German nuclear power company insure against legislative risk?"

The press release notes, "Political risk/uncertainties broke into the top 10 risks for the first time in 2013. Due to the increasing civil wars and social and political conflicts around the world, this risk is projected to move up to number six in the 2016 survey." For more on that topic, read my post entitled Geopolitics and Supply Chain Risk. In what was the biggest surprise to me, weather/natural disasters did not make the Top Ten list for 2013. It ranked number 16 (although it is "projected to jump into the top 10 list at number nine" by 2016). Most of the significant supply chain disruptions that have occurred over the past five years have been related to weather or natural disasters. Aon Global Risk Consulting analysts don't anticipate this will change "given the unusual climate patterns worldwide and an unprecedented increase in natural disasters and weather events." Rowan Douglas, Chairman of the Willis Research Network (WRN), asserts, "[The] breadth and impact of natural disasters in recent years, coupled with growing concern about the emerging effects of climate change on assets and business operations, have driven resilience to natural hazards high up the corporate risk agenda." ["Willis Report Warns of Growing Risks from Natural Catastrophe Exposures," Insurance Journal, 23 April 2013]

The next 10 risks on the survey list were:

11. Exchange rate risk
12. Technological failure
13. Third party liability
14. Supply chain failure
15. Corporate credit risk
16. Natural disasters
17. Property damage
18. Cybercrime
19. Compliance
20. Counterparty credit risk

A list of the Top 50 risks can be found by clicking on the following link. The Aon Global Risk Consulting analysts "uncovered several significant risks to watch, as these are perceived to be underrated risks." Those risks include: Computer crimes/hacking/viruses/malicious codes; Counter party credit risk; Loss of intellectual property/data; Social media; and Pension scheme funding. Norman Marks, who was a chief audit executive and chief risk officer at major global corporations for more than 20 years, believes there are other "massive risks that are faced ... by a majority of organizations and, even if they are recognized, are often accepted instead of corrected." ["The Important Risks That Are Overlooked but Should Come First," Marks on Governance, 23 April 2013] His list includes:

  • The board and top management setting organizational objectives and monitoring performance without sufficient information. ...
  • A failure to consider risks when establishing strategies and objectives. ...
  • Executives making business decisions without adequate, current, timely, and reliable information. ...
  • A failure to consider risk when making day-to-day business decisions. ...
  • An inability to monitor risk as it changes, which is very often at least daily. ...
  • A failure to communicate and explain the personal relevancy of organizational strategies to every manager and decision-maker. ...
  • Putting cost considerations ahead of the quality of the management team and the workforce in general. ...
  • Processes and systems that cannot move and adapt – a lack of agility. ...
  • A board that is unable to provide effective oversight. ...
  • A conflict between the personal interests of the executive team (short-term results, bonuses, stock appreciation) and the long-term interests of the organization as a whole. ...

Javier Gimeno, a professor in international risk and strategic management at INSEAD, agrees with Marks that corporate boards need to be more involved in risk management. "As part of the board's responsibility to endorse and monitor strategy," he states in the press release, "directors should gain an intimate understanding of the major strategic risks, possible scenarios and how the appropriate strategy allows the exploration of uncertainties and mitigation of strategic risks."

One thing that most analysts seem to agree upon is that because the world is more connected it is more difficult for companies to isolate themselves from catastrophes. Cross told Rapoza, "There's an inter-connectivity of global risk. What happens overseas can impact you anywhere." Phil Ellis, CEO of Willis Global Solutions Consulting Group, agrees, "Major catastrophes – so called 'black swans' – are not the rare risks they once seemed. Population density, urbanization, globalization and climate change make the world increasingly interconnected. A catastrophe in a far-off locale is no longer a remote risk; it could have an immediate impact on a company's operations. Risk modeling can help companies understand, quantify and articulate threats to the bottom line, which in turn helps them plan and prepare for these scenarios."
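As a rough illustration of the kind of risk modeling Ellis has in mind, the sketch below runs a simple Monte Carlo simulation of a supplier disruption; the probability, downtime, and cost figures are assumptions chosen only to show the mechanics, not estimates for any real supply chain.

```python
# A minimal Monte Carlo sketch (all probabilities and costs are assumed,
# purely illustrative) of quantifying how a supplier disruption could hit
# the bottom line.

import random

def simulate_annual_loss(trials=10_000, seed=42):
    """Estimate expected and tail losses from a major supplier disruption."""
    random.seed(seed)
    p_disruption = 0.08            # assumed chance of a major disruption in a given year
    losses = []
    for _ in range(trials):
        if random.random() < p_disruption:
            downtime_weeks = random.randint(2, 12)   # assumed recovery time
            losses.append(downtime_weeks * 500_000)  # assumed cost per week of downtime ($)
        else:
            losses.append(0)
    expected = sum(losses) / trials
    worst_5pct = sorted(losses)[int(trials * 0.95)]
    return expected, worst_5pct

expected_loss, tail_loss = simulate_annual_loss()
print(f"Expected annual loss:  ${expected_loss:,.0f}")
print(f"95th-percentile loss:  ${tail_loss:,.0f}")
```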

The Strategic Sourceror concludes, "Organizations that understand risk management is not just part of the business but a means to improve the organization can drive their bottom line. ... Early risk identification can be a method for minimizing the impacts of changes to a supply chain." ["Companies are less prepared for risk," 23 April 2013] A good place to start identifying potential risks to your business is the list generated in the 2013 Aon Global Risk Management Survey.

May 20, 2013

Big and Small Data

Dr. Rufus Pollock, Founder and co-Director of the Open Knowledge Foundation, insists that the real revolution that will take place in the Big Data era will involve "small data." ["Forget Big Data, Small Data is the Real Revolution," Open Knowledge Foundation Blog, 22 April 2013] Pollock asserts, "The real opportunity is not big data, but small data. Not centralized 'big iron', but decentralized data wrangling. Not 'one ring to rule them all' but 'small pieces loosely joined.'" He explains:

"The real revolution ... is the mass democratisation of the means of access, storage and processing of data. This story isn't about large organisations running parallel software on tens of thousand of servers, but about more people than ever being able to collaborate effectively around a distributed ecosystem of information, an ecosystem of small data. Just as we now find it ludicrous to talk of 'big software' – as if size in itself were a measure of value – we should, and will one day, find it equally odd to talk of 'big data'. Size in itself doesn't matter – what matters is having the data, of whatever size, that helps us solve a problem or address the question we have."

Although I agree with Pollock that what really matters "is having the data ... that helps solve a problem or address a question," for many of those problems and questions, size does matter. Outliers in small data sets can significantly skew results. Pollock himself admits that even though "for many problems and questions, small data in itself is enough," there are times when you need to "scale up." He believes, however, that "when we want to scale up the way to do that is through componentized small data: by creating and integrating small data 'packages' not building big data monoliths, by partitioning problems in a way that works across people and organizations, not through creating massive centralized silos."
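To show what "componentized small data" might look like in practice, here is a minimal sketch, loosely inspired by the Open Knowledge Foundation's data package idea; the package structure, field names, and numbers are hypothetical.

```python
# A minimal sketch (hypothetical structure and data, loosely inspired by the
# Open Knowledge Foundation's "data package" idea) of componentized small data:
# each package is a small, self-describing unit that can be joined with others
# on demand instead of being poured into one centralized silo.

neighborhood_air_quality = {
    "name": "neighborhood-air-quality",
    "fields": ["district", "pm25"],
    "data": [{"district": "Riverside", "pm25": 18},
             {"district": "Old Town",  "pm25": 31}],
}

neighborhood_population = {
    "name": "neighborhood-population",
    "fields": ["district", "residents"],
    "data": [{"district": "Riverside", "residents": 12000},
             {"district": "Old Town",  "residents": 8500}],
}

def join_packages(left, right, key):
    """Loosely join two small data packages on a shared key."""
    right_index = {row[key]: row for row in right["data"]}
    return [{**row, **right_index.get(row[key], {})} for row in left["data"]]

# "Small pieces loosely joined": two independently maintained packages,
# combined only when a question requires both.
print(join_packages(neighborhood_air_quality, neighborhood_population, "district"))
```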

Big data initiatives generally involve integrating data, not creating "massive centralized silos." Nevertheless, Pollock is not alone in the belief that small data will play a role in the big data era. Bruno Aziza, Vice President of Marketing at SiSense, believes that one of the surprises that has emerged is that "Big Data isn't about 'Big'." [Forbes, 22 April 2013] By that he means that the term "big" is subjective. What is considered big today might be considered normal a few years from now. In other words, like Pollock, he believes that size is irrelevant as a descriptor of data sets. Aziza also agrees with Pollock that most problems don't require petabytes of data to solve. "Sometimes," he writes, "what can be perceived as 'Small Data' can go a long way." That's true as long as it's the right data. Regardless of the size of the data set, what really matters, according to Aziza, is the analytics applied to it.

The big data era began, he claims, as the result of a revolution in storage capability. He calculates that a terabyte of disk storage would have cost upwards of $14 million (adjusted) in 1980 but can be bought today for $30. When it comes to analytics, however, he asserts that what has occurred has been more evolutionary than revolutionary. Eric Schwartzman, founder and CEO of Comply Socially, underscores the importance of analytics. He writes:

"An avalanche of information is not necessarily a good thing. More often than not, it's a path to obfuscation rather than enlightenment, where speculation inflicts irrevocable harm and sensationalism travels farther and faster than tolerance. If you're a business, the takeaway is that sharing without analytics is essentially useless, that engagement is not as valuable as insight, and that seeing things in context is more important than being popular." ["Without Analytics, Big Data is Just Noise," Brian Solis blog, 24 April 2013]

Jake Sorofman, a research director at Gartner, believes that big data is still a big deal, because "big data [is] the intelligence behind microtargeting." He also agrees with Pollock and Aziza, however, that relevant smaller data sets will remain important in the big data era because "the precision of your aim doesn't matter if the customer experience falls short." He believes these relevant smaller data sets will be created from larger data sets to create "Big Content" and claims that they will be created through content curation. ["Forget Big Data—Here Comes Big Content," Gartner, 12 April 2013] He believes that curated content is especially important in the marketing sector because "content is the grist for the social marketing mill." He continues, "The rhythm and tempo of social marketing puts extraordinary pressure on marketing organizations that are more accustomed to publishing horizons measured in weeks and months than those measured in minutes and hours." As a result, "The expectation for content quality and authenticity has changed dramatically."

Steve Olenski agrees that in the marketing arena less is often more when it comes to the data involved. ["When It Comes To Big Data Is Less More?," Forbes, 22 April 2013] His take on why "less is more" is a bit different from that of the pundits discussed above. Olenski focuses on the fact that some of the sensitive data that is collected isn't necessary to achieve desired goals. He writes:

"Two esteemed professors at an Ivy League school say that while those in the marketing world continue to struggle with how to handle all the data they are accumulating, they may in fact be wasting their time and more than likely need to go on what they refer to as a 'data diet.' ... According to the aforementioned professors, all the talk about Big Data and privacy may be, as they put it, 'a tempest in a teapot.'"

Since many analysts believe that privacy issues are going to create a big storm rather than a tempest in a teapot, Olenski reached out to Eric Bradlow and Peter Fader, Professors of Marketing and Co-Directors of the Wharton Customer Analytics Initiative. Olenski indicates that the two professors "have studied the problem of data-privacy from an empirical perspective." He continues:

"Their research shows that brands and companies who are on a 'data diet' don't necessarily lose that much customer insights because limited customer data in conjunction with aggregate information (less privacy sensitive) can still provide precise insights. And when it comes to personal data, Fader says bluntly that 'most sensitive data is worthless and firms are often making mistakes to try to use it (or even collect it).' And adds that 'when you build a really good model, there isn't a whole lot to be gained by bringing in personal data."

That should be good news for marketers and consumers alike. Olenski reports that Bradlow and Fader believe "brands should keep the data they need to stay competitive and ditch everything else." That's the essence of a data diet. Bradlow told Olenski, "I think there is a fear and paranoia among companies that … if they don't keep every little piece of information on a customer, they can't function. Companies continue to squirrel away data for a rainy day. We're not saying throw data away meaninglessly, but use what you need for forecasting and get rid of the rest."
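As a rough illustration of what such a data diet might look like in code, the sketch below keeps only the behavioral aggregates a forecasting model actually needs and discards the sensitive personal attributes; the field names and record are hypothetical.

```python
# A minimal sketch (hypothetical fields and values) of a "data diet": retain the
# aggregate behavioral fields a forecasting model uses and discard sensitive
# personal attributes that add little predictive value.

FIELDS_NEEDED_FOR_FORECASTING = {
    "customer_id", "purchase_count", "weeks_since_last_purchase", "avg_order_value",
}

def apply_data_diet(customer_record):
    """Keep only the fields the forecasting model needs; drop everything else."""
    return {k: v for k, v in customer_record.items() if k in FIELDS_NEEDED_FOR_FORECASTING}

raw_record = {
    "customer_id": "C-1001",
    "purchase_count": 14,
    "weeks_since_last_purchase": 3,
    "avg_order_value": 42.50,
    # sensitive attributes the model can do without:
    "birth_date": "1984-07-02",
    "home_address": "123 Example St.",
    "browsing_history": ["pricing-page", "careers-page"],
}

print(apply_data_diet(raw_record))
```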

I'm not so sure that it's "fear and paranoia" that motivates companies to collect data as much as it is the unknown. Since we are at the beginning of the big data era, we really don't know what data is going to be useful in the years ahead as analytics and the questions they address change. We are just beginning to appreciate exactly how valuable analyzed data can become. So at least for the next few years, discovering which data is most relevant and then concentrating on analyzing it should be a priority for most businesses. As companies get more comfortable in the world of big data, we are likely to read more about curated data sets and big data diets.