


April 30, 2013

Semantic Web and Semantic Search

Sir Timothy "Tim" Berners-Lee, inventor of the World Wide Web, also coined the term "semantic web" nearly a score of years ago. It has taken most of two decades to develop technologies that hold the promise of making his vision a reality. Some people believe it may take another twenty years to perfect. Bill Kilpatrick, a student at Temple University's Fox School of Business, offers this brief, but informative, description of the semantic web concept. ["What Is The Semantic Web?," HASTAC, 15 November 2012] He writes:

"The technology aims to link up information, on a worldwide scale, in a way that is easily understood by machines. In essence, the Semantic Web will allow computers to process syntax closer to the way humans do by describing things in ways that computers can understand. For example, consider the following statements:

  • The Rolling Stones are a rock band.
  • Keith Richards plays guitar in the Rolling Stones.
  • 'Brown Sugar' was recorded by the Rolling Stones.

"Those sentences, and their mutual relationship, are easily comprehended by most people. To be understood by computers, however, they would need the ability to process syntax semantically. This process is likened to the use of hyperlinks, which connect a current web page to another one, thus defining a relationship between the two pages. However, on the Semantic Web, an important difference is that such relationships could be recognized between any two or more resources, if the information is properly structured. To do this, the Semantic Web uses special languages for detailing web-based resources and information, such as RDF (Resource Description Framework). Information put into RDF files allows computers to find, extract, store, analyze, and process web-based information, which the Semantic Web can then describe."
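The linked statements Kilpatrick describes can be pictured as subject-predicate-object triples, the basic unit of RDF. Below is a minimal, self-contained Python sketch (not a real RDF library; the predicate names are invented for illustration) showing how such triples can be stored and queried by pattern:

```python
# RDF-style triples: (subject, predicate, object).
# Predicate names here are illustrative, not from a real RDF vocabulary.
triples = [
    ("The Rolling Stones", "is_a", "rock band"),
    ("Keith Richards", "plays_guitar_in", "The Rolling Stones"),
    ("Brown Sugar", "recorded_by", "The Rolling Stones"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None acts as a wildcard)."""
    return [
        (s, p, o) for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Every relationship that points at the band, regardless of predicate:
print(query(triples, obj="The Rolling Stones"))
```

The point of the exercise is that once statements are structured this way, a machine can answer questions ("what is related to the Rolling Stones, and how?") that plain hyperlinks cannot express.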

Berners-Lee adds:

"When you use speech grammars and VoiceXML, you are describing possible speech conversations. When you use XML schema, you are describing documents. RDF is different. When you use RDF and OWL, you are talking about real things. ... Because this information is about real things, it is much more reusable. ... The general properties of a car, or a product of your company, of real things, change rarely. They are useful to many applications. This background information is called the ontology, and OWL the language it is written in." ["Speech and the Future," World Wide Web Consortium, 14 September 2004]

A company called hakia, which advertises itself as "a pioneering company in semantic search technology," notes that a semantic search has to achieve at least ten things in order to be successful. ["What is Semantic Search?"] They are:

1- Handling morphological variations (like tenses, plurals, etc.)
2- Handling synonyms with correct senses (like cure, heal, treat,.. etc.)
3- Handling generalizations (like disease = GERD, ALS, AIDS, etc.)
4- Handling concept matching
5- Handling knowledge matching (like swine flu = H1N1, flu=influenza)
6- Handling natural language queries and questions (like what, where, how, why, etc.)
7- Ability to point to uninterrupted paragraph and the most relevant sentence
8- Ability to Customize and Organic Progress -- Semantic search allows customization in various stages by the owners of the system as well as the user of the system (i.e., such as semantic tagging) where search becomes a part of a social network formed around a business.
9- Ability to operate without relying on statistics, user behavior, and other artificial means
10- Ability to detect its own performance -- A semantic search engine is expected to produce a relevancy score that reflects the degree of meaning match. ... Accordingly, the search engine can understand its poor performance to automatically flag areas of improvement that is needed.
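Items 1, 2, and 5 on hakia's list can be illustrated with a toy sketch. The crude suffix-stripping stemmer and the tiny synonym table below are invented stand-ins for illustration, not how any production semantic engine actually works:

```python
# Toy illustration of handling morphological variations (item 1),
# synonyms (item 2), and knowledge matching (item 5).
# The synonym table and stemmer are deliberately simplistic.
SYNONYMS = {
    "cure": "treat", "heal": "treat",      # synonyms with a shared sense
    "h1n1": "swine flu", "influenza": "flu",  # knowledge equivalences
}

def stem(word):
    """Crude suffix-stripping stemmer (illustrative only)."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def normalize(term):
    """Map a term to a canonical concept: lowercase, stem, then synonym lookup."""
    term = stem(term.lower())
    return SYNONYMS.get(term, term)

def semantic_match(query_term, document_term):
    """Match on concepts rather than on surface strings."""
    return normalize(query_term) == normalize(document_term)

print(semantic_match("cures", "healing"))  # True: both map to "treat"
```

A keyword engine would see no overlap at all between "cures" and "healing"; concept normalization is what lets the two match.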

The ability to confidently discover semantic relationships dramatically improves search results. Hakia insists, "Conventional search systems (indexing keyword search) can no longer meet the increasing demand for quality results and time-saving practices in today's world, nor do they offer any room for progress for the future. As a result, semantic search has increasingly been the choice as the next step by many businesses during the last decade." It continues:

"Semantic search technology is based on a computerized system that understands content and query similar to how the human brain processes natural languages. Instead of matching the occurrence of words or symbols (as done in indexing systems), semantic search systems match concepts and their meaningful variations. As a result, a number of benefits emerge:

  • Accuracy: Improves the accuracy of the search results exponentially
  • Focus: Transforms search function from pointing a document to pointing a direct answer
  • Engagement: Allows flexibility to use natural language queries, thus increases user engagement
  • Intelligence: Enables semantic understanding of the user behavior via search analytics
  • Control: Prevents manipulations from content providers and users
  • Independence: Does not rely on external inputs (i.e., popularity) for base performance
  • Progress: Allows customization, user input, organic improvement

These advantages result in indisputable return of investment that are reported in several enterprise-wide case studies."

Eric Savitz agrees that semantic search has the potential to change the business landscape dramatically. ["5 Ways Semantic Search Will Disrupt Business," Forbes, 20 June 2012] He believes semantic search has a promising future because it can be used to help make sense of big data. He writes:

"As the Big Data dialogue progresses and the information onslaught grows more acute, a single technology has quietly evolved that holds the promise to put our data anxieties to rest and it's been here all along. One could even say that it was one of the origins of the Big Data challenge – semantic search. Semantic search significantly improves search accuracy and relevance by understanding a searcher's true intent and the contextual meaning of words. The technology considers the context of search, location, intent, word variation, synonyms, multiple meanings of words and foreign language to provide users with exactly what they are seeking. Say good-bye to the days of search results with endless pages of blue links."

Savitz believes there "are 5 areas where semantic search will disrupt and transform business and help solve for the global data obesity challenge." They are:

  • SEO: By 2016, the interactive advertising business will reach $77 billion, according to Forrester. ... Semantic contextualization will enable better ... targeted ads based on a searcher's intent. One to one consumer/advertiser relationships will reach its full potential with semantic search. ...
  • Database Management: Using semantic search to build new ways of analyzing the massive amounts of data that businesses are generating will allow them to identify new business opportunities. ...
  • Drug Discovery: Using semantic search will allow streamlined access to critical information necessary for complex product development such as drug discovery. ... Semantic technology can help choose better candidates by using matching technology combined with scoring and ranking systems, saving money and re-filling the global drug pipeline.
  • Travel: ... The next big transformation in [the] travel industry will be booking the complete travel package. Semantic search can dramatically simplify discovering destinations and activities, reduce the complexity involved with tailoring a vacation. ...
  • Human Capital Management: ... Semantic search holds the promise of becoming the killer app in human capital management because its sophisticated recognition process enables finding the right needle in a million data haystacks.

The most important word in all of these discussions, according to Doc Sheldon, is "semantic." He writes, "'Semantic', ... is a qualifier that means a great deal in this context. It demands that a machine, or more accurately, the software that drives that machine, must understand the information in the way it was intended. Let's face it: most of us know a handful of human beings that are challenged in that regard." ["Semantic Search in 2025," Search Engine Watch, 6 November 2012] He continues:

"Indeed, for a machine to comprehend the meaning behind what a human has put to text, requires a certain amount of artificial intelligence. Humor, irony, and emotion certainly seemed to be beyond the conceivable limits of a computer program in 1994. Even in 2012, there are still some that doubt that such comprehension will be possible in the near future."

As the title of his post reveals, Sheldon believes that the semantic web will be a reality by 2025. He believes that all of the tools and data are in place to make this happen. They just need to be refined over the coming decade.

April 29, 2013

Socially Responsible Packaging

Bruce Horovitz, a marketing reporter for USA Today, recently penned an article discussing how more companies are becoming socially responsible. He credits much of this trend to the Millennial generation, a "trend-setting, if not free-spending group of 95 million Americans, born between 1982 and 2004." ["Millennials spur capitalism with a conscience," 27 March 2013] He continues:

"In an ultra-transparent world, where information zips from Facebook to Twitter to Instagram, just about everything a company does is out in the open, says John Mackey, co-founder of Whole Foods, a ground-breaking company in local community support. 'If everything you're doing is seen,' he says, 'it's human nature to do things that people would approve of.' But it's no longer just outliers such as Ben & Jerry's and Whole Foods doing the right thing. Big consumer brands such as Panera, Starbucks and Nordstrom are members in good standing of the Do-Gooder Society. More likely sooner than later, corporate kindness that doesn't have its origins in the public relations or human resources department may become as common as coupons. Even in a dicey economy, kindness sells."

It's not just kindness and compassion that Millennials are concerned about. They seem to be a generation much more attuned to making the world a better place in many ways, including environmentally. Over the past couple of years, sustainability has become a hot topic in supply chain circles. Packaging plays a significant role in promoting sustainability. Too much packaging is both wasteful and costly. Too little packaging, on the other hand, can result in product damage that also results in waste and monetary loss. In addition to protection and security, designing the perfect package must also take into account consumer aesthetics. After all, manufacturers produce and retailers sell products in order to make a profit. Products that sit unsold on a shelf help no one. Rosemary Grabowski, Global Marketing Director, CPG-Retail, Dassault Systèmes, reports, "As anyone involved in the consumer packaged goods (CPG) industry already knows, the role of packaging in driving sales cannot be understated. Viewing products on a store shelf is still the leading way for consumers to be aware of what products are available. Eighty-five percent of shopping decisions are made in-store — and most shoppers make their purchasing decisions within 5-8 seconds of seeing the product on the shelf. The package design, color and artwork chosen serve to stop, hold and close the shopper to select the product that is right for them." ["Designing the Perfect Package Better, Faster and Smarter than your Competition!" Consumer Goods Technology, 11 September 2012] She continues:

"With the increased importance of packaging in the overall success of a product, companies across the globe are pouring additional resources into understanding their consumer base and what exactly they want in a product. CPG companies, their packaging suppliers and the retailers that distribute these products need to start their processes by understanding the total consumer experience. What features and functions are most important to my consumers? How can I make my product stand out from the rest? What colors catch the eye? What is the optimal shelf placement? This is about defining the key elements of what is required to maximize the ability to stop, hold and close the consumer. Answering these questions with facts and insights will help improve our success in breaking through the clutter on shelf and deliver a more pleasant overall product experience."

As Millennials become the prime target for marketers, how "green" a package is will likely become more important. Grabowski points out that package design is becoming increasingly complex. "Packaging design itself is full of contradictions," she writes. "The package must be tamper-proof, yet easy to open. It must be attractive, yet strong enough to handle transportation and storing. Packaging must be designed with sustainability in mind, yet secure enough to protect the product inside. And packaging must meet many regulatory requirements, yet still draw a consumer's attention at shelf." She notes that package design can no longer be "a siloed process, held completely separately from manufacturing and production."

If you think that too much fuss is being made over packaging, just consider how much is spent on packaging in two sectors: consumer packaged goods and food. According to the Procurement Leaders staff, "The global consumer goods (FMCG) packaging market will reach $436.5bn in 2013, as emerging markets demand FMCG products." ["Consumer goods packaging market tops $436bn," 23 September 2012] And the Modern Materials Handling staff reports, "Food packaging is approximately a $340 billion industry worldwide. In the U.S., about a third of household waste is food packaging, and much of this cannot be recycled." To address challenges like this, the "Rochester Institute of Technology will help to create the Center for Sustainable Packaging, an education and research center dedicated to the development and use of sustainable packaging." ["Center for Sustainable Packaging created at Rochester Institute of Technology," 18 September 2012] A report issued last fall by the consulting firm Deloitte reaffirms the need for more research and development of sustainable packaging. "Achieving breakthrough improvements in sustainable packaging is more difficult than simply substituting one material for another," Deloitte reports. ["You Need to Completely Re-Think Your Approach to Packaging, Report Says," SupplyChainBrain, 29 October 2012] The report, entitled "Thinking Outside the Box: Throw Away Your Current Approach to Packaging," points out that "many companies are seeking to improve product stewardship, achieve waste-reduction goals, and save money through sustainable packaging." The report concludes:

"The next frontier in this area requires a radical re-think of the value chain, thus achieving 'disruptive' improvements that not only reduce waste, energy and raw materials but also improve time-to-market, quality and margins. Sustainable packaging requires a healthy dose of creativity, strategic flexibility, and coordination across multiple functions of the organization. This may involve designers, manufacturers, merchandisers, buyers, suppliers, logistics providers and marketers — all challenging each other to break free from traditional conceptions of what constitutes packaging and working through the interdependencies that new product delivery models require. Companies that can embed sustainable packaging considerations throughout their supply chain management processes, from demand-and-supply planning to delivery and returns, can realize substantial environmental and economic benefits."

As the global middle class continues to increase, demand for packaged goods will also grow. The Procurement Leaders staff notes, "Packaged goods are often viewed as superior quality and more hygienic, explaining emerging market consumers’ preference for them." As the demand for packaged goods grows, the importance of sustainable packaging also increases. The dMass staff reports, "There are a lot of ideas for reducing the waste associated with packaging, from reducing the amount of materials needed for packaging, to making it easier to recycle packaging or using recycled materials as inputs. Designer Aaron Mickelson's concept is different: make packaging 'disappear' altogether." ["Resource Fix: Integrating packaging with products," dMass, 18 February 2013] Mickelson's ideas include printing packaging information right on the surface of products using ink that washes off, eliminating the need for outer packaging, and letting the product serve "double-duty as packaging." The article concludes, "Incorporating packaging right into products would reduce the amount of resources used for delivering products, as well as resources used to dispose of or recycle packaging." Tom Szaky reports on another zero waste packaging idea: edible packaging. He writes:

"Innovation is not just emerging in the form of reuse and redesign of product packaging, but in the form of a new initial purpose. For as long as we know, packaging is the part of the product that gets thrown away. Now there are several scientists working to create 'edible packaging' for products to help eliminate waste. The idea is controversial, and would require our society to adjust its norms about what is and isn’t considered edible." ["Giving packaging a second chance at life," Packaging Digest, 1 April 2013]

Szaky insists that "most consumers are blissfully ignorant of [the] important tasks" that packaging performs. Lora Cecere agrees. She writes, "It is pretty on the shelf, but it can be a problem for the sustainable supply chain. But in fact, unsuspecting consumers would never guess at the issues that the package on the shelf represents for the sustainable supply chain." ["Pretty is as Pretty Does?" Supply Chain Shaman, 17 December 2012] Despite consumer ambivalence to what goes into packaging decisions, Szaky believes that the challenges associated with package design also represent opportunities. He writes:

"Most product packaging has several aspects to it which provide reasons for a product's life to end. Physical life, functional life, technical life, economical life, legal life, and loss of desirability lead to products being thrown out or recycled. However, instead of looking at these in a negative way, we can look at each 'form of life' individually and find ways to extend them. ... In addition to the obvious fact that finding new uses for and redesigning product packaging is beneficial to the environment and supports innovation, it can also be good for business. Brand logos are printed all over product packaging, so if it just gets thrown out or recycled, people will no longer see it, and brand equity is lost. Packaging that is redesigned or designed for reuse helps to preserve the brand equity of those products for a little longer. ... Reduce, reuse, recycle … redesign. Redesign is the fourth 'R' of the future when it comes to eliminating waste. Whether it's redesigning packaging to be completely edible, creating no waste, or finding a way to design product packaging for new purpose and extend its life long term, there is no doubt that we are taking steps towards innovation in the future of waste management."

All of this attention to sustainable packaging is good news for the environment as well as the bottom lines of businesses that successfully achieve an optimally designed package. As the Millennial generation exerts even greater influence on the marketplace, companies that offer sustainable packaging are going to be ahead of the game.

April 26, 2013

A Thought Probe Series on Tomorrow's Population, Big Data, and Personalized Predictive Analytics: Part 1, Getting Started

Each Friday for the next several weeks, I'll be posting a series of articles examining how the world is becoming more urbanized and how governments, businesses, and citizens must adapt to ensure that urban growth is done intelligently. I'll discuss how big data can help influence city planning and operations as well as help manufacturers and retailers ensure that their products and services are available to urbanites. These topics are generally grouped together under the topic of smart cities.

In the post entitled Hurricane Sandy and the Resilience of Cities, I indicated that I had more to say about cities and how to make them more resilient. I believe that resilient cities are smart cities. Dr. Boyd Cohen, a climate strategist, believes "the smart-cities movement is being held back by a lack of clarity and consensus around what a smart city is and what the components of a smart city actually are." ["What Exactly Is A Smart City?," Fast Company, 21 September 2012] He continues:

"While some people continue to take a narrow view of smart cities by seeing them as places that make better use of information and communication technology (ICT), the cities I work with ... all view smart cities as a broad, integrated approach to improving the efficiency of city operations, the quality of life for its citizens, and growing the local economy."

Although I'm a technology guy, I agree with Cohen that a broad definition of what constitutes a smart city is better than a narrow one. To show how various city sectors come together to make a smart city, Cohen has developed what he calls the Smart City Wheel. At the hub of the wheel, you find smart people, smart government, smart economy, smart environment, smart mobility, and smart living.

[Image: Cohen's Smart City Wheel]

Cohen admits that his "model has been inspired by the work of many others." The fact that many others are working to make cities smarter should be encouraging, especially in light of the fact that urbanization is occurring at a historic rate. Cohen indicates that "there are over 100 indicators to help cities track their performance with specific actions developed for specific needs" behind the model. He says that efforts to create a smart city should begin with three steps. They are: 1) Create a vision with citizen engagement; 2) Develop baselines, set targets, and choose indicators; and, 3) Go lean.
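Cohen's second step (develop baselines, set targets, and choose indicators) can be made concrete with a toy calculation. The indicator names, baselines, and targets below are invented purely for illustration:

```python
# Illustrative city indicators: name -> (baseline, target, current value).
# All numbers are hypothetical examples, not real city data.
indicators = {
    "transit_ridership_pct": (30.0, 45.0, 36.0),   # higher is better
    "co2_tons_per_capita":   (8.0, 5.0, 7.1),      # lower is better
}

def progress(baseline, target, current):
    """Fraction of the baseline-to-target gap closed so far.

    Works for both rising and falling targets because the gap is signed.
    """
    if target == baseline:
        return 1.0
    return (current - baseline) / (target - baseline)

for name, (b, t, c) in indicators.items():
    print(f"{name}: {progress(b, t, c):.0%} of the way to target")
```

Expressing every indicator as a fraction of gap closed lets a city compare progress across measures with very different units, which is exactly why a baseline and a target are needed for each one.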

Rick Robinson, an Executive Architect at IBM specializing in emerging technologies and Smarter Cities and a self-labeled Urban Technologist, has his own ideas about how to get a smart cities program started. ["How Smarter Cities Get Started," The Urban Technologist, 26 July 2012] He writes:

"Many of the environmental, social and economic forces behind the transition to Smarter Cities are common everywhere; however, the capabilities that enable cities to act in response to them are usually very specific to individual cities. They depend on factors such as geographic location, the structure and performance of the local economy, the character of local communities, and the approach of leaders and stakeholders across the city. The relationships between those stakeholders and communities are crucial. Cities may aspire to encourage economic growth amongst small, high-technology businesses; or to stimulate innovation in service delivery by social enterprises; or to switch to more sustainable patterns of travel and energy usage. To act successfully to achieve any of these aims, long and complex chains of connections between individuals need to work effectively, from city leaders, through their organisations, to community and business associations such as small business forums, neighbourhood communities, and faith groups, to individual companies, their employees and citizens across the city."

Robinson makes an important point about the uniqueness of each urban area's situation. The only way to discover that uniqueness, and the connections that are critical for improving society, is through analysis of big data. As Robinson notes in his article, a "complex web of city systems" must be woven into a cohesive system working together towards the same objectives. He doesn't claim to have all of the answers about how this can be done, but he writes that there are observable "patterns in the behaviour of the cities who have made the most progress." He states that it all starts with a plan.

"Cities already have plans, of course. In fact, often they have lots of plans – for the economy, for housing, for public service transformation, for marketing and for many other aspects of urban systems. What is really required in a Smarter City context, though, is a single plan that captures the vision and means for transformation; and that is collectively defined and owned by stakeholders across the city; not by any single organisation acting alone. It needs to be consistent with existing plans within individual domains of the city; and in time needs to influence those plans to develop and change."

While working in the risk management area several years ago, we noticed that, because large organizations have departmental silos, each department usually maintains its own mitigation and recovery plan. Because those plans aren't coordinated, actions necessary to make one plan work (like shutting down a system) could negatively impact the mitigation plans of other departments. Only through effective coordination could the overall objective of protecting the organization be realized. The same thing holds true when developing a plan for a smart city; plans must be coordinated (if not integrated) so that various groups aren't working at cross purposes. Technology, like the Sense, Think/Learn, Act™ system being developed at Enterra Solutions, can be used to code natural language plans and then analyze them to discover conflicts, redundancies, non-obvious connections (inferences) and so forth. The same technology can be used to discover the unique qualities of neighborhoods and the important connections between stakeholders that are often not easy, or obvious, to understand. Robinson continues:

"A Smarter City plan needs to set out a vision that is clear and succinct, often expressed in a single sentence capturing the future that the city aspires to. That vision is usually supported by a handful of statements that summarise its impact on key aspects of the city – such as wealth creation, inclusivity and sustainability. Together these statements are something that everyone involved in the city can understand, agree to and promote. ... To make the vision deliverable, a set of quantified objectives against which progress can be measured are vital. In IBM as we work with cities to establish these measures, we're learning that social, financial, environmental, strategic and brand values are all important and related. They could include improvements in education attainment; creation of jobs; increase in the GDP contribution by small businesses in specific sectors; reduction in carbon impact in specific systems or across the city as a whole; improvements in measures of health and well-being; and may include some qualitative as well as quantifiable criteria. It is against such objectives that specific programmes and initiatives can be designed in order to make real progress towards the city’s vision."

In other words, Robinson agrees with Cohen that creating a vision is a critical first step towards a smarter city. Robinson believes that the vision is also critical for aligning stakeholder actions for achieving success. To ensure that stakeholders stay engaged and motivated, Robinson recommends "a mixture of long and short-term projects across city domains; and in particular that [the plan] includes some 'quick wins' – in attempting to work in new partnerships to achieve new objectives, nothing builds confidence and trust like early success." Like Cohen, Robinson believes that everyone must be engaged for a plan to succeed. He writes:

"Once stakeholders from a city ecosystem have come together to define a vision and a plan to achieve it, it's vital that they maintain a regular and empowered decision-making forum to drive progress. The delivery of a Smarter City plan relies on many separate investments and activities being undertaken by many independent individuals and organisations, justified on an ongoing basis against their various short-term financial obligations. Keeping such a complex programme on track to achieve cohesive city-level outcomes is an enormous challenge. Such forums are often chaired by the city's local authority; and they often involve representatives from local universities who act as trusted advisors on topics such as urban systems, sustainability and technology. They can include representatives from local employers, faith and community groups, institutions such as sports and retail centres, and trusted partners in domains such as technology, transport, city planning, architecture and energy. The broader the forum, the more completely the city is represented; but these are 'coalitions of the willing', and each city begins with its own unique mix. In fact, a formative event or workshop that brings such city stakeholders together for the first time, is often the catalyst for the development of a Smarter City plan in the first place."

As with many projects, initial excitement for smart city initiatives can be high and then taper off over time. Keeping people engaged is absolutely essential. That won't happen if the smart cities movement is viewed as a passing trend. Ben Rooney writes, "In the pantheon of Next Big Thing trends, the concept of 'smart cities' is one of the trendiest." ["'Smart City' Planning Needs the Right Balance," Wall Street Journal, 27 September 2012] Rooney's fear is that smart cities initiatives will be driven by well-meaning, but clueless, governments and that "there is a fear that such top-down programs may threaten the very vitality that attracts people to cities in the first place." Done correctly, smart city initiatives should increase the vitality of cities, not diminish it. That's why Cohen and Robinson are so insistent that all stakeholders need to be engaged.

Rooney agrees that smart cities initiatives are generally based on data analysis. He claims, however, that doing so is nothing new. He explains:

"One of the first uses of data to drive civic policy was more than 150 years ago when during the 1854 London cholera outbreak surgeon John Snow plotted cases on a map. He traced the outbreak to a water pump in Broad—now Broadwick—Street. When the pump handle was taken off, the number of cases immediately began to decline."

Imagine that kind of analysis being applied to urban challenges based on enormous amounts of data and using today's massive computer power. Health and safety are very important areas where connectivity and analysis can play a vital role. This value was recently demonstrated during the investigation that followed the terrorist bombings in Boston. Video and photographic evidence were critical in catching the perpetrators. Enterra Solutions has helped install a system at the Port of Philadelphia that has been recognized for helping make the port more secure. Holt Logistics' Chief of Security, Kurt Ferry, praises the system as "an example of a successful collaboration between private and public sector organizations using grant funds to enhance regional maritime security." ["Security Grants Help Improve Domain Awareness: A Success Story," The Beacon, Winter 2011] According to Rooney, cities of the future (like one being built in Portugal) will be even more connected. He writes:

"PlanIT is a €10 billion, four-year project to build a new smart city in Portugal to house some 225,000 people. With sensors built into every building it presents itself as an urban utopia where smart buildings can sense our presence and anticipate our needs. But behind this drive for efficiency is a fear that optimizing for data will drive out the inefficiencies that imbue cities with their humanity; cinematic history is littered with cautionary urban dystopias, from Fritz Lang's 'Metropolis' to the Wachowski brothers' 'The Matrix.' As Ben Hammersley, the U.K. prime minister's ambassador to Tech City, told the audience at the recent WSJ Tech Cafe event, optimizing is a dangerous thing unless you are quite clear on what exactly you are optimizing."

You'll recall that one of Cohen's first steps was to "go lean." Rooney is warning that you can go too lean. I agree. Hammersley, an adviser to the Danish city of Aarhus, gave the Tech Cafe audience a specific example of what he was talking about:

"'Most smart-city initiatives are about optimizing the route for commuters getting to the office,' he told the audience. 'But Aarhus, which has the highest number of pavement cafés in the country—it is essentially one huge café—doesn't want to optimize that. The lesson is be careful what you measure because you may end up optimizing on the wrong thing.'"

That's where big data analysis and relationship discovery technology can play a major role in helping cities become the best they can be. Rooney labels the use of big data as "a bottom-up" approach to smart city planning. He writes:

"A very different kind of smart-city initiative has had success in cities as diverse culturally and geographically as San Francisco and Singapore, and is coming to Europe. Called Urban Prototyping, the movement approaches cities from a bottom-up—not top-down—viewpoint. Peter Hirshberg, who lives in San Francisco, is the man behind the drive, which aims to bring together programmers, planners, activists, and even artists to use data and technology to solve problems in cities. 'There needed to be a dialog between people doing interesting things and the folks running the city,' he said. 'San Francisco is awash with data, but it turned out that inside the bus system the guys use Walkie Talkies and clipboards to track busses that break down.' Over one weekend in July 2011 ... developers built an iPad app to solve this problem. 'Everyone wants the busses to run properly. This was a case of citizens getting together and solving a problem.' He cites dozens of other examples. 'We had a festival called "The Summer of Smart". The head of public works for San Francisco came along with a proposal. "Where I see public art I don’t see graffiti," he told us. "Where I see graffiti, I don’t see public art."' The result was a project called Art Here that brings together the two."

Although it sounds great, Rooney admits that even the best-intentioned efforts need champions who can make things happen. He reports that nine months after the bus app had been developed for the city, it still hadn't been implemented. That kind of underwhelming response can trigger both anger and ambivalence. As Rooney put it, "City Hall has proved to be the movement’s biggest challenge." He concludes, "If this movement is to make a real impact on the lives of city dwellers it will take more than some well-intentioned hackers and a lot of pizza."

I'm speculating that is why one of the main sectors on Cohen's Smart Cities Wheel is Smart People. Based on advertisements run during the last political season, it's clear that we could use a lot more smart people in government. Robinson agrees that "it’s impossible to understate the importance of individual people in making cities Smarter. The functioning of a city is the combined effect of the behaviour of all of the people within it; and Smarter City systems will not change anything unless they engage with and meet the needs of those individuals." Fortunately, I'm an optimist. I believe that technology can be used to create a better future and smarter cities.

April 25, 2013

In Stock, Out of Stock, and On-Shelf Availability

Every retailer knows that they can't sell what they don't have. If a product is not on the shelf when a customer wants it, the likely result is a lost sale. Dan Gilmore, Editor-in-Chief of Supply Chain Digest, believes that the out of stock (OOS) "problem is getting worse for many retailers." ["A Unified Theory of Out-of-Stocks?" 19 April 2013] Gilmore calls this situation "amazing" considering all of the technology currently available. He continues:

"E-commerce has added a wrinkle to the retail OOS challenge, as now the issue isn't just products not being on the store shelf, but now potentially 'not in stock' at the e-store as well. An OOS there likely has an even bigger impact in terms of lost sales than it does at brick and mortar retail."

Gilmore points out that in this instant-gratification world, consumers don't ask for "a rain check at a brick and mortar store for sale items that go out-of-stock" like earlier generations did. Today's consumers, he notes, "go elsewhere, or don't buy at all. So the penalty for OOS rises." I wonder how many younger shoppers even know what a rain check is? He goes on to note, "Out-of-stocks are not at all only a retail issue, it is a core supply chain and inventory challenge for virtually every company." In his article, Gilmore refers to a research paper by Jesper Aastrup and Herbert Kotzab entitled "Forty years of Out-of-Stock research – and shelves are still empty." [The International Review of Retail, Distribution and Consumer Research, Volume 20, Issue 1, 2010] Gilmore states that the study's title pretty well sums up his feelings.

To underscore how difficult the out-of-stock challenge can be, Gilmore notes that Walmart, widely recognized as a leader in supply chain innovation and implementation, still struggles with the problem. He writes, "[If] mighty Walmart's supply chain is mightily challenged with OOS, no wonder the rest of us are too." To learn more about Walmart's struggles, Gilmore points to an earlier Supply Chain Digest article entitled, "Walmart US Chief Says Sales are Suffering Because of Challenges in Keeping Stores Shelves Stocked." [4 March 2013] Gilmore continues:

"Here's why this issue is so complex: it is an equation that involves forecasting, 'long tail' management, retail in-store execution, uncertain and/or difficult to calculate financial impacts, different impacts depending on product category, different impacts on retailers versus manufacturers, the Bullwhip Effect, the Perfect Order, vendor variability, store inventory accuracy, overstocks, collaboration, etc."

In other words, no single cause can be singled out as the culprit behind out-of-stock situations. A large part of the problem, however, is that there is not enough information sharing among stakeholders. One frequent result of information starvation is the Bullwhip Effect mentioned by Gilmore: alternating cycles of inventory depletion and over-ordering whose swings grow larger as they move up the supply chain. Loretta Chao and Lorraine Luk explain:

"This frustrating phenomenon occurs when falling customer demand prompts retailers to under-order so as to reduce their inventories. In turn, wholesalers under-order even further to reduce theirs and the effect amplifies up the supply chain until suppliers experience stock-outs – and then over-order in response. The effect can ripple up and down the supply chain many times." ["Acer Reassesses Inventory Policies," Wall Street Journal, 6 June 2011]
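
The amplification Chao and Luk describe can be illustrated with a toy simulation (a hypothetical sketch of my own, not drawn from any of the cited sources): each tier forecasts the orders it receives and over-corrects toward its latest forecast error, so small fluctuations in retail demand grow into large swings by the time they reach the supplier.

```python
import random

def variance(xs):
    """Plain population variance."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def simulate_bullwhip(periods=200, tiers=4, alpha=0.3, gain=1.5, seed=42):
    """Toy multi-echelon supply chain. Each tier smooths the orders it
    receives (exponential smoothing with factor alpha) and then
    over-reacts to its latest forecast error by 'gain', mimicking the
    under-order-then-over-order behavior behind the Bullwhip Effect.
    Returns the order variance observed at each tier, customer first."""
    random.seed(seed)
    # Tier 0 is end-customer demand: steady around 10 units with small noise.
    orders = [[10 + random.gauss(0, 1) for _ in range(periods)]]
    for _ in range(tiers):
        forecast = orders[-1][0]
        upstream = []
        for demand in orders[-1]:
            forecast += alpha * (demand - forecast)
            # Over-correct toward the surprise in this period's demand.
            upstream.append(max(0.0, forecast + gain * (demand - forecast)))
        orders.append(upstream)
    return [variance(tier) for tier in orders]

variances = simulate_bullwhip()
# Order variance grows tier by tier: retailer -> wholesaler -> supplier.
```

With a reaction gain above 1 at every echelon, the variance of orders placed at the top of the chain ends up several times that of actual consumer demand, which is exactly why the effect "can ripple up and down the supply chain many times."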

Until true demand-driven supply chains are implemented, the best that can be hoped for is better collaboration that will help dampen the consequences of the bullwhip effect. Brent Nagy of TMC offers "5 tips for successful Demand Smoothing":

"1. Dedicated Assignment. Asking a supply chain or transportation employee(s) to fit this forecasting and planning responsibility in amongst their day-to-day tasks typically does not work well. Assign someone who has time to make it a priority.

"2. Supplier and Carrier Collaboration. Give ample warning to vendors and Tier 1 carriers so they know what's coming. Often, they can make their own adjustments to help with surges in your demand.

"3. Metrics Monitoring. Understanding route guide and tender, fill rate, CWT and visibility by way of track and trace and potential late loads all act as ways to manage and isolate areas of concern and success. ...

"4. Use Proven Modeling Tools. While it seems simple, there is a science to Demand Smoothing. There are TMS-related technologies and processes that can be powerful tools.

"5. Project 100 Days Out. Work with sales and manufacturing teams on the inside and with suppliers and carriers on the outside to create a Demand Smoothing plan that looks out 100 days." ["Planning for the Bullwhip Effect," TMC Managed TMS Blog]

Walmart must believe that better collaboration with suppliers will help with its out-of-stock challenges. It recently announced "suppliers who receive appropriate training will be able to use their smartphones to perform many of the functions that currently require assistance from a store associate armed with a bulky Telxon tracking device." ["Walmart Gives Vendors Access to Backroom Inventory Data," by Tom Ryan, RetailWire, 15 April 2013] Ryan reports, "Beyond real-time access to data, suppliers and third-party merchandising firms will be trained to use the devices to handle basic practices such as printing missing shelf labels for items, stocking their products, and ensuring their displays are correct."

My company, Enterra Solutions, offers a few products that can help address some of the challenges associated with out-of-stock situations. Our Pre-Inventory Prediction (Available to Promise) and EnterraConnect™ modules represent critical components of Enterra's Supply Side Visualization capabilities. Pre-Inventory Prediction utilizes advanced predictive capabilities: the system can monitor your vendor supply chain prior to warehouse delivery in order to detect shipping delays and identify their perturbative effect on customers. It then recommends remedies to minimize the impact of delivery delays, and it proactively assists with avoiding order cancellations through the renegotiation of cancel dates or by suggesting an alternative product. In addition, the system helps to reduce customer service time further by automating the tedious task of allocating limited-availability inventory across multiple order lines, purchase orders, and deliveries. This not only helps your organization meet current demands, but supports future growth by creating a positive experience for the customer. EnterraConnect extends supply chain visibility to the manufacturing vendor by tracking key milestones around production, monitoring the supply chain prior to warehouse delivery to minimize the impact of delivery delays.

  • Attempts to mitigate "product not available" issues
  • Monitors shipping vessels, trucks, and rails to detect shipping issues
  • Predicts the effects of shipping delays or production delays on customer orders
  • Recommends remedies to minimize the impact of delays
  • Suggests container unpacking priorities
  • Allows for pre-allocation of future inventory deliveries (Available to Promise)
  • Automates the task of allocating limited-availability inventory
  • Extends milestone tracking to manufacturing vendors to detect and respond to production delays.

Enterra's In-Flight Optimization offering addresses challenges in fast-paced sectors such as food, where risk of spoilage and daily replenishment are critical factors. In such sectors, having a system that can identify in-flight process optimization opportunities can mean the difference between a highly profitable supply chain and no supplies at all. Enterra's In-Flight Supply Chain Optimization Module is designed to predict risks associated with delivery delays and inventory decline to protect against product deterioration and/or depletion. The offering optimizes time-critical supply chains to reduce product expiration and improve delivery results.

  • Predicts intra-day at-risk customer service issues
  • Reduces possible goods spoilage
  • Allows for more frequent inventory replenishment
  • Enables dynamic production changes to maximize deliveries
  • Allows for intra-day optimization in fast-paced sectors such as food

In today's fast-paced world, more frequent inventory replenishment is becoming a necessity. Robert J. Bowman, Managing Editor of SupplyChainBrain, notes, "For retailers, the bar just keeps getting raised. Supply-chain excellence used to be about filling orders seasonally. Then weekly. Then daily. Now we hear of merchandise being replenished multiple times a day." ["In Modern Retail Replenishment, Once a Day Isn't Enough," 15 April 2013] He concludes:

"Intra-daily replenishment isn't easy to do, despite the wealth of data that flows through a typical retailer’s supply chain today. Merchandisers face two discrete challenges. One is technological, involving the systems needed to wed real-time demand data with planning. The other is physical – are the store and its distribution centers set up to pick, load and move product multiple times throughout the day, within extremely narrow windows? ... Intra-day replenishment isn’t only for brick-and-mortar operations. E-commerce merchandisers can become more agile in their fulfillment strategies as well. ... Intra-day replenishment might seem exotic to U.S. retailers today. As with every other advance in supply-chain efficiency, however, it’s likely to become a critical part of many merchandising strategies. While the future is impossible to predict, you can count on one thing: retail supply chains aren’t going to get slower."

Another Enterra product that can help with inventory management is our Productive Inventory Module, which strengthens the manufacturer/retailer relationship by helping to determine the optimal product mix for your customer base. By integrating a menu of syndicated data, such as detailed store-related factors, local demographics, point of sale information, and other historical data, the solution is able to identify underperforming products and better predict demand, which helps your organization maximize the productivity of its shelf space.

  • Eliminates or fixes underperforming products
  • Looks for "turned off" products
  • Helps to better predict demand
  • Reduces spoilage

Even when a product is "in stock," if it isn't on the shelf and readily available to consumers, sales can still be lost. That's why Professor Jim Crowell, director of the Supply Chain Management Research Center at the University of Arkansas, reminded the editorial staff at SupplyChainBrain, "the notion of 'in stock' is quite different from that of 'on-shelf availability'." ["Why On-Shelf Availability Is Critical for Retailers," 17 August 2012] Crowell stated that knowing a product is in the building "might seem like a valuable metric, [but] it misses out on the complexity of retailing today." The article concludes:

"Add in the realities of technology and e-commerce, and it becomes increasingly difficult to determine exactly what constitutes a 'shelf.' 'In-stock could mean an item is on the premises but is in the back room,' says Crowell, adding that there exists 'a great abyss' between the mere presence of product and its availability to the consumer. Studies by the Grocery Manufacturers Association and participating retailers have helped to define some of the complexities involved in making stocking determinations today. Beyond the physical shelf, retailers are beginning to offer order and pickup services that bypass traditional store setups. Response to online orders can be as quick as two to three hours. 'Folks are starting to say, "Let's get the inventory as close to the customer as we can,"' Crowell says. In theory, consumers can pick up product at multiple locations. Such a broad network of options is expected to become more prevalent in the years ahead. For multichannel retailers, the challenge becomes how to support both kinds of order streams. Retailers are still grappling with the question of how to manage fulfillment to physical stores as well as the growing volume of internet orders. In addition, they are experimenting with different sizes of stores - yet another complication when it comes to executing efficient inventory management."

Dan Gilmore is probably correct that the out-of-stock situation is getting worse as supply chains get more complex. Unfortunately, there are no silver bullet solutions for inventory management; but, new technologies are available that can help.

April 24, 2013

Personalization and Targeted Marketing, Part 2

In the first segment of this two-part series on personalization and targeted marketing, I discussed the views of a number of analysts who believe that getting to know customers better and then providing them with tailored offers represents the future of advertising. Despite the almost unanimous opinion that personalization is going to be the differentiator that sets successful businesses apart, studies show that many companies have been sluggish to adapt their business models to include an effective digital path to purchase. One of the reasons for this delay may be that they don't know where to begin or how to proceed. Joel Rubinson, who labels the way ahead "behavior marketing," offers five cornerstones that will "equip your organization for behavior marketing." ["From mind marketing to behavior marketing," Joel Rubinson on Market Research, 14 March 2013] They are:

  1. Build interactivity into your brand communications creative [endeavors]. TV advertising should encourage digital and second screen behaviors. Facebook stories should encourage return to the fan page, etc.
  2. Build brand audiences so you have an annuity from your marketing investments. Encourage people to like and follow your brand, sign up for e-mails, and to download branded apps.
  3. Create media targeting strategies that are based on behavior first and only use demos as a last resort for more scale.
  4. Master mobile. It is not too soon to get ahead of the curve and soon will be too late. Mobile is likely to account for 10-20% of your total ad spend by 2020.
  5. Build an insights and brand tracking strategy that converts digital and social behaviors into brand KPI metrics.

Many of the actions contained in Rubinson's cornerstones involve what Laurent Faracci, the U.S. chief strategy and marketing officer for packaged-goods giant Reckitt Benckiser, labels "calls to action." He states that if he had his way, "100% of our digital media would have a call to action." ["Years After Ditching the Click, CPG Marketers Embrace Web Ads With 'Calls to Action'," by Jack Neff, Ad Age, 25 February 2013] As you can see, behavior marketing must embrace both previous and contemplated consumer behavior. Francesco Banfi, Andrea Ghizzoni, Eric Hazan, and Andrea Travasoni, analysts with McKinsey & Company, believe that corporate digital advertising budgets are "often allocated inefficiently. At best, consumer companies evenly distribute their advertising resources across all consumer segments. At worst, they spend the majority of their budgets on the consumer segments with the least conversion potential." ["Digital intelligence: Profiles to profits in a new consumer landscape," Telecom, Media, and High Tech Extranet, 10 April 2013] To optimize marketing spend, they recommend a much more refined approach. The McKinsey & Company approach is based on four key pillars. Their first pillar involves user profiling:

"A profiling algorithm that uses semantic analysis classifies the content of the Web sites that Internet users visit. The algorithm tracks their histories and profiles them using up to 200 variables based on the content of the Web sites they visited and their purchasing behavior. Specifically, each user is automatically scored on each area of interest based on his or her browsing behavior (e.g., daily visits to travel Web sites or time spent reading travel articles). Additionally, online purchases for each product are tracked for at least one month, including whether the user was exposed to online ads. The profiling algorithm makes it possible to create highly descriptive profiles based on analyzing click-through and conversion rate data, which include user interests and sociodemographic descriptors."
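
The kind of scoring the authors describe can be approximated with a minimal sketch. Everything here is hypothetical: the topic keywords, page texts, and two-topic map stand in for the semantic classification step, and McKinsey's actual 200-variable algorithm is proprietary.

```python
from collections import Counter

# Hypothetical keyword-to-topic map standing in for semantic analysis
# of page content; a real system would classify far more richly.
TOPIC_KEYWORDS = {
    "travel": {"flight", "hotel", "itinerary", "beach"},
    "finance": {"loan", "mortgage", "stocks", "savings"},
}

def classify(page_text):
    """Assign a page to the topic whose keywords it mentions most."""
    words = set(page_text.lower().split())
    scores = {topic: len(words & kw) for topic, kw in TOPIC_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def profile(browsing_history):
    """Score a user on each area of interest from visited-page texts,
    normalized so the interest scores sum to 1."""
    interests = Counter()
    for page in browsing_history:
        topic = classify(page)
        if topic:
            interests[topic] += 1
    total = sum(interests.values()) or 1
    return {topic: n / total for topic, n in interests.items()}

p = profile(["cheap flight and hotel deals",
             "beach hotel reviews",
             "mortgage rates today"])
# Travel dominates this user's profile, with some interest in finance.
```

A production version would add purchase tracking, ad-exposure flags, and sociodemographic descriptors to each profile, as the passage notes, but the core loop — classify pages, accumulate per-user interest scores — is the same.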

Gabby Griffith agrees with the McKinsey analysts that the best way to provide a personalized digital path to purchase for consumers is "by gathering as much customer data as possible and building user profiles which can be updated as more data is received." ["Personalisation of ecommerce," eSeller, 18 March 2013] She warns, however, "This kind of use of data is viewed by some as an invasion of privacy so ... be careful not to go too far. But done correctly, personalisation can lead to a better customer experience and improved conversion rates for ecommerce businesses." The second pillar offered by the McKinsey analysts involves segment-product matching. They write:

"Using the relationship between product conversion rates and profiling variables, consumer companies can then employ a statistical algorithm to identify specific user segments for each product. These segments are defined by their appetites for and propensity to buy a certain product and are characterized by a well-defined and homogeneous profile. This makes it possible to create user microsegments that are potentially interested in a certain product."
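
One way to approximate this pillar is sketched below, under the assumption that ad-exposure logs record a user's segment, the product shown, and whether a conversion followed; none of the names or thresholds come from the article.

```python
from collections import defaultdict

def segment_conversion(events, min_exposures=50):
    """events: iterable of (segment, product, converted) tuples from ad
    logs. Returns the conversion rate for each (segment, product) pair,
    skipping pairs with too few exposures to estimate reliably."""
    shown = defaultdict(int)
    bought = defaultdict(int)
    for segment, product, converted in events:
        shown[(segment, product)] += 1
        bought[(segment, product)] += int(converted)
    return {pair: bought[pair] / n
            for pair, n in shown.items() if n >= min_exposures}

def best_segments(rates, product, top=3):
    """Rank segments by their propensity to buy a given product —
    the microsegments worth targeting with tailored ads."""
    ranked = sorted((rate, segment)
                    for (segment, p), rate in rates.items() if p == product)
    return [segment for _, segment in reversed(ranked)][:top]
```

For example, if "travel-lovers" convert on luggage ads at 10% while a generic audience converts at 2%, `best_segments` surfaces the travel segment first — which is the budget-allocation logic the next pillar (targeted advertising) then exploits.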

Analysts at the marketing firm Compete agree. They write, "More work is ... needed to understand and personalize the mobile experience to different segments." ["Aligning Mobile Marketing With Consumer Behavior," Compete Pulse, 11 March 2013] Caspar Craven also agrees. He states that "knowing which segments of their target audience is engaging with what content, allows you to achieve three things." They are:

  • Attribute the real value that marketing generates to the 'sales' process.
  • Segment their target audience and reach out to key targets with the right messages at the right time.
  • Build stronger relationships through enhanced client insights.

Craven adds, "We're seeing much more sophisticated measurement, reporting and analysis coming through, and the plethora of tools will continue to increase." Judi Hand, however, believes that segmentation doesn't go far enough. ["Three Ways to Use Customer Data to Drive Marketing Relevance," Customer Experience Blog, 12 March 2013] She writes:

"Adaptive marketing programs are in tune with customers because they use next-generation analytics to extract valuable insights from mountains of data. They gain ground by leveraging intelligence to segment customers and evaluate their engagement patterns as well as their total customer lifetime value. Marketing masters know that although segmentation has long been a key component of a successful strategy, it shouldn't be the only analytical engine driving the effort. Customer engagement analytics and predictive models are also critical."

The next pillar discussed by the McKinsey analysts involves targeted advertising. They write:

"Once the relationship between products and users has been identified, digital advertising can be pushed to only the most relevant segments using tailored messages. This approach optimizes advertising budget allocation and significantly increases ROI. McKinsey's pilot shows that the solution can improve digital campaign ROI by 250 percent thanks to accurate targeting and ad tailoring."

A 250% improvement in ROI is a big number, and it makes you wonder why some companies have been slow to adopt a good digital path to purchase strategy. According to Hand, one of the reasons that a better ROI can be achieved through targeted marketing is that fragmentation of markets is no longer the barrier it used to be. She explains:

"The one-size-fits-all marketing approach is dead, and influential marketers are using psychographic profiles to develop value propositions that resonate with each customer type. But, how do they efficiently generate marketing materials that are as unique as their customers? They use adaptive engagement technologies to do it all for them—automatically. Gone are the days when markets were considered too fragmented to penetrate cost effectively. Smart marketers use technology-driven marketing solutions to automatically evaluate and create custom-tailored campaigns for each person's unique needs. These solutions analyze customer data across all communication channels and dynamically generate customized content based on personal profiles and buying behavior. The most intelligent solutions shift strategies based on customer online activity, purchases, and other trigger events."

The final pillar discussed by the McKinsey analysts is easy implementation. They explain:

"This solution can easily be integrated into ad servers. Once the algorithm is rolled out, targeted campaigns can be both continuously run and continuously improved - ensuring that only those Web users with the greatest potential for conversion will be targeted. Beyond advertising, this solution can be used to achieve other objectives, such as optimizing a merchant Web site so that it better fits visitor characteristics. Examples of this include customizing Web site layout, offering tailored promotions and prices, and displaying products to maximize sales conversion. Companies can even enrich their CRM databases to enhance their relevance by using Web behavior information to design more sophisticated multichannel campaigns."

Although McKinsey's algorithm is proprietary, the pillars they discuss are applicable to similar solutions offered by other companies. Hand, for example, promotes her company TeleTech's solution. She writes:

"True marketing powerhouses collect multichannel data (including social and mobile), deploy adaptive marketing technologies to extract customer insights, and automatically generate personalized content for each segment. While many companies get bogged down in building their own architecture to execute customer-centric marketing strategies, leaders leapfrog their competition by involving their IT partner and by working with customer experience companies to deploy all-in-one cloud solutions for the data-driven marketing of tomorrow."

My company, Enterra Solutions, can also provide powerful digital path to purchase solutions for companies. I agree with Joel Rubinson that "paid advertising WILL change in fundamental ways in our digital, social, and mobile future." The companies that will do best will be those that start now to understand as much as they can about this digital future.

April 23, 2013

Personalization and Targeted Marketing, Part 1

"The role of digital has become so profound that it is less about how much consumers incorporate it into their lives and more about how much individuals live their lives within the digital realm," writes a team of McKinsey & Company analysts. "When it comes to shopping, for example, digital is not just a tool that consumers reach for at the moment of transaction. Internet-connected devices have become the portals through which consumers engage the entire end-to-end customer decision journey." ["Digital intelligence: Profiles to profits in a new consumer landscape," by Francesco Banfi, Andrea Ghizzoni, Eric Hazan and Andrea Travasoni, Telecom, Media, and High Tech Extranet, 10 April 2013] The authors go on to note:

"In the consideration and evaluation stages of the purchase journey, for example, shoppers heavily engage online resources. Online reviews have the greatest influence on purchases, and in many markets most sales are preceded by searching or researching online. And - after extensive pre-purchase research online - consumers stick with digital at the point of making the actual transaction."

In other words, the digital path to purchase is too important for manufacturers and retailers to ignore. Compete, a marketing company, agrees completely. "Mobile has long been thought of as an emerging channel, yet data recently collected by Compete sister company LightSpeed Research suggests that mobile should no longer be thought of as an emerging channel for consumers. 54% of consumers report having access to a smartphone with Internet access – approaching the number of consumers with a laptop and/or desktop computer." ["Aligning Mobile Marketing With Consumer Behavior," Compete Pulse, 11 March 2013] To reinforce this sentiment, the company provided the following infographic based on research conducted by LightSpeed.

Mobile marketing and consumer behavior

The McKinsey analysts conclude that the digital path to purchase will become a larger focus for marketers -- a trend they are already seeing. They write:

"Many consumer companies are seeing the valuable efficiencies of digital marketing and are heavily shifting their advertising budgets from traditional media (e.g., radio, out-of-home) to digital media - in fact, this is the only growing segment in advertising. In 2010, less than one-fifth of money spent worldwide on advertising was directed toward digital channels. Just three years later, that share has grown to one-fourth and some projections place digital's share in overall advertising at nearly one-third in 2017 with double-digit growth expected. Although digital offers more sophisticated measurability compared with other media, its potential has not yet been fully tapped."

The Compete article concludes that it is important to understand "not just the degree that consumers shop on different devices, but what specific areas of content are most relevant/useful." Joel Rubinson agrees that consumer behavior is critical to the future of marketing. He believes, "Marketing will transform from being a battle for the mind to becoming a battle for behaviors." ["From mind marketing to behavior marketing," Joel Rubinson on Market Research, 14 March 2013] He explains:

"In traditional marketing, we ran mass advertising, targeted based on demos, that was intended to change people’s attitudes about a brand. We measured our effectiveness via surveys like copy testing and brand trackers with attribute ratings. All attitudes, all mind marketing. We HOPED that affecting the mind would result in increased sales behavior but there was little direct connection unless you were in the direct marketing business. In a digital, social, and mobile culture, people can do things with media and marketers should want these behaviors to occur as much as possible so the commercial doesn't pass by like dust in the wind. They amplify the effectiveness of your paid media. Behaviors include sharing links, liking or following a brand, commenting in Twitter or in a forum, taking a picture in Pinterest, searching to find out more and visiting brand.com websites. Add to the list downloading branded apps, Shazamming a TV commercial, scanning a QR code, and who knows what else a few months from now?"

You obviously can't affect consumer behavior if you don't understand what motivates them. That's where big data analytics play a critical role. Gabby Griffith reports, "Personalisation of the ecommerce experience is increasingly seen as the Holy Grail of online shopping and the key to bridging the gap between online and in-store retail." ["Personalisation of ecommerce," eSeller, 18 March 2013] Personalized (or targeted) marketing is considered the Holy Grail because it holds the potential of providing a better ROI for marketing spend. The McKinsey analysts explain:

"By spending customer-segment marketing dollars smarter, companies can either boost revenues without increasing their advertising budgets or maintain their conversion rates but at about half of their current advertising spend. They could, for example, optimize their advertising spend by displaying ads only to the right targets across the full range of Web sites they use to advertise. They could also boost their advertising effectiveness by creatively tailoring messages to user characteristics and profiles."

Despite the already significant trend towards targeted marketing, Rubinson asserts, "The biggest sea change in behavior marketing has yet to occur, so fasten your seat belts, because it is coming. It is the rise of mobile marketing which brings digital marketing right to the point of purchase. It is the convergence of advertising and shopper path to purchase where a marketer will be able to deliver the right message, at exactly the right time and place, to shoppers who that brand has a relationship with but who split their purchases among other brands as well." Richard Ting agrees with Rubinson that there is a long way to go before companies fully embrace targeted marketing. He writes:

"Aside from a select few companies — like Amazon — most brands still have no unified view of what their customers are saying, doing, or buying on their websites, in retail, and across social media. As a result, instead of better targeting and personalizing brand messages, experiences, and deals, most brands are still embracing the 'spray and pray' tactics commonly used during the height of traditional advertising." ["The Customer Profile: Your Brand's Secret Weapon," HBR Blog Network, 11 March 2013]

Caspar Craven agrees with all previously cited analysts and believes that knowing your customer better is "like digging for gold, except that it's not just one nugget but many" for which you are digging. ["Digging for gold? Strategies to drive better client engagement…" Fourth Source, 11 March 2013] He continues:

"With customer intelligence, new technology and advanced business intelligence tools, it is now possible to have a real grasp on which of your clients are engaging with what type of content, and then monitor the level of this engagement. Indeed, in today’s always-on digital world, customer intelligence strategies and tools are a business imperative. It's about getting to grips with, and exploiting, the myriad channels, touchpoints and data sources available in order to take account of the broader perspective."
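Craven's point about monitoring "the level of this engagement" across channels can be sketched as a simple weighted score. This is a hypothetical illustration; the behaviors, weights, and client names are all invented, and a real system would derive weights from its own attribution analysis:

```python
from collections import defaultdict

# Invented weights for different engagement behaviors.
WEIGHTS = {"page_view": 1, "share": 3, "comment": 4, "purchase": 10}

def engagement_scores(events):
    """Sum weighted engagement events per client across all channels."""
    scores = defaultdict(int)
    for client, behavior in events:
        scores[client] += WEIGHTS.get(behavior, 0)
    return dict(scores)

events = [
    ("acme", "page_view"), ("acme", "share"), ("acme", "purchase"),
    ("globex", "page_view"), ("globex", "comment"),
]
print(engagement_scores(events))  # {'acme': 14, 'globex': 5}
```

Even a toy score like this makes Craven's "which clients engage with what" question answerable at a glance; the hard part in practice is collecting and deduplicating the events across touchpoints.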

Since coming to grips with the challenge of getting to know your customers better, and using that knowledge to personalize marketing, is an imperative, in the next post I'll discuss some of the strategies that the analysts cited above (as well as others) recommend for doing just that. As you will learn, the best strategies involve gathering not just a lot of data but the right kinds of data.

April 22, 2013

The Supply Chain Crisis and Disaster Pyramid

Back in 2009, R. Glenn Richey, Jr., a professor in the Manderson Graduate School of Business at the University of Alabama, wrote an article entitled "The supply chain crisis and disaster pyramid: A theoretical framework for understanding preparedness and recovery." [International Journal of Physical Distribution & Logistics Management, Vol. 39, Iss. 7, pp. 619-628] In the article's abstract, Richey wrote:

"The research on supply chains concerning disaster and crisis situations is in its infancy, but rapidly expanding on the backs of top researchers in the field. As with most young research streams there is very little theoretical grounding in extant studies. The purpose of this research is to integrate four prominent existing theoretical perspectives to provide a concise yet holistic framework for grounding future research."

The "four prominent theoretical perspectives" used by Richey in his paper were: the resource-based view; communication theory; the competing values framework; and relationship management theory. These four theories form the corners of a pyramid he called the "Supply Chain Disaster and Crisis Pyramid." To better understand how the pyramid interrelates these theories, let me provide a little background on each theory.

Resource-based view -- According to Wikipedia, "The resource-based view (RBV) as a basis for a competitive advantage of a firm lies primarily in the application of the bundle of valuable tangible or intangible resources at the firm's disposal. To transform a short-run competitive advantage into a sustained competitive advantage requires that these resources are heterogeneous in nature and not perfectly mobile. Effectively, this translates into valuable resources that are neither perfectly imitable nor substitutable without great effort." In other words, a company needs to identify those resources that differentiate it from its competition and provide it with a substantial competitive advantage, and then figure out how to protect those assets.

Communication theory -- Wikipedia states, "Communication theory is a field of information theory and mathematics that studies the technical process of information and the process of human communication." In other words, companies need to understand how they communicate best with the stakeholders in their supply chain and then ensure that those lines of communication are going to be available during a disaster.

Competing values framework -- A University of Twente website provides the following explanation of this theory:

"The Competing Values Framework emerged from a series of empirical studies on the notion of organizational effectiveness (Quinn & Rohrbaugh, 1983). These efforts were an attempt to make sense of effectiveness criteria. Quinn and Rohrbaugh (1983) discovered two dimensions of effectiveness. The first dimension is related to organizational focus, from an internal emphasis on people in the organization to an external focus of the organization itself. The second dimension represents the contrast between stability and control and flexibility and change. The Competing Values Framework received its name because the criteria within the four models seem at first to carry conflicting messages. We want our organizations to be adaptable and flexible, but we also want them to be stable and controlled. [The four models are: the Internal Process Model; the Open Systems Model; the Rational Goal Model; and the Human Relations Model.] ... While the models seem to be four entirely different perspectives or domains, they can be viewed as closely related and interwoven. They are four subdomains of a larger construct: organizational and managerial effectiveness. The four models in the framework represent the unseen values over which people, programs, policies, and organizations live and die."

From a supply chain perspective, there are not only competing values within a company but between companies in the supply chain as well.

Relationship management theory -- Relationship management theory focuses on what is commonly called public relations. According to Wikipedia, "Public relations (PR) is the practice of managing the flow of information between an individual or an organization and the public." In simple terms, communication theory tells you how you are going to get your message to its intended audience while relationship management theory helps you craft the message itself.

Daniel Dumke identifies the four corners of the crisis and disaster pyramid discussed above in simpler terms: resources (resource management); collaboration (relationship management); communication; and contingency planning (competing values). ["Supply Chain Crisis and Disaster Pyramid," Supply Chain Risk Management, 2 November 2011] Those terms are much easier to understand and much more recognizable to most risk managers. Dumke concludes:

"All in all Richey's framework is aimed at providing a guideline for future researchers to find new insights into supply chain disaster management and how to improve supply chain reactions at the intersection of communication, collaboration, resources and values. And these four aspects should not only be considered by researchers, but also by supply chain professionals. I especially liked the inclusion of the competing value theory, which might lead to a shift in research from the currently leading paradigm that goals of supply chain partners are always well aligned. On the other hand, this framework could also be used beyond only disaster and crisis management, the aspects could perhaps prove influential in a larger number of supply chain related research fields and applications."

Jan Husdal was disappointed that Richey's article was so theoretical and aimed at researchers rather than supply chain risk managers. "If you are a supply chain or logistics professional looking for a paper that discusses the intricacies of managing a supply chain in a disaster area, how to prepare and how to recover," Husdal writes, "this is NOT it." ["Pyramidal thoughts," husdal.com, 10 March 2010] Despite his disappointment, Husdal concludes, "It is a framework that is well founded, based on the literature review. It will be interesting to see how many researchers pick up on this article and develop the suggested research strands."

Husdal points out that the planes created by the four corners of Richey's pyramid identify specific relationships that need to be fostered and utilized during a crisis. They are:

1 – the independents
resources – competing values – communication
How do firms (re)act as disconnected and disinformed individual organizations?

2 – the proactive partnership
resources – communication – relationship management
How can firms develop communication and collaboration?

3 – the co-opetition resource
resources – competing values – collaboration
How do firms grow their situational awareness, balancing when to compete and when to collaborate?

This is how the Supply Chain Crisis and Disaster Pyramid looks in three dimensions:

[Image: Supply Chain Crisis and Disaster Pyramid]

Obviously, each "plane" of the pyramid requires different approaches for the stakeholders involved. One of the biggest challenges in determining the right approach to take is identifying everyone who is involved. The more complex supply chains (or value networks) become, the more difficult it is to identify all of the players and their relationships. Supply chain analyst Lora Cecere has frequently insisted that supply chain visibility must at least extend to your suppliers' suppliers and your customers' customers. One recommended approach to understanding complex networks is to map them. Daniel Dumke writes, "There are several key advantages to supply chain mapping." ["Solution to Strategic Supply Chain Mapping," Supply Chain Risk Management, 23 April 2012] Among those advantages are:

  1. To link corporate strategy to supply chain strategy.
  2. To catalog and distribute key information for survival in a dynamic environment (in order) to direct the focus of the managers.
  3. To offer a basis for supply chain redesign or modification.
  4. To display current channel dynamics.
  5. To help define the perspective of the supply chain integration effort through the process of building the strategic supply chain map itself.
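The mapping advantages above lend themselves to treating the supply chain as a directed graph. A minimal sketch, assuming a toy adjacency-list representation (all firm names are invented), showing how Cecere's rule of thumb about seeing your suppliers' suppliers becomes a two-hop traversal:

```python
# Each firm maps to the list of firms it buys from (upstream links).
supply_links = {
    "OurCo":     ["SupplierA", "SupplierB"],  # tier-1 suppliers
    "SupplierA": ["RawCo", "PartsCo"],        # their suppliers (tier 2)
    "SupplierB": ["PartsCo"],
}

def tier_n_suppliers(graph, firm, depth):
    """Collect all suppliers exactly `depth` tiers upstream of `firm`."""
    frontier = {firm}
    for _ in range(depth):
        frontier = {s for f in frontier for s in graph.get(f, [])}
    return frontier

# Visibility to your suppliers' suppliers is a depth-2 query.
print(sorted(tier_n_suppliers(supply_links, "OurCo", 2)))  # ['PartsCo', 'RawCo']
```

Note that PartsCo appears only once in the result even though two tier-1 suppliers depend on it; surfacing exactly that kind of shared dependency is what makes a map useful for risk management.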

Even though Richey's article didn't provide any recommended solutions for dealing with supply chain crises and disasters, if you are a supply chain risk manager, his framework is worth considering. Simply knowing which of your stakeholders are independents, which are cooperative, and which are proactive partners is valuable.

April 19, 2013

The Power of Stories

Most of us are familiar with the tale of Scheherazade, the amazing storyteller of legend who was able to spare her own life by keeping her husband, the king, so enthralled with her tales that he couldn't convince himself to behead her as he had a thousand other wives before her. John P. Kotter, a renowned expert on leadership at the Harvard Business School, writes, "Over the years I have become convinced that we learn best--and change--from hearing stories that strike a chord within us." ["The Power Of Stories," Forbes, 12 April 2006] As an academic, Kotter admits that he wrote many a "dry" volume in his early years. Those books were undoubtedly well-written and thought-provoking, but they were also lightly read. It was only when he learned to become a storyteller that Kotter says his readership improved. He continued:

"As I look around me today, I see that too few business leaders grasp the idea that stories can have a profound effect on people. The gestures made (or not made) by leaders can turn into the stories that powerfully affect behavior. Leaders who understand this and use this knowledge to help make their organizations great are the ones we admire and wish others would emulate. Those in leadership positions who fail to grasp or use the power of stories risk failure for their companies and for themselves."

Kotter noted that stories can have corporate impact both internally and externally. "The stories that a company broadcasts about itself," he wrote, "can also have a powerful impact on customers, shareholders and employees." He continued:

"It is important that executives ask themselves these questions: What are the stories that define us in light of our customers, employees and shareholders? And are these the stories we want to tell--and have others tell about us? If the answer to these questions is 'No,' then you must start taking the actions that will replace the old stories. Your success depends on the honesty and integrity of your actions as well as on the emotional impact they make. The cynicism created when the 'stories' are proved to be false or misleading can be extremely damaging."

Roger Jones, who advises executives on telling stories, asserts, "Every day we are bombarded with more and more information and our business world is becoming increasingly complex. We need a better way to persuade people, get your messages to stick and inspire people to take action. PowerPoint presentations make audiences doze off, facts and figures often bore people." ["Fables for board tables," by Emma Jacobs, Financial Times, 27 September 2011] Jacobs adds, "Stories are not just for speeches, but can be used in company literature or marketing." Stephen Denning, who has held a number of positions at the World Bank, told Jacobs, "Can you think of anyone who turned around a situation who didn’t use a story? If you think of all the great religious leaders, philosophers, generals, [political] leaders, what do they have in common? They were all great storytellers." In the United States, Abraham Lincoln immediately pops to mind.

Peter Guber, a professional storyteller, asserts, "Telling stories is not just the oldest form of entertainment, it's the highest form of consciousness. The need for narrative is embedded deep in our brains. Increasingly, success in the information age demands that we harness the hidden power of stories." ["The Inside Story," Psychology Today, 23 January 2012] Dr. Pamela Brown Rutledge agrees with Guber. "Stories leap frog technology," she writes, "taking us to authentic experience." ["The Psychological Power of Storytelling," Psychology Today, 16 January 2011] In the information age, stories can be related across an ever increasing array of technological platforms. Rutledge calls this "transmedia storytelling." She calls transmedia storytelling "the ultimate mashup of ancient traditions and new communications models." She goes on to claim, "The orchestration of a story across multiple media platforms can be a complex creative endeavor." From the beginning, marketers have known that their profession involves storytelling. Those that tell stories well do the best. Those who simply try to pass along information don't do quite as well. Rutledge continues:

"Even with technology's increasingly sophisticated and jaw-dropping capabilities, the tools are becoming simultaneously more accessible and user-friendly. So much so, that the boundaries are blurring not just across technologies but also across the people who are creating, using, producing, augmenting, distributing, hacking, mashing, and every other '-ing' imaginable. In spite of all the excitement, however, the human brain has been on a slower evolutionary trajectory than the technology. Our brains still respond to content by looking for the story to make sense out of the experience. No matter what the technology, the meaning starts in the brain. The transmedia producer may get the credit line, but the success of the transmedia effort rests on the resonance, authenticity, and richness created by the storyteller."

One of the transmedia channels through which stories can be transmitted is social media. "Companies are spending countless hours and millions of dollars trying to master social media," writes Dan Singer. ["The power of storytelling: What nonprofits can teach the private sector about social media," McKinsey Quarterly, February 2011] Singer interviewed Jennifer Aaker, a marketing professor at Stanford University, and Andy Smith, a marketing strategist, to learn more about a book they wrote entitled The Dragonfly Effect. They told Singer that one of the four wings of their "dragonfly" model "is engagement, which they define as 'truly making people feel emotionally connected to helping you achieve your goals' through storytelling, authenticity, and establishing a personal connection." By now you should begin to see a pattern in this narrative about storytelling. All of the experts talk about the importance of connection and authenticity when stories are told. Guber calls stories "state-of-the-heart technology." Aaker and Smith assert there are four important elements to getting your message across. They are: telling a story, empathizing with your audience, emphasizing authenticity, and matching the media with the message.

So how do you tell a good story? Aaker states, "Good stories have three components: a strong beginning, a strong end, and a point of tension. Most people confuse stories with situations. They'll tell about a situation: X happened, Y happened, Z happened. But a good story takes Y, the middle part of the story, and creates tension or conflict where the reader or the audience is drawn into the story, what's going to happen next." That's exactly what Scheherazade did each night as she stopped her new story at a point where the king was desperate to learn what was going to happen next. Guber adds, "The first rule of telling stories is to give the audience — whether it's one business person or a theater full of moviegoers — an emotional experience. The heart is always the first target in telling purposeful stories. Stories must give listeners an emotional experience if they are to ignite a call to action. By far, the most effective and efficient way to do that is through the use of metaphor and analogy." To learn more about the power of analogies, read my post entitled Analogies and Innovation.

Aaker told Singer, "Treating stories as assets is an underrealized idea right now." Perhaps if business leaders thought of stories as data it would be different. After all, the World Economic Forum has declared data a new commodity asset, like gold or oil. A good story has value. Most of us think of stories as narratives, but, writes Christina Shepherd McGuire, "Storytelling doesn't have to consist of lengthy narratives relaying epic adventures. You can tell your stories via captivating images, action-packed videos, short blog posts or social media campaigns." ["Storytelling: A Tale of Creating Customer Loyalty," by Evan Buzbee, Transworld Business, 26 March 2013] Well-known internet entrepreneur Guy Kawasaki uses the word "enchantment" to describe how to win customer loyalty. ["Enchanting without fairy tales," by Philip Delves Broughton, Financial Times, 16 March 2011] Kawasaki insists, "There are all kinds of money to be made if you can make a phone sound magical, a PC miraculous or a pair of running shoes life-changing." At the same time, you need to sound authentic. That's not an easy trick. As Broughton concludes, "Enchantment, whether in fairy stories or in marketing departments, never happens by accident."

Mike Lehr raises an interesting caveat about the use of stories. He agrees that "stories galvanize people, helping them to learn, to coalesce around ideas"; but, he argues, if the story becomes such a critical part of a company's identity that it "could thwart change and innovation." ["Stories as Inhibitors of Change, Innovation," Influencing and Problem Solving, 11 March 2013] He explains, "When we seek to change, to innovate, we will likely need to question the validity of existing stories no matter how factual and truthful they seem. They are likely prisons inhibiting us from considering what is outside their walls." In my post about analogies, I noted the same thing can occur. To avoid this trap, stories need to be continually updated and changed. When that happens, Dr. Rutledge concludes, stories can be a catalyst for change. She writes:

"[Transmedia] technologies have created a demand for fundamentals: authenticity, participation, and engagement. Special effects and funny Super Bowl ads are fine, but they are expensive one-offs if they do not touch the core of experience. I don't care how you calculate, that's not going to get you a very good ROI. When organizations, causes, brands or individuals identify and develop a core story, they create and display authentic meaning and purpose that others can believe, participate with, and share. This is the basis for cultural and social change. This is a skill worth learning."

Gina Rudan, the author of Practical Genius, told Jacobs, "Storytelling goes wrong when it lacks authenticity or truth." But all of the pundits noted above agree that when you get it right, storytelling is a powerful tool in the kit of any corporation.

April 18, 2013

Quantum Computing: Is the Future Here?

The world of quantum mechanics has one of the strangest landscapes in science. It is the world of atomic and sub-atomic particles. Yet it is also a field that holds great promise for better understanding the entire universe. One of the reasons that the field of quantum mechanics may provide the ultimate breakthrough to our understanding of how things work is that it is the gateway to quantum computing.

Because the qubit (i.e., the quantum bit) is the irreducible carrier of quantum information, it also requires the least amount of energy to control; hence, quantum computers hold the promise of small size, great speed, and efficient operation. It should come as no surprise that developing a commercial-grade quantum computer has become the Holy Grail of computing. Quentin Hardy reports that Lockheed Martin believes development of a reliable quantum computer has finally reached the point where it can be scaled for commercial use. ["A Strange Computer Promises Great Speed," New York Times, 21 March 2013] Hardy reports:

"A powerful new type of computer that is about to be commercially deployed by a major American military contractor is taking computing into the strange, subatomic realm of quantum mechanics. In that infinitesimal neighborhood, common sense logic no longer seems to apply. A one can be a one, or it can be a one and a zero and everything in between — all at the same time. It sounds preposterous, particularly to those familiar with the yes/no world of conventional computing. But academic researchers and scientists at companies like Microsoft, I.B.M. and Hewlett-Packard have been working to develop quantum computers. Now, Lockheed Martin — which bought an early version of such a computer from the Canadian company D-Wave Systems two years ago — is confident enough in the technology to upgrade it to commercial scale, becoming the first company to use quantum computing as part of its business."

This is big news because "to date, quantum computers have been implemented so that programming their operation was, in essence, hardwired into their essential structure. Although many useful demonstrations of quantum computing have resulted from such special-purpose devices, they are basically one-problem computers which cannot easily be reprogrammed or scaled to attack larger problems. As early models of practical quantum computers, they don't make the grade." ["Quantum computer with separate CPU and memory represents significant breakthrough," by Brian Dodson, Gizmag, 12 February 2012] Because mastering quantum computing has proved to be so difficult, Hardy reports, "Skeptics say that D-Wave has yet to prove to outside scientists that it has solved the myriad challenges involved in quantum computation." Regardless of the skeptics, Ray Johnson, Lockheed's chief technical officer, is a believer. He told Hardy, "This is a revolution not unlike the early days of computing. It is a transformation in the way computers are thought about." He'll be correct if the D-Wave computer works as advertised.

Because of their potential for high-speed computation, quantum computers could tackle some of the world's most difficult challenges. Lockheed Martin plans to use the D-Wave computer "to create and test complex radar, space and aircraft systems." But Hardy notes that quantum computers are ideal for research in a number of fields including medicine and artificial intelligence. One of the reasons that there are vocal skeptics following the Lockheed Martin announcement is that in 2007 D-Wave announced it would have a commercially-available quantum computer within a year. The company had to withdraw that claim shortly after it was made. For more on the uproar that created back then, read my posts entitled Quantum Computing -- Not Coming Soon to a Store Near You and Quantum Computing or Quackery? Hardy continues:

"People working in quantum computing are generally optimistic about breakthroughs to come. ... Quantum computing has been a goal of researchers for more than three decades, but it has proved remarkably difficult to achieve. The idea has been to exploit a property of matter in a quantum state known as superposition, which makes it possible for the basic elements of a quantum computer, known as qubits, to hold a vast array of values simultaneously. There are a variety of ways scientists create the conditions needed to achieve superposition as well as a second quantum state known as entanglement, which are both necessary for quantum computing. Researchers have suspended ions in magnetic fields, trapped photons or manipulated phosphorus atoms in silicon."
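The superposition Hardy describes can be illustrated with a toy state-vector model. This is a plain-Python sketch, not how real quantum hardware is programmed; the Hadamard gate shown is the standard operation for putting a basis state into equal superposition:

```python
import math

# One qubit as a pair of amplitudes [a, b] for |0> and |1>,
# with |a|^2 + |b|^2 = 1.
def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state vector."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

zero = [1.0, 0.0]       # the classical-looking |0> state
plus = hadamard(zero)   # equal superposition of |0> and |1>
probs = [abs(amp) ** 2 for amp in plus]
print(probs)            # ≈ [0.5, 0.5]: measurement yields 0 or 1 with equal chance
```

The "vast array of values simultaneously" in the quote is this in miniature: after the gate, the qubit's state carries weight on both 0 and 1 at once, and only measurement collapses it to a single answer.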

Last year, John Markoff reported, "Australian and American physicists have built a working transistor from a single phosphorus atom embedded in a silicon crystal." ["Physicists Create a Working Transistor From a Single Atom," New York Times, 19 February 2012] Andreas Heinrich, a physicist at IBM, told Markoff that "the research was a significant step toward making a functioning quantum computing system." Gerhard Klimeck, a professor of electrical and computer engineering at Purdue, told Markoff, "It shows that Moore's Law can be scaled toward atomic scales in silicon." Markoff explains, "Moore’s Law refers to technology improvements by the semiconductor industry that have doubled the number of transistors on a silicon chip roughly every 18 months for the past half-century. That has led to accelerating increases in performance and declining prices."

Catherine Zandonella reports that another breakthrough has been made: qubits can now be manipulated "at room temperature. Until recently, temperatures near absolute zero were required, but new diamond-based materials allow spin qubits to be operated on a table top, at room temperature." ["Quantum computing moves forward," Princeton Journal Watch, 7 March 2013] Hardy reports, "In the D-Wave system, a quantum computing processor, made from a lattice of tiny superconducting wires, is chilled close to absolute zero. It is then programmed by loading a set of mathematical equations into the lattice." Obviously, a system that didn't need to be chilled close to absolute zero would be a cheaper system to operate. A second big breakthrough, Zandonella reports, "is the ability to control these quantum bits, or qubits, for several seconds before they lapse into classical behavior. ... A remaining challenge is to find ways to transmit quantum information over long distances."

One of the more mysterious characteristics of quantum mechanics that affects quantum computing is entanglement. "Many quantum algorithms require that particles' spins be 'entangled,' meaning that they're all dependent on each other," writes Larry Hardesty. "The more entanglement a physical system offers, the greater its computational power." ["Proving quantum computers feasible," MIT news, 27 November 2012] Hardesty continues:

"Until now, theoreticians have demonstrated the possibility of high entanglement only in a very complex spin chain, which would be difficult to realize experimentally. In simpler systems, the degree of entanglement appeared to be capped: Beyond a certain point, adding more particles to the chain didn't seem to increase the entanglement. ... However, in the journal Physical Review Letters, a group of researchers at MIT, IBM, Masaryk University in the Czech Republic, the Slovak Academy of Sciences and Northeastern University proved that even in simple spin chains, the degree of entanglement scales with the length of the chain. The research thus offers strong evidence that relatively simple quantum systems could offer considerable computational resources."
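The entanglement these results concern can be shown in miniature with a two-qubit Bell state. A toy sketch, assuming nothing beyond the standard criterion that a two-qubit state is unentangled exactly when its four amplitudes factor into a product of two one-qubit states:

```python
import math

# A two-qubit state is a list of four amplitudes for |00>, |01>, |10>, |11>.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]   # the Bell state (|00> + |11>) / sqrt(2)

def is_product_state(state, tol=1e-9):
    """A state [a00, a01, a10, a11] factors iff a00*a11 == a01*a10."""
    a00, a01, a10, a11 = state
    return abs(a00 * a11 - a01 * a10) < tol

print(is_product_state(bell))          # False: the two qubits are entangled
print(is_product_state([1, 0, 0, 0]))  # True: |00> is unentangled
```

In the Bell state, neither qubit has a definite value on its own, yet measuring one instantly fixes the other; that mutual dependence is the computational resource the MIT-led work shows scales with chain length.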

Jacob Aron reports that "chips made by D-Wave ... have passed two tests that suggest that the bits in their machines have the quantum property of entanglement. That doesn't end the controversy, but it strengthens D-Wave's claim that a revolution in computing is a lot closer than we thought." ["Controversial quantum computer aces entanglement tests," NewScientist, 8 March 2013] Whether the D-Wave computer proves out or not, breakthroughs continue to be made.

Joseph Brean reports, "Canadian researchers have succeeded in side-stepping an obstacle of Heisenberg's Uncertainty Principle, a strange law of the quantum world that says precise measurement is impossible, because the act of measuring changes what you are trying to measure. Their experiment in an Ottawa lab — in which they measured the polarization states of single light particles, called photons — is seen as a small step toward a quantum computer, a major goal of modern science." ["Canadian researchers take a sneak peek at Schrödinger’s Cat and a step toward a quantum computer," National Post, 4 March 2013] A research team at Yale "recently developed a new way to change the quantum state of photons, the elementary particles researchers hope to use for quantum memory." ["Yale Researchers Ride Photons in Search of Quantum Computer," by Klint Finley, Wired, 29 March 2013] Science Daily reports, "Carbon nanotubes can be used as quantum bits for quantum computers. A study by physicists at the Technische Universitaet Muenchen (TUM) has shown how nanotubes can store information in the form of vibrations. Up to now, researchers have experimented primarily with electrically charged particles. Because nanomechanical devices are not charged, they are much less sensitive to electrical interference." ["Quantum Computers Counting On Carbon Nanotubes," 21 March 2013] "UCLA physicists have pioneered a new technique that combines two traditional atomic cooling technologies and brings normally springy molecules to a frozen standstill." ["Quantum Computing? Physicists' New Technique for Cooling Molecules May Be a Stepping Stone to Quantum Computing," Science Daily, 27 March 2013] Finally, Adrian Cho reports:

"You've heard the hype a hundred times: Physicists hope to someday build a whiz-bang quantum computer that can solve problems that would overwhelm an ordinary computer. Now, four separate teams have taken a step toward achieving such 'quantum speed-up' by demonstrating a simpler, more limited form of quantum computing that, if it can be improved, might soon give classical computers a run for their money. But don't get your hopes up for a full-fledged quantum computer. The gizmos may not be good for much beyond one particular calculation." ["New Form of Quantum Computation Promises Showdown With Ordinary Computers," Science, 21 December 2012]

With new breakthroughs being announced almost daily, if the world of quantum computing hasn't already arrived with the D-Wave system, its arrival is likely in the foreseeable future.

April 17, 2013

Gamification and the Future

According to Wikipedia, "Gamification is the use of game thinking and game mechanics in a non-game context in order to engage users and solve problems." The staff at Mashable adds:

"Typically gamification applies to non-game applications and processes, in order to encourage people to adopt them, or to influence how they are used. Gamification works by making technology more engaging, by encouraging users to engage in desired behaviors, by showing a path to mastery and autonomy, by helping to solve problems and not being a distraction, and by taking advantage of humans' psychological predisposition to engage in gaming. The technique can encourage people to perform chores that they ordinarily consider boring, such as completing surveys, shopping, filling out tax forms, or reading web sites. Available data from gamified websites, applications, and processes indicate potential improvements in areas like user engagement, ROI, data quality, timeliness, or learning."

It should be clear from those descriptions that gamification can be used in any number of situations and in any number of domains. For example, gamification is one of five technologies predicted to improve education in the years ahead. According to one website, gamification in education will include: educational programming tools, educational games, achievements & badges, student-developed applications, and self-paced learning modules. Mother of three and founder of TechMamas.com, Beth Blecherman, is a strong believer that gamification has a place in the classroom. She claims to have witnessed "first-hand how making activities into a game suddenly turns stubborn stand-offs into engaging fun." ["Does Gamification Help Classroom Learning?" Mashable, 8 March 2013] She doesn't claim that gamification is the be-all and end-all for learning, but insists that it is a valuable tool in education's toolkit. She explains:

"Engagement is key. If my kids aren't enjoying a learning environment, whether it's in school, summer camp or a sports team, I hear about it very quickly. In our house, Little League is all about how the kids and the coaches all gel together, and not necessarily about the game itself. ... How does this all translate to the classroom? It looks like the way to get kids to learn during 'game play' is by engaging them to investigate a topic, not just play to earn points. It's also important to understand why. For some kids, this type of learning may be more effective than traditional classroom lectures. For my kids, the answer is 'yes' but it won't work for everyone."

Imagining the use of games in an educational environment is not that hard to do. After all, most kids love to play games. But it's not just kids who love to play games; most people love to play games throughout their lives. The type and level of sophistication of those games may change, but the satisfaction we gain from playing them apparently remains fairly constant throughout life. That is why Sam Laird writes, "From recycling, to personal health, to corporate culture, gamification is seeping into all aspects of everyday life." ["Why Gamification Can't Be Stopped," Mashable, 7 April 2012] Rajat Paharia, founder of a company called Bunchball, which helps companies use gamification, told Laird, "Gamification is all about providing sustained user engagement. The word itself implies a transformation of something that exists, and people are starting to see more and more how they can apply it to their own situations."

Gabe Zichermann, founder of Dopamine, a strategic consultancy specializing in engagement science, claims that startup companies primarily use gamification to acquire customers. ["The Game Mechanics of Customer Loyalty," Mashable, 24 May 2012] Zichermann notes that gamification doesn't necessarily have to take the form of something that looks like a game. Gamification is a call to action. He explains:

"Think of the top five social actions you want your users to take, and use verbs to describe them. Don't use 'buy' or 'subscribe' because those are outgrowths of good engagement rather than ends in and of themselves. Concepts such as 'like,' 'comment,' 'argue,' or 'challenge' are good examples. It's critical to think of them as social actions because they will also help you attract new users."
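Zichermann's advice above amounts to defining a small vocabulary of trackable social actions. A hypothetical sketch of what that might look like in code follows; the point values and the `EngagementTracker` class are illustrative assumptions, not anything described in the article:

```python
# Hypothetical sketch: tracking the kinds of social actions Zichermann
# suggests ("like," "comment," "argue," "challenge"). Point values are
# invented for illustration.
from collections import Counter

SOCIAL_ACTIONS = {"like": 1, "comment": 3, "argue": 5, "challenge": 5}

class EngagementTracker:
    def __init__(self):
        self.scores = Counter()

    def record(self, user, action):
        """Award points only for the defined social actions."""
        if action not in SOCIAL_ACTIONS:
            raise ValueError(f"unknown action: {action}")
        self.scores[user] += SOCIAL_ACTIONS[action]

    def top_users(self, n=3):
        """A leaderboard view: the most engaged users first."""
        return self.scores.most_common(n)

tracker = EngagementTracker()
tracker.record("alice", "comment")
tracker.record("alice", "challenge")
tracker.record("bob", "like")
print(tracker.top_users())  # [('alice', 8), ('bob', 1)]
```

Note that nothing in the sketch looks like a game in the conventional sense; it simply makes the desired behaviors countable, which is what turns them into mechanics a business can act on.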

Using gamification for the purposes of marketing is a fairly well-known technique. Most of us have received emails offering us the opportunity to join a contest or something similar. Sharlyn Lauby believes that companies should also look internally for ways to use gamification. She writes, "An example would be to create a game to learn something new for work: While a lecture session could potentially turn off employees and prevent learning, a game that teaches the same skills could lead to an interested employee that is eager to learn." ["The Evolution of Gamification in the Workplace," Mashable, 15 June 2012] She continues:

"Companies need to embrace the idea of blending games with work. And in order for that to happen, gamification needs to be perceived as a profession, not a frivolous activity. Proper gamification must have a minimum knowledge base and skill set about the given subject matter, as well as both theory and practical application of its core principles. Lastly, it must create a common vehicle for advocacy and ethics to maintain standards."

Lauby assumes that most workplace gamification activities will involve IT technologies. The reason for that assumption is that the "number of people playing video games in the U.S. has risen 241% since 2008." That means that most employees are already familiar with some of the technologies they will be asked to use as part of their training. This familiarity enhances the prospects for success and reduces training costs. Since so many sectors are starting to adopt gamification, Lauby concludes, "While some aspects of gamification are still evolving, there's clearly a vibrant future ahead for the profession." In a follow-on article, Lauby notes, "Often the hardest part of introducing games into the workplace isn't the game itself; it's selling the idea to senior leadership." ["Give Your Business a Boost With Games in the Workplace," Mashable, 27 July 2012] She goes on to note that once they see the results, most executives are convinced. She concludes, "Whether it's brand awareness, recruitment or high performance, companies are using gamification in varied forms to accomplish their business goals."

Another use of gamification in the workplace focuses on innovation. According to the staff at Market, "Gamification will be used for innovation by 50% of companies and will become as important as Facebook for customer retention, Gartner predicts." ["50% of companies will gamify innovation by 2015," 18 February 2013] According to the article, Gartner identified "four principal means of driving engagement using gamification." They are:

1. Accelerated feedback cycles: Gamification increases the velocity of feedback loops, such as annual performance reviews, to maintain engagement over time.

2. Clear goals and rules of play: Gamification provides clear goals and well-defined rules of play to ensure players feel empowered to achieve goals.

3. A compelling narrative: While real-world activities are rarely compelling, gamification builds a narrative that engages players to participate and achieve the goals of the activity.

4. Tasks that are challenging but achievable: Gamification provides short-term, achievable goals to maintain engagement while seeking to fulfill more long-term goals.
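Two of Gartner's drivers, accelerated feedback and achievable short-term goals, can be sketched in a few lines of code. The goal name, target, and message format below are hypothetical, chosen only to show how immediate progress feedback differs from an annual review cycle:

```python
# Illustrative sketch of two of Gartner's engagement drivers: a clear,
# well-defined goal plus immediate feedback after every step.
from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    target: int      # the "rule of play": a clear, achievable target
    progress: int = 0

    def advance(self, amount=1):
        """Accelerated feedback: report status after each step,
        rather than once a year at review time."""
        self.progress = min(self.progress + amount, self.target)
        pct = 100 * self.progress // self.target
        status = f"{self.name}: {pct}% complete"
        if self.progress == self.target:
            status += " -- achieved!"
        return status

goal = Goal("Complete 5 training modules", target=5)
for _ in range(5):
    status = goal.advance()
print(status)  # Complete 5 training modules: 100% complete -- achieved!
```

The design choice worth noting is the feedback frequency: the player hears a status update on every action, which is what keeps short-term engagement alive while the longer-term goal is pursued.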

Other analysts indicate that gamification, when partnered with big data analytics, can be used to address security and fraud challenges. ["How Big Data Analytics and Gamers Could Solve Fraud and Security," by Heong Weng Mak, Gamification Corp., 29 March 2013] "When a system is constructed," writes Mak, "whether it is an online game or within an enterprising organization, raw data is constantly generated by its users' actions and stored onto servers. When these vast quantity of unstructured data are compiled and analyzed through analytical programs, amazing things begin to happen." One of those "amazing things" is the ability to discover whether an individual is "gaming" the system. People who game any system create problems for the rest of us and generally end up costing us money because we have to cover the costs of fraud.
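The fraud-detection idea Mak describes, mining raw user-event data for players who are "gaming" the system, can be hedged into a minimal sketch. The statistic (a z-score over per-user action counts), the threshold, and the sample data below are all assumptions for illustration, not anything specified in the article:

```python
# Hedged sketch: flag users whose activity volume is a statistical outlier,
# a crude signal that they may be gaming the system (e.g., botting).
import statistics

def flag_outliers(actions_per_user, z_threshold=1.5):
    """Return users whose action counts sit far above the group mean.
    The 1.5 z-score cutoff is deliberately lenient for small samples."""
    counts = list(actions_per_user.values())
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:  # everyone behaves identically; nothing to flag
        return []
    return [user for user, c in actions_per_user.items()
            if (c - mean) / stdev > z_threshold]

# Hypothetical event counts; "u5" looks automated next to the others.
events = {"u1": 40, "u2": 35, "u3": 42, "u4": 38, "u5": 900}
print(flag_outliers(events))  # ['u5']
```

A production system would of course use richer features than raw counts (timing patterns, action sequences, and so on), but the core idea is the same: compile the event data every user generates anyway, then look for behavior that deviates from the population.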

Because people like to play games, the gamification of all sorts of activities is likely to happen over the next few years. Clearly, however, marketers will embrace gamification. They are always looking for new and exciting ways to engage with consumers.