

19 posts from June 2013

June 28, 2013

Tableware and Taste

Our sense of taste is an interesting topic. In other posts on the subject, I've noted that other senses (especially smell and sight) play a role in how we judge the taste of foods. "To make your yogurt seem more decadent," writes Emily Sohn, "eat it from a plastic spoon." At least that is what a new study suggests, "which found that the color, weight and shape of utensils influences our experience of the food we eat with them." ["Utensil Color, Shape, Size Affect Food Flavor," Discovery, 25 June 2013] Perhaps being born with a silver spoon in one's mouth isn't so desirable after all. Sohn continues:

"The findings add to plenty of other evidence that the taste of food varies depending on the accessories we use to eat it. When people drink from a glass that has a color perceived as 'cool,' for example, they find the beverage more thirst-quenching. And food eaten out of a heavier bowl seems like a larger portion. To see what kind of influence utensils might have on food experiences, researchers from the University of Oxford in the United Kingdom conducted a series of experiments. They added hidden weights to some plastic spoons, for example, and asked people to rate the sweetness, density and expensiveness of yogurt. They also played around with the color and shape of cutlery. People perceived yogurt as denser and more expensive when eaten with lighter spoons, the researchers report today in the journal Flavour, probably because those spoons matched their expectations. But yogurt tasted with that spoon was also rated as less sweet than when eaten with heavier or larger spoons. When using a blue spoon, people thought pink yogurt tasted saltier than white yogurt, possibly because salty snack foods in the U.K. often come in blue packages with white lettering — leading to unrealized salty expectations from the white yogurt. And people thought yogurt tasted sweeter when they ate it with a white spoon than with a black one."

Writing about the results of the same study, an article by Agence France-Presse stresses the point that, beyond how tableware can affect the taste of food, it could have serious health consequences as well. ["Tableware color influences food flavor - study," Rappler, 26 June 2013] It explains:

"British hospitals use red trays in a program to combat malnutrition, but may have chosen the worst possible color, according to a study linking the tinge of tableware to food enjoyment. ... 'Red could ... be used to serve food to people who need to reduce their food intake, but should certainly not be used for those who are underweight,' the team wrote in the journal Flavour. British hospitals use red trays to make it easier for nurses to identify people who need help eating. 'Red appears to be the worst possible tray color ... for those individuals who are being encouraged to eat more,' the researchers warned."

If true, the study could prompt a whole new line of dieting dishware for tableware manufacturers. Sohn reports that the study included another experiment with cheese. She writes:

"In a final experiment with different types of cheese, the researchers found that people rated cheese as saltier when they sampled it with a knife instead of a fork, spoon or toothpick. This may have been because eating with a knife is an unusual behavior, the researchers speculated, or because it reminded people of using a knife to try samples in a cheese shop, where cheeses tend to be more aged and therefore saltier."

It's clear that we have a lot left to learn about our sense of taste and flavor preferences. The Oxford researchers obviously came up with more questions than they did answers. Not knowing how the study was conducted, it would be interesting to see if the results held true if the experiments were conducted using participants with sensory handicaps (e.g., blind or deaf volunteers). Sohn asserts, "The findings add yet more evidence that subtle details about the way we eat influence how we experience the textures and flavors of our food. The results may also lead to new strategies for more or less healthful eating." I'm not sure why anyone would want to develop a strategy for less healthful eating, but you never know. In a press release that accompanied publication of the study, report co-author Vanessa Harrar stated:

"How we experience food is a multisensory experience involving taste, feel of the food in our mouths, aroma, and the feasting of our eyes. Even before we put food into our mouths our brains have made a judgment about it, which affects our overall experience. Subtly changing eating implements and tableware can affect how pleasurable, or filling, food appears. So, when serving a dish, one should keep in mind that the color of the food appears different depending on the background on which it is presented (plate or cutlery) and, therefore, tastes different."

I'll discuss more about how senses affect our taste in future posts.

June 27, 2013

Understanding the Digital Path to Purchase

Analysts have tried depicting a consumer's digital path to purchase in a variety of ways — loops within loops, funnels, circles, and so forth. The reason it is so difficult to depict is that consumers have many ways to access the Web and you never really know at what point they are going to jump onto the path. For example, the path to purchase could begin after seeing an ad in a magazine while sitting in the doctor's office, after watching a commercial on television, after hearing an advertisement on the radio, after receiving a flyer in the mail, or with an online search. As a result, consumers may jump onto the "digital" path to purchase anywhere along it. They don't necessarily begin down that path digitally. Regardless of where they jump onto the digital path, Tyson Goodson believes, "There are six distinct segments of consumers who exhibit similar behaviors and intent." ["The 6 Types of Digital Consumers and Their Paths to Purchase," Compete, 30 May 2013] Goodson writes, "Of consumers who utilized digital at least once in their purchase pathway, six distinct segments emerged. The segments, as identified by GroupMNext, are:

  • "Basic Digital Consumers: These are not highly digital users. They are comfortable with Internet shopping and research, but they are not mobile or social and have the second-highest likelihood of buying offline.

  • "Retail Scouts: These consumers have short journeys and prefer retail sites to brand sites. They use mobile, but are twice as likely to use it in the home as out. They are comfortable buying online but did not express a preference between online and offline.

  • "Brand Scouts: Brand Scouts are the spiritual partner to the Retail Scouts, except instead of having a favorite retailer, they have a favorite brand. When asked, 72% said they start their journey with a brand in mind.

  • "Digitally Driven Segment: They use every digital tool at their disposal. They use social and mobile more than any other segment in the study, value convenience above all else and they do everything in their power to avoid physically going to a store. The Digitally Driven exist in good numbers already, but within five years this will be the dominant segment of consumers.

  • "Calculated Shoppers: These shoppers seem to know they are going to make a purchase, but they are deciding which brand to choose. They are similar to the Digitally Driven Segment, but have no urgency to their purchase and they're willing to take the time to get the best deal.

  • "External Shoppers: These are non-mobile shoppers. They want the answers to, 'Should I buy?', 'What do I buy?' and 'What brand do I buy?' – all at the same time. These shoppers have no urgency to make a purchase and they do their research on desktop and laptop computers."
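The six segments above are defined by observable behaviors, so they can be sketched as a simple rule-based classifier. The behavior flags and the branching order below are illustrative assumptions for this post, not part of GroupMNext's methodology:

```python
# Toy classifier mapping observed shopping behaviors to one of the six
# GroupMNext segments. The flags and branching order are assumptions made
# for illustration, not the study's actual segmentation logic.

def classify_shopper(uses_mobile, uses_social, prefers_retail_sites,
                     prefers_brand_sites, buys_offline, urgent):
    if uses_mobile and uses_social and not buys_offline:
        # Heavy digital users who avoid stores; urgency separates the two.
        return "Digitally Driven" if urgent else "Calculated Shopper"
    if prefers_retail_sites:
        return "Retail Scout"
    if prefers_brand_sites:
        return "Brand Scout"
    if not uses_mobile:
        return "External Shopper"
    return "Basic Digital Consumer"

print(classify_shopper(uses_mobile=True, uses_social=True,
                       prefers_retail_sites=False, prefers_brand_sites=False,
                       buys_offline=False, urgent=True))  # Digitally Driven
```

A real segmentation would, of course, be derived statistically from clickstream data rather than hand-written rules; the sketch only shows how the six labels partition behavior.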

If GroupMNext is correct and the Digitally Driven Segment is indeed going to be the dominant segment of consumers within the next five years, you can understand why it is important for companies to have a solid multi-channel strategy. Goodson's article was also accompanied by an excellent graphic that helps explain why depicting the digital path to purchase is so difficult. In the graphic below, note how many different steps (on average) are involved in purchase decisions by the various consumer segments and how they jump off and on the digital path to purchase.

[Graphic: Compete digital path-to-purchase chart]

Several years ago, McKinsey & Company analysts David Court, Dave Elzinga, Susan Mulder, and Ole Jørgen Vetvik wrote, "Marketing has always sought those moments, or touch points, when consumers are open to influence. For years, touch points have been understood through the metaphor of a 'funnel' — consumers start with a number of potential brands in mind (the wide end of the funnel), marketing is then directed at them as they methodically reduce that number and move through the funnel, and at the end they emerge with the one brand they chose to purchase." They went on to note, "Today, the funnel concept fails to capture all the touch points and key buying factors resulting from the explosion of product choices and digital channels, coupled with the emergence of an increasingly discerning, well-informed consumer." ["The consumer decision journey," McKinsey & Company, 1 June 2009] From their research, the McKinsey analysts concluded:

"The decision-making process is a more circular journey, with four primary phases representing potential battlegrounds where marketers can win or lose: initial consideration; active evaluation, or the process of researching potential purchases; closure, when consumers buy brands; and postpurchase, when consumers experience them (see below). The funnel metaphor does help a good deal — for example, by providing a way to understand the strength of a brand compared with its competitors at different stages, highlighting the bottlenecks that stall adoption, and making it possible to focus on different aspects of the marketing challenge. Nonetheless, we found that in three areas profound changes in the way consumers make buying decisions called for a new approach."

[Graphic: McKinsey digital path-to-purchase model]

Although this depiction of the consumer digital path to purchase seems to have gathered support, it doesn't capture either the nuances or complexities of the six types of consumers discussed by Goodson. If you click on the link to Goodson's article, you'll find an infographic that is much more complex. To be fair, the McKinsey analysts did go on to note that the landscape has dramatically shifted beneath the feet of marketers. They wrote:

"The second profound change is that outreach of consumers to marketers has become dramatically more important than marketers’ outreach to consumers. Marketing used to be driven by companies; 'pushed' on consumers through traditional advertising, direct marketing, sponsorships, and other channels. At each point in the funnel, as consumers whittled down their brand options, marketers would attempt to sway their decisions. This imprecise approach often failed to reach the right consumers at the right time. In today’s decision journey, consumer-driven marketing is increasingly important as customers seize control of the process and actively 'pull' information helpful to them. Our research found that two-thirds of the touch points during the active-evaluation phase involve consumer-driven marketing activities, such as Internet reviews and word-of-mouth recommendations from friends and family, as well as in-store interactions and recollections of past experiences. A third of the touch points involve company-driven marketing. Traditional marketing remains important, but the change in the way consumers make decisions means that marketers must move aggressively beyond purely push-style communication and learn to influence consumer-driven touch points, such as word-of-mouth and Internet information sites."

Two other McKinsey analysts, Peter Dahlström and David Edelman, agree with their colleagues that the business landscape is shifting and that consumers are becoming more empowered. "Digital marketing," they write, "is about to enter more challenging territory. Building on the vast increase in consumer power brought on by the digital age, marketing is headed toward being on demand — not just always 'on,' but also always relevant, responsive to the consumer’s desire for marketing that cuts through the noise with pinpoint delivery." ["The coming era of ‘on-demand’ marketing," McKinsey & Company, April 2013] They continue:

"The developments pushing marketing experiences even further include the growth of mobile connectivity, better-designed online spaces created with the powerful new HTML5 Web language, the activation of the Internet of Things in many devices through inexpensive communications tags and microtransmitters, and advances in handling 'big data.' Consumers may soon be able to search by image, voice, and gesture; automatically participate with others by taking pictures or making transactions; and discover new opportunities with devices that augment reality in their field of vision (think Google glasses). As these digital capabilities multiply, consumer demands will rise in four areas:

1. Now: Consumers will want to interact anywhere at any time.

2. Can I: They will want to do truly new things as disparate kinds of information (from financial accounts to data on physical activity) are deployed more effectively in ways that create value for them.

3. For me: They will expect all data stored about them to be targeted precisely to their needs or used to personalize what they experience.

4. Simply: They will expect all interactions to be easy.

"... One thing is clear: the consumer's experiences with brands and categories are set to become even more intense and defining. That matters profoundly because such experiences drive two-thirds of the decisions customers make, according to research by our colleagues; prices often drive the rest."

Ned Smith, BusinessNewsDaily Senior Writer, appears to agree with the results of proprietary research conducted in collaboration with Latitude that seems to merge McKinsey's circular approach with the behaviors exhibited by GroupMNext's six consumer segments. He writes, "The linear churn of the purchase funnel has been supplanted by the 'Purchase Loop,' a spider web of six behaviors that steer the path to purchase." ["Understanding the 'Purchase Loop': Why Consumers Buy," BusinessNewsDaily, 6 February 2013] A press release about the study states, "Key elements of the study found that: Shopping is more complex than simply identifying a need, exploring options and purchasing. Paths to purchase are more complex and less linear. Paths to purchase may require a greater number of 'stops' along the way but purchases happen more quickly. Consumers' relationships with brands are much more personal. Shopping today is less about brands and products themselves and more about the consumers' feelings and needs." ["Press Release – Purchase Loop," Latitude, 5 February 2013] The release goes on to describe the six behaviors along "The Purchase Loop." They are:

  • Openness – consumers are receptive to new or better experiences stemming from pre-existing interest in or curiosity about a category or topic area. Consciously or subconsciously, brands, products or services may be on the consumers’ radar.

  • Realized want or need – something acts as a catalyst giving the consumer a reason to start looking into things he/she wants or needs to do.

  • Learning and education – understanding the broad fundamentals in order to make a purchase the consumer can feel good about.

  • Seeking ideas and inspiration – looking for, noticing and keeping track of examples, thought-starters, and motivators in order to take the next step.

  • Research and vetting – comparing options, looking for deals, comparing prices, reading reviews and determining personal associations with the brand.

  • Post purchase evaluation and expansion – consumer uses or experiences a purchase and decides how he/she feels, might post reviews and share experience, can send the consumer into additional purchase loops if renewed openness to brand or inspiration to look into related products, tasks or needs.

According to the study, those behaviors connect inside the Purchase Loop and can interact with elements of the loop along the way (as depicted in the following graphic).

[Graphic: the Purchase Loop]

It should be apparent by now that understanding the consumer digital path to purchase is not a straightforward endeavor. There are nuances and complexities that make the digital path to purchase difficult both to understand and to depict. The most important thing for manufacturers and retailers to understand is that regardless of when or how consumers get on the digital path, they expect a good experience. Failure to provide that experience almost guarantees a "no sale."

June 26, 2013

Trekkies Rejoice! Food Replicators are Coming Soon

In the original Star Trek series, meals served to crew members in space came from food synthesizers that, in subsequent Star Trek series, morphed into replicators. The well-stocked space pantry consisted primarily of feedstock that could be combined into all sorts of different foods and taste combinations to suit the various humanoid species aboard the starship. Laurie Segall writes, "A 3D food printer sounds like something out of Star Trek, but it's not out of this world." In fact, when she wrote her article back in 2011, one was already "up and running at the French Culinary Institute in Manhattan" and had been in service since 2009. ["This 3D printer makes edible food," CNN Money, 24 January 2011] She also predicted, "In five years, it could be in your home." The 3D printer used by the Institute was developed at Cornell University by a group of scientists and students. "The device attaches to a computer," Segall wrote, "which works as the 'brain' behind the technology."

One of the comments in Segall's article that fascinated me most came from David Arnold, director of culinary technology at the French Culinary Institute. "One of the main things I hope this machine will let us do is create new textures that we couldn't get otherwise," he told Segall. "This is the first time I've really seen this happen." Normally, the first thing that comes to mind when one thinks of food is taste. But, as I noted in a previous post entitled Enjoying Food: Taste, and Our Other Senses, all of our senses play a role in how we perceive the taste of food. If a true replicator is going to be made, then the computer that controls it must contain the right sensory information. Texture is one of those characteristics.

If you think that this subject remains in the realm of science fiction, you'd be wrong. It was recently announced that "NASA and a Texas company are exploring the possibility of using a '3D printer' on deep space missions in a way where the 'D' would stand for dining." ["3D Printing: Food in Space," Space Travel, 27 May 2013] The first step being undertaken is a feasibility study. The article continues:

"Systems and Materials Research Consultancy will conduct a study for the development of a 3D printed food system for long duration space missions. Phase I SBIR proposals are very early stage concepts that may or may not mature into actual systems. This food printing technology may result in a phase II study, which still will be several years from being tested on an actual space flight."

Although there remains a debate about whether manned space exploration is necessary, at this point in time, NASA doesn't want to preclude any option. If the decision is made to launch a manned mission to Mars, for example, then feeding the astronauts on that long voyage becomes a real challenge. The article explains:

"NASA's Advanced Food Technology program is interested in developing methods that will provide food to meet safety, acceptability, variety, and nutritional stability requirements for long exploration missions, while using the least amount of spacecraft resources and crew time. The current food system wouldn't meet the nutritional needs and five-year shelf life required for a mission to Mars or other long duration missions. Because refrigeration and freezing require significant spacecraft resources, current NASA provisions consist solely of individually prepackaged shelf stable foods, processed with technologies that degrade the micronutrients in the foods. Additionally, the current space food is selected before astronauts ever leave the ground and crew members don't have the ability to personalize recipes or really prepare foods themselves. Over long duration missions, a variety of acceptable food is critical to ensure crew members continue to eat adequate amounts of food, and consequently, get the nutrients they need to maintain their health and performance."

As you can imagine, there are a number of challenges that must be met. What kind of materials will be used as feedstock for the 3D printer? How will they be stored? What combinations of feedstock can be used to create new recipes? How do you incorporate things like taste, aroma, and texture? The article concludes:

"NASA recognizes in-space and additive manufacturing offers the potential for new mission opportunities, whether 'printing' food, tools or entire spacecraft. Additive manufacturing offers opportunities to get the best fit, form and delivery systems of materials for deep space travel. This is why NASA is a leading partner in the president's National Network for Manufacturing Innovation and the Advanced Manufacturing Initiative. 3D printing is just one of the many transformation technologies that NASA is investing in to create the new knowledge and capabilities needed to enable future space missions while benefiting life here on Earth."

Getting back to "life here on earth," Jeffrey Lipton, a PhD candidate at Cornell, and Hod Lipson, a professor at Cornell's Creative Machines Lab, provide an update to Segall's story about their 3D food printer project. ["Adventures in Printing Food," IEEE Spectrum, 31 May 2013] They note that the project actually began back in 2005 as the Fab@Home project. The following video, which accompanied Segall's story, shows the machine in action at the French Culinary Institute and contains a brief interview with Lipton.

Lipton and Lipson report that one of the first individuals to use their 3D food printer outside of a research lab was a high school student named Noy Schaal. She built one of the machines, adapted it to print chocolate delights, and won first prize at a local science fair after she "printed chocolate letters, textured bars, and other shapes directly from a computer-aided-design (CAD) model and then handed them to the judges." Among the early foods that were experimented with (in addition to chocolate and the ingredients discussed in the video) are hummus, peanut butter, Brie and apricot confiture, and Cheez Whiz. Lipton and Lipson continue:

"While a paste-based diet may have sufficed for the early astronauts, it's too limited for most people. For digital cooking to really catch on, we concluded, the printers needed to accommodate a larger range of recipes, ingredients, and cooking temperatures. Getting the printers to operate at the right temperatures for different types of food is not easy. Food, unlike plastic, can change dramatically over a relatively short period of time: A batch of frosting made in the morning may work fine at one temperature, but the same batch later in the day may not. Now consider the huge array of possible ingredients and the different settings that each would need, and you can see why creating a truly useful home food printer seemed at first impossible."

And, if you think that an earthbound printer seemed impossible, you can see why a space-based one was unthinkable. Lipton and Lipson report, however, that a Cornell University graduate student named Daniel Cohen had an idea. They continue:

"What was needed, he thought, was the equivalent of an RGB standard for food. RGB stands for red, green, and blue, the basic color elements used in televisions to reproduce a rainbow of colors; a similar set of basic colors—cyan, magenta, and yellow—are used in inkjet printers. Cohen's idea was to create a similarly standard set of elements for the food printer that would make it simpler to produce a variety of foods—and also allow you to share your designs, so that you could 'send' a piece of cake to your uncle's printer."

The problem, of course, was finding the food equivalent of RGB. Remarkably, Lipton and Lipson report they "didn't have to look far." They explain:

"A huge industry already exists to devise food flavors and colors that can make just about anything look and taste like something else. Supplements like vitamins, minerals, and fibers are also widely available. The only problem, then, was getting the right texture. For that we turned to hydrocolloids—materials like carrageenan, xanthan gum, and gum arabic—that today appear on many food labels. They're the thickeners in McDonald’s milkshakes, for instance. We brought in other gelling agents like those used in Jell-O desserts. We were already familiar with some of these substances, having used them to help print living cells. This time, we mixed the gels and gumming agents with other ingredients and then put them through our printer to create edible constructs like cubes of milk, raspberry domes, and mushroom-shaped bananas."
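Cohen's analogy can be sketched in code: just as an RGB triple specifies a color independently of any particular screen, a normalized vector over a fixed set of base "cartridges" could specify a printable food independently of any particular printer. The cartridge names and the cake proportions below are assumptions invented for illustration:

```python
# Sketch of an "RGB for food": a recipe is a normalized mix over a fixed,
# standard set of base cartridges, so it can be shared between printers
# the way a color code is shared between displays.
BASE_CARTRIDGES = ["protein", "carbohydrate", "fat", "hydrocolloid"]  # assumed set

def make_recipe(**parts):
    """Normalize cartridge proportions so they sum to 1.0."""
    total = sum(parts.get(c, 0.0) for c in BASE_CARTRIDGES)
    return {c: parts.get(c, 0.0) / total for c in BASE_CARTRIDGES}

# A shareable "food color code" for a hypothetical printable cake batter —
# this dict is what you would "send" to your uncle's printer.
cake = make_recipe(protein=1, carbohydrate=6, fat=2, hydrocolloid=1)
print(cake["carbohydrate"])  # 0.6
```

The point of the standardization is exactly the point of RGB: any printer stocking the same base cartridges can reproduce the recipe from the numbers alone.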

Lipton and Lipson admit that some of their concoctions were "a little too weird" to be accepted by the masses. But the concept was proven. The exciting thing about this technology is that it opens up new possibilities for feeding the world. Lipton and Lipson explain:

"Researchers at TNO (the Netherlands Organisation for Applied Scientific Research), are extracting basic carbohydrates, proteins, and nutrients from algae, insects, and the like and then using them to print something resembling steak and chicken. Eventually, this may allow them to print a filet mignon from a protein that requires far less water, energy, and labor than does a cow. TNO isn't the only place exploring this realm. Susana Soares at London South Bank University has used a flour made from crushed bugs to print edible objects that look like butterfly wings and honeycombs. While this approach could someday solve the Malthusian concerns of food production, it's a hard idea to swallow. The trend these days is to back away from highly processed foods."

While the idea may be hard to swallow today, attitudes change with circumstances. There have been a number of stories in the news as of late about the food qualities of insects. Making them palatable for most people is the challenge. Food printing may prove to be part of the answer. Currently, however, Lipton and Lipson are partnering with world-class chefs to take existing foods and mold them into interesting shapes using their printer. They report, "Using the printer to creatively customize food shapes, we discovered, is a lot more appealing than crafting milk cubes out of hydrocolloids." They have also been able to create cookies that have a written message inside. They conclude:

"Digital cooking is still a nascent field, but we're amazed at how much progress has already been made: From those humble peanut butter, hummus, and chocolate objects, it has already morphed into a movement that could someday transform how we prepare and consume food. While some people believe the future of printed food will begin at the chemical level, others think it will become a common tool to augment the molds, knives, and ovens we already have. Regardless, both camps agree that the information age's transformations have started making kitchen magic."

Part of what will make the kitchen magic will be research into how all the senses come into play in the enjoyment of food. As this data is fed into computers and analyzed, we won't be too far away from creating the food synthesizer that Gene Roddenberry conceived when he sat down to write his first Star Trek script.

June 25, 2013

Cognitive Computing

There seems to be some confusion about exactly what the term "cognitive computing" means. S.E. Smith writes, "Cognitive computing refers to the development of computer systems modeled after the human brain. Originally referred to as artificial intelligence, researchers began to use the term cognitive computing instead in the 1990s, to indicate that the science was designed to teach computers to think like a human mind, rather than developing an artificial system." ["What is Cognitive Computing?" wiseGEEK, 25 February 2013] On the other hand, Forrester analyst John Brand writes, "The term 'cognitive computing' emerged in response to the failings of what was once termed 'artificial intelligence'." ["Make No Mistake - IBM’s Watson (and Others) Provide the *Illusion* of Cognitive Computing," John Brand's Blog, 21 May 2013]

The adjective cognitive has two pertinent dictionary definitions. First, it pertains "to the act or process of knowing, perceiving, remembering, etc." Second, it pertains "to the mental processes of perception, memory, judgment, and reasoning, as contrasted with emotional and volitional processes." When it comes to cognitive computing, both definitions seem to apply. To my mind, cognitive computing involves processing data in such a way that the system reasons about it, remembers it, and makes judgments (i.e., decisions) based on what it learns. All cognitive computing systems are learning systems. Whether the computer exactly mirrors human thought processes is irrelevant. Frankly, we don't know enough about human thought processes to mimic them accurately. What researchers are finding is that computers can learn a lot given enough data and the right algorithms (see my post entitled Intelligence from Chaos).

Most analysts seem to agree that cognitive computing is a step forward into a new era. An article on cognitive computing, written by analysts at IBM Research, asserts that it's necessary to change computing paradigms in order to progress. ["Cognitive systems: A new era of computing"] It states:

"Over the past few decades, Moore's Law, processor speed and hardware scalability have been the driving factors enabling IT innovation and improved systems performance. But the von Neumann architecture—which established the basic structure for the way components of a computing system interact—has remained largely unchanged since the 1940s."
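The defining feature of the von Neumann architecture is a single memory that holds both instructions and data, stepped through by a fetch-decode-execute loop. A minimal sketch, with a tiny instruction set invented purely for illustration:

```python
# Minimal von Neumann-style machine: one memory array holds both the
# program and the data; a fetch-decode-execute loop drives everything.
def run(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program in cells 0-3, data in cells 4-6: compute memory[6] = 2 + 3.
mem = [("LOAD", 4), ("ADD", 5), ("STORE", 6), ("HALT", None), 2, 3, 0]
print(run(mem)[6])  # 5
```

Because program and data share one memory and one bus, every cycle funnels through that single channel — the "von Neumann bottleneck" that data-centric architectures try to route around by moving processing closer to the data.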

If you are not familiar with the von Neumann architecture, you can watch the following short video on the subject. There are a few spelling errors in the presentation, but it is packed with information and is cleverly presented.

Given the rise of big data, the IBM article insists that the von Neumann architecture is "no longer good enough." It explains:

"We now are entering the Cognitive Systems Era, in which a new generation of computing systems is emerging with embedded data analytics, automated management and data-centric architectures in which the storage, memory, switching and processing are moving ever closer to the data. Whereas in today's programmable era, computers essentially process a series of 'if then what' equations, cognitive systems learn, adapt, and ultimately hypothesize and suggest answers. Delivering these capabilities will require a fundamental shift in the way computing progress has been achieved for decades."

The article goes on to describe several characteristics of cognitive computing systems, including that they are data-centric (use big data) and designed for statistical analytics. Bernie Meyerson, IBM's vice president of innovation, told Brian Deagan, "Cognitive computing is a completely different approach to drive performance in computers." ["IBM Predicts Cognitive Systems As New Computing Wave," 23 January 2013] He continued:

"These machines will perform better because they learn, they adapt, they sense — and by doing that you don't program it so much as you can teach the system to learn. That is incredibly efficient, compared to what you can do today, where you literally type in millions of lines of code to get the machine to do what you want. This is a machine that can observe and follow."

Deagan asked Meyerson if IBM's Watson was a cognitive computing system. Meyerson answered, "Yes. Watson is the embodiment of cognitive computing. For example, it can be taught not only to recognize the right and probable answer to a medical diagnostic issue such as a cancer, but it can also learn from uncertain data, even if you have conflicting data. Watson, because it is probabilistic, might not know the exact answer, but if the odds favor or point to one answer it will assign a high probability to the correct answer." Forrester's Brand disagrees with Meyerson, writing, "Let's get real. Despite the fact that 'Watson' was trained to successfully win a game show (Jeopardy), IBM's technology (and others to be fair) are not cognitive computing systems at all. That's not to say they aren't valuable - just that we shouldn't overstate their value or capabilities."
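Meyerson's description of Watson weighing conflicting evidence and assigning a high probability to the favored answer can be illustrated with a toy ranking routine. This is not Watson's actual algorithm; the evidence scores and diagnosis labels below are made-up assumptions.

```python
from collections import defaultdict

def rank_answers(evidence):
    """Toy probabilistic ranking: each piece of evidence votes for a
    candidate answer with a confidence score; scores are summed per
    answer and normalized, so conflicting evidence yields odds rather
    than a single certain answer."""
    totals = defaultdict(float)
    for answer, score in evidence:
        totals[answer] += score
    grand = sum(totals.values())
    return sorted(((a, s / grand) for a, s in totals.items()),
                  key=lambda pair: pair[1], reverse=True)

# Conflicting evidence: two sources point at diagnosis A, one at B.
evidence = [("diagnosis A", 0.8), ("diagnosis B", 0.5), ("diagnosis A", 0.4)]
for answer, prob in rank_answers(evidence):
    print(f"{answer}: {prob:.2f}")
# diagnosis A: 0.71
# diagnosis B: 0.29
```

As Meyerson notes, the system does not "know" the exact answer; it simply reports which candidate the accumulated evidence favors, and by how much.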

Brand's real quibble is that systems like Watson don't really match human intelligence. Since he admits they are nevertheless valuable, does it really matter? What matters is that cognitive computer systems can learn (either through discovery or by being taught). Meyerson told Deagan, "With cognitive computing, it's about providing the computer a richer set of data to make decisions. The idea behind cognitive machines is that you don't program them, you teach them." Cognitive computing systems are far enough along that Shweta Dubey believes they could be the next disruptive technology. ["Is Cognitive Computing the Next Disruptive Technology?" The Motley Fool, 7 January 2013] She writes:

"Cognitive computing will be a larger part of the future as an emerging field in which computers can operate more like a human brain. The computers will go beyond performing static operations and will begin using the five senses, just like a human brain."

Bertrand Duperrin, a Consulting Director at Nextmodernity, agrees with Dubey. "There are many chances," he writes, "the next wave will be about cognitive computing." ["Towards cognitive computing," Bertrand Duperrin's Notepad, 21 May 2013] He continues:

"A new era is starting, where small and big decisions will [be] made, at any level, by people having a comprehensive view of their environment and of the way it moves. New ways of doing things that will be supported by new platforms and a new approach to computing: Cognitive Computing. ... If Big Data is about mass data processing, it’s only one side of Cognitive Computing which is also about data analysis and tagging, pattern discovery and the ability the system has to learn from [its] own experience. Cognitive Computing also [has a] human side. ... People will still be key for interpretation -- provided the information they're given is of quality, well targeted and they have the required knowledge and training. Cognitive Computing is what will help to move [organizations] from [having] data to [having] information. ... If computing is a matter of data, social computing a matter of people, cognitive computing is more a convergence than a next step: the need of using people and data together."

The most exciting potential of cognitive computing is using vast databases to discover new relationships. The Technical Committee on Cognitive Computing of the Systems, Man & Cybernetics Society asserts:

"Cognitive Computing breaks the traditional boundary between neuroscience and computer science, and paves the way for machines that will have reasoning abilities analogous to a human brain. It is an interdisciplinary research and application field, and uses methods from psychology, biology, signal processing, physics, information theory, mathematics, and statistics. The development of Cognitive Computing will cross fertilize these other research areas with which it interacts."

My company, Enterra Solutions, is focused on the development and application of an advanced artificial intelligence system, the Enterra Cognitive Reasoning Platform™ (CRP), that analyzes both structured and unstructured data using ontologies and mathematical algorithms. The CRP is capable of addressing various commercial markets and disciplines using a generalized framework, yet it is designed so that it can be tailored to handle the disparate data sources and specific challenges found in individual industries and in different functional areas. Since most major universities have scientists conducting research in this area, I expect the field of cognitive computing will mature rapidly. That's good news for all of us.

June 21, 2013

Innovation: Tinkers, Tailors, Soldiers, and Spies

The great thing about innovators is that they come in all shapes and sizes. Some pundits believe great innovators share certain traits or habits, but beyond that, trying to describe what makes a great innovator is nearly impossible. In fact, most creativity gurus don't believe in the "lone genius" innovator. "The truth is," writes Rob Cross, an associate professor at the University of Virginia's McIntire School of Commerce, Andrew Hargadon, a professor in the Graduate School of Management at the University of California, Davis, Salvatore Parise, an assistant professor in the Technology, Operations, and Information Management division at Babson College, and Robert J. Thomas, executive director of Accenture's Institute for High Performance Business, "most innovations are created through networks -- groups of people working in concert." ["Together We Innovate," Wall Street Journal, 19 June 2013] Group settings involve a number of different types of personalities and approaches.

Having said that, Dr. Tomas Chamorro-Premuzic, a Professor of Business Psychology at University College London (UCL), believes that true innovators share "five rarely discussed habits." He agrees with creativity coaches that anyone can come up with good ideas; however, he also believes that "most of us tend to believe we are more creative than we actually are." ["If You Have These Five Habits You Are Probably an Innovator," Psychology Today, 23 April 2013] He explains:

"On one hand, we all have the potential to do innovative things. On the other, some people are much more likely to innovate than others - and this depends more on their personality, attitudes and values than the educational, corporate, or even cultural policies or norms."

The five habits that Chamorro-Premuzic believes differentiate innovators from others are:

1) You are an evening rather than a morning type

2) You can multi-task

3) You often have weird and embarrassing thoughts

4) When you work on something you enjoy, you completely lose track of time and immerse yourself in the task

5) You dislike rules, norms and the status-quo

In his article, he explains why each of those habits can make a person more innovative. He also invites readers to take part in a study he is conducting, and he promises to give you instant feedback on "your potential for innovation." A word of caution -- if you find yourself skulking around the city at night having weird and embarrassing thoughts and breaking the rules, the police might not think you're being innovative!

Roger von Oech claims that innovation teams must embrace four roles: explorer, artist, judge, and warrior. ["The 4 Roles of Creativity: Explorer, Artist, Judge, Warrior," 99u, 29 May 2013] As you will see from the descriptions below, it is almost impossible to find all of these traits in a single individual. Working together, however, these four personality types can be quite innovative.


"The Explorer's job is to collect the raw material for creativity. He is constantly asking questions, talking to different people, and processing as many inputs as possible."

Commenting on a Booz & Company study about corporate R&D spending, Gijs van Wulfen concludes that it is critical for companies to have explorers. Booz & Company calls them "Need Seekers" because they find out what customers really need. Van Wulfen writes, "Need seeking is essential, because a good innovation is a simple solution to a relevant customer need." ["The best innovators are need seekers," LinkedIn, 12 November 2013]


"The Artist takes the raw material from the Explorer and combines it in new and interesting ways. He's playful and imaginative with no concerns about judging the quality of what he's creating."

When it comes to innovation, the artists that are most often involved are designers. To give you a sense of how important designers are in the innovation process, read a post I wrote back in 2007 entitled The Medici Effect and New Design. The only quibble I have with von Oech about the description of artists is that designers ARE concerned about the quality of what they create. Good designers create objects that are both beautiful and functional. Steve Jobs once said, "Technology alone is not enough. It's technology married with liberal arts that yields us the result that makes our hearts sing."


"The Judge takes the Artist's ideas and determines if they're practical. He thinks critically and realistically about what can actually be done."

Judges are often also referred to as "Gatekeepers." Gatekeepers are important because they are able to kill bad ideas before bad ideas kill the company. Cross, Hargadon, Parise, and Thomas note, "Very often, these gatekeepers hold their esteemed position for good reason -- they have technical expertise or other skills that have served the company well." But they caution that not all gatekeepers are good gatekeepers. "They may not be the best judges of new ideas, and their expertise in one area may in fact blind them to innovations in other areas."


"The Warrior takes an idea the Judge has determined worthy and tenaciously follows it to completion. The Warrior's job is to overcome resistance, be courageous, and ship the idea."

Warriors are often referred to as innovation champions. For more on that subject, read my post entitled Making Everyone an Innovation Champion. As I wrote in that post, "Supporting innovation champions with more than words and back slaps is important. 'In order for innovation to flourish in your organization,' writes Chuck Ferry, 'your innovation champions must be supported through properly structured responsibilities, goals and resources. Otherwise, they will leave to pursue other opportunities, taking their energy and ideas with them.' ["Cultivating Innovation Champions," Innovation, 30 January 2013]"

Mike Lehr sees a much simpler view of personality types associated with innovation. "In the development of ideas," he writes, "we generally see two types of people: creators and pruners. While people often display both types, usually one is dominant." ["Creators vs. Pruners: Personality Typing," Influencing and Problem Solving, 28 January 2013] He continues:

"Creators birth ideas or develop existing ones further by adding onto them. Pruners take ideas and modify them to fit a situation. Whereas ideation tends to be a growing process with creators (and potentially infinite), it tends to be a cutting back one with pruners (and thus finite). In the extreme, creators never complete ideas because of constantly 'perfecting' them while pruners will reduce ideas until they're nothing or provide no value."

Lehr may be a bit too harsh in his description of pruners. They are not exactly gatekeepers, but they do serve a reality-check function. A good gardener doesn't prune his trees in order to kill them, but to make them flourish. You could probably apply the Pareto principle (also known as the 80–20 rule) when defining these two personality types. Creators come up with 100 percent of the ideas, but only 20 percent of them are any good. It's the job of pruners to find that 20 percent. Lehr recognizes that any number of labels can be applied to these personality types. He explains:

"Business contains many examples formalizing these functions. We have 'writers' and 'editors.' The first creates, the second prunes. Manufacturers create products and retailers select (prune) what they will offer. In media, we have content creators such as newspapers, movie production companies, television producers, etc. and content aggregators such as search engines, cable companies, booksellers, movie renters, etc. who choose (prune) the content they will offer."

Lehr's framework certainly provides food for thought, but I believe that von Oech's framework offers more insight into the personality types needed to foster innovation. Von Oech's four creativity types can be expanded upon -- for example, some creativity gurus believe you need to add a "Joker" or "Fool" to the mix. Like a good comedian, the joker sees absurdities in what most of us see as normal. Niels Bohr once remarked, "We all know your idea is crazy. The question is whether it is crazy enough?" Who knows, some consultant is likely to come up with reasons why every innovation team also needs a tinker, a tailor, a soldier, and a spy.

June 20, 2013

Smart Cities and Traffic

One simply can't discuss smart city initiatives without including the topic of traffic -- that is, how things move within the confines of an urban environment. Perhaps the most talked-about challenge is how motorized traffic can move most efficiently and effectively through crowded streets. In some planned cities, like Masdar in the UAE, traditional automobiles are simply not allowed. Cars must be parked and specially designed urban vehicles must be utilized. Some pundits dream of a day when most cities restrict traffic to a limited number of routes so that people can cross streets anywhere in safety and children are allowed to play stickball without fear of being run over. Neal Peirce reports that one such dreamer is Enrique Peñalosa, former mayor of Bogota, Colombia, who proposed "a radical 'green' and 'safe' prescription for building and rebuilding streets, both city and suburban" in an article for the March-April edition of Urban Land magazine. ["Greener Cities Will Push Cars To The Curb," Hartford Courant, 15 May 2013] Peirce continues:

"Peñalosa proposes a radical remedy: cities with generous numbers of auto-free streets and with greenways reserved for pedestrians and bicycles. He challenges us to imagine a Manhattan — or another city — 'where alternate streets and avenues are reserved for use by pedestrians and bicycles, with a few of those streets, green with trees, allowing trams or buses on narrow busways.' The result would be a network of pathways free from competition with autos and trucks except at every-other-street intersections. It would constitute a return, in major aspects, to the safer pedestrian walkways and life of the pre-auto era."

Frankly, such radical plans aren't likely to be implemented in most existing cities. They would simply create too many new challenges (e.g., garbage collection, deliveries, parking, etc.). Since city fathers don't have the luxury of planning from the ground up, they must deal with things as they have evolved. That is why Carlo Ratti and Anthony Townsend are interested in developing what they call "Truly Smart Real Cities." ["Harnessing Residents' Electronic Devices Will Yield Truly Smart Cities," Scientific American, 17 April 2011] They understand that most "real" cities will have a difficult time finding a way to pay for new infrastructure embedded with sensors; so, as the title of their article suggests, they recommend harnessing the power of the mobile devices already being carried by residents. Google is already taking advantage of these devices with the help of volunteers. Ratti and Townsend explain:

"Rather than focusing on the installation and control of network hardware, city governments, technology companies and their urban-planning advisers can exploit a more ground-up approach to creating even smarter cities in which people become the agents of change. With proper technical-support structures, the populace can tackle problems such as energy use, traffic congestion, health care and education more effectively than centralized dictates. And residents of wired cities can use their distributed intelligence to fashion new community activities, as well as a new kind of citizen activism. ... An ideal beginning is to leverage the growing array of smart personal devices we all wield and recruit people as the sensors of a city rather than relying only on formal systems embedded into infrastructure. The traffic function on Google Maps is a good example. Instead of building a costly network of dedicated vehicle sensors along roadways, Google constantly polls a large network of anonymous volunteers whose mobile devices report their up-to-the-minute status, which reveals where traffic is flowing, slowed or stopped. The information is delivered to drivers via mobile mapping applications in various ways — as colored overlays indicating traffic speeds, as estimated driving times that account for delays or as a factor in determining alternative routes."
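The crowdsourced traffic model Ratti and Townsend describe -- polling anonymous mobile devices for their speed and rendering the result as colored overlays -- can be sketched in a few lines. The segment names, speeds, and thresholds below are illustrative assumptions, not Google's actual pipeline.

```python
from statistics import mean

def traffic_status(reports, free_flow_speed=60):
    """Crowdsourced congestion sketch: average anonymous speed reports
    (mph) per road segment and bucket each segment into the colored
    overlay categories the article describes. Thresholds are invented."""
    by_segment = {}
    for segment, speed in reports:
        by_segment.setdefault(segment, []).append(speed)
    status = {}
    for segment, speeds in by_segment.items():
        ratio = mean(speeds) / free_flow_speed  # fraction of free-flow speed
        status[segment] = ("green" if ratio > 0.7 else
                           "yellow" if ratio > 0.4 else "red")
    return status

# Hypothetical anonymous reports: (road segment, observed speed in mph)
reports = [("I-405 N", 12), ("I-405 N", 18), ("US-101 S", 55), ("US-101 S", 48)]
print(traffic_status(reports))  # {'I-405 N': 'red', 'US-101 S': 'green'}
```

The appeal of the approach is visible even in this toy version: no roadside hardware is required, because the vehicles themselves supply the measurements.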

The primary concern with this approach involves safety issues surrounding drivers getting distracted while responding to polling queries in the midst of congested traffic. And, in many cities, that's the norm. There are lots of good reasons for wanting to get a handle on urban traffic problems. One of those reasons is money. According to a report by Texas A&M University, traffic congestion "costs the U.S. $121.2 billion per year." ["Big Data Has ‘Big Impact’ in Beating Traffic," by Steve Rosenbush, Wall Street Journal, 27 March 2013] "At the micro level," Rosenbush explains, "commuting time and jammed highways can have an impact on whether a particular house gets sold or if a shipping company can make money on a given trip."

Another reason for promoting better traffic flow is reduced stress. Drivers on the move are much happier than those stuck in traffic. When one thinks about congested roads in the United States, the Los Angeles area is generally pretty high on the list. As Mike Wheatley reports, "Los Angeles' reputation for being an absolute nightmare for drivers is well-deserved." ["Big Data Traffic Jam: Smarter Lights, Happy Drivers," SiliconANGLE, 3 April 2013] Wheatley reports, however, that Los Angeles has been working hard to improve traffic flow through the city.

"A new traffic control system might just be able to ease some of their suffering. Almost three decades in the making, the city's newly completed Automated Traffic Surveillance and Control System will synchronize all 4,500 traffic lights in the metropolis, with the goal of not just reducing drive times, but cutting down on pollution as well. ... The project is totally unique with regards to its size and its scope, and is reputed to have cost more than $400 million to implement. The New York Times gives us a breakdown of how it all works:

'Magnetic sensors in the road at every intersection send real-time updates about the traffic flow through fiber-optic cables to a bunker beneath downtown Los Angeles, where Edward Yu runs the network. The computer system, which runs software the city itself developed, analyzes the data and automatically makes second-by-second adjustments, adapting to changing conditions and using a trove of past data to predict where traffic could snarl, all without human involvement.'

"The system is intelligent in that it can automatically adjust the time delay between light changes whenever issues arise."
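The core idea behind adaptive signal timing of the kind the quote describes -- adjusting light durations to sensor-reported demand -- can be sketched simply. This is not Los Angeles' actual control software; the cycle length, minimum green time, and queue counts are invented for illustration.

```python
def split_green_time(queues, cycle=90, min_green=10):
    """Adaptive-signal sketch: divide a fixed signal cycle's green time
    among approaches in proportion to sensor-reported queue lengths,
    with a floor so no approach is ever starved. Numbers are invented."""
    total = sum(queues.values())
    budget = cycle - min_green * len(queues)  # seconds left after floors
    return {approach: round(min_green + budget * q / total)
            for approach, q in queues.items()}

# Hypothetical magnetic-loop counts: vehicles waiting per approach
queues = {"north-south": 24, "east-west": 6}
print(split_green_time(queues))  # {'north-south': 66, 'east-west': 24}
```

Recomputing this split every cycle from fresh sensor counts is, in miniature, the "second-by-second adjustments" the New York Times passage attributes to the LA system.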

Getting traffic to move more intelligently through urban streets obviously relies heavily on the availability of big data, analytics, and artificial intelligence. "Synchronized traffic lights aren't the only thing that's in store for cities of the future," Wheatley writes. "What with the number of cars on our roads rapidly piling up in cities across the world, tech companies like IBM are designing numerous solutions to cut back on congestion and pollution." As an example of what IBM is working on, Wheatley pointed to the following video about IBM's efforts in Istanbul.

Another company involved with the big data analytics effort regarding traffic is INRIX Inc. It was founded in 2004 by former Microsoft employees. According to Steve Rosenbush, INRIX analyzes "10 billion data points a day." Shira Ovide reports, "Start-ups including Waze Inc. and Uber Technologies Inc. and such tech giants as Google Inc. and Apple Inc. are also applying big data to transportation issues." ["Tapping 'Big Data' to Fill Potholes," Wall Street Journal, 12 June 2012] As a side note, Waze was just acquired by Google for over a billion dollars. Ovide continues:

"For many cities and cash-strapped agencies, officials say, the proliferation of data technologies can help them cut costs and make smarter choices about transportation projects, direct people to empty parking spots and otherwise ease annoyances for citizens. In Boston, for instance, the city is planning to launch Street Bump, a mobile-phone app that can identify potholes as people drive on city roads. The app detects minute changes in a phone's accelerometer—the same technology used to shift the orientation of a smartphone screen when it's tilted sideways. Chris Osgood, co-chair of city hall's Office of New Urban Mechanics, says Boston also hopes to use Street Bump technology to figure out which roadways are most in need of repaving. Today, municipalities often make such decisions based on cumbersome surveys that involve engineers in pickup trucks dragging chains behind them and measuring the vibrations of the metal."
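The Street Bump detection idea in the quote -- using a phone's accelerometer to flag jolts as pothole candidates -- can be sketched as a simple threshold filter. This is an assumption-laden toy, not the app's real signal processing; the threshold and sample readings are invented.

```python
def find_bumps(samples, threshold=3.0):
    """Street Bump-style sketch: flag GPS-tagged accelerometer readings
    whose vertical jolt exceeds a threshold as pothole candidates.
    Threshold and data are illustrative, not from the actual app."""
    return [(lat, lon) for lat, lon, z_accel in samples
            if abs(z_accel) > threshold]

# Hypothetical samples: (latitude, longitude, vertical jolt in m/s^2)
samples = [(42.36, -71.06, 0.4), (42.37, -71.05, 4.2), (42.38, -71.04, 0.9)]
print(find_bumps(samples))  # [(42.37, -71.05)]
```

A real system would also have to filter out false positives such as speed bumps and train tracks, which is why aggregating reports from many drivers matters: a pothole jolts everyone at the same coordinates.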

Wheatley worries that such efforts might be too successful. He explains:

"One danger lies in the prospect that – if we can leverage technology to reduce congestion – people might suddenly decide that the roads aren’t so bad and start buying even more cars to clog them up with, creating a vicious circle if you will. Ultimately, the technology will only help us so much, but it's nice to know that we seem to be on the right track."

Although this post primarily deals with automobile traffic, there are obviously a number of people thinking about other ways of moving people and things about in the urban environment. I focused on automobile traffic, however, because some pundits believe that the automobile industry is about to experience its "Kodak Moment" after which everything is going to change. The following video provides a glimpse into that discussion.

More of the conversation can be viewed by clicking on this link. In future posts about smart city initiatives, I hope to discuss more about how people will move through the urban environment without having to rely on personal automobiles. Automobiles are not going to disappear; but, even the automobile industry understands that the business landscape is rapidly shifting beneath its feet. Although some pundits believe the end is in sight for the industry, I believe the automotive industry's eyes have been opened and we should continue to see some exciting ideas emerge from the industry itself about how to best address how people will be transported in the future. The industry will increasingly think of itself as being in the people and goods transport business rather than in the automobile business.

June 19, 2013

The Digital Path to Purchase, Part 2

In Part 1 of this 2-part series on the digital path to purchase (DP2P), I explored why some analysts believe that digital technologies have forever altered the business landscape. Kevin Glacken believes, "Society is rapidly approaching a complete digital state." As a result, he writes, "There’s never been a better time for marketers to see, understand and respond to the customer journey." ["Mapping the Customer Journey with Social Intelligence," Social Media Today, 25 May 2013] He explains:

"Today’s 'open social age' has created a 'looking glass' for companies to understand the complex nature of their customers as millions of them are broadcasting their opinions, attitudes, behaviors, experiences and even unmet needs on a real-time basis. ... New technologies have transformed the way people work, learn, communicate and share. And consumers freely share their opinions and experiences on social networks, blogs, micro-blogs, message boards, forums, mainstream news sites and a variety of other online platforms. According to Dimensional Research, over 90 percent of these consumers rely on independent reviews of products and services before they make a purchasing decision."

The "looking glass" through which companies view their customers is constructed from big data and analytics. Glacken asserts, "While many within the marketing realm point to the customer journey fragmenting between the offline and online worlds, the mobile revolution is actually pulling these worlds more closely together than ever before." Although that may sound counterintuitive, Glacken states the reason for this convergence is that consumers are "no longer ... tethered to their computers." He continues:

"According to a Morgan Stanley study, 91 percent of mobile users keep their device within three feet of them 24-hours per day. Mobile devices are now widely a standardization of life; people watch TV with them, make medical decisions with them and certainly shop with them. This 'always on' state that mobile online accessibility facilitates is particularly transforming the way we all undertake traditional offline shopping activities, from conducting product comparisons and identifying best prices to locating coupons or offers and getting opinions and reviews. This, along with the all-time accessibility of mobile devices has made the wealth of information ubiquitous in the 'offline' world."

In this post, I want to examine why mobile technologies are receiving the kind of attention Glacken discusses as well as look more closely at which consumers are likely to embrace DP2P mobile technologies. Although mobile technologies (e.g., tablets and smartphones) are often lumped together, Monica Ho asserts, "User behavior by device differs greatly – from experience to expectation." ["Smartphone vs. Tablet Commerce: 3 Essential Behaviors You May Be Overlooking," Search Engine Watch, 22 May 2013] She explains:

"With evolving features and functionality, as well as situational needs to consider, there are many unique differences in user behavior by device that can provide marketers with insights on not only the when and where – but how consumers use smartphone and tablet devices for their specific research and purchase-related needs."

Ho reports that even though PCs account for two-thirds "of all time spent in digital media, ... consumers are starting to rely heavily on their mobile devices, utilizing either smartphone or tablet based on distinct needs." Like the analysts discussed in Part 1 of this series, Glacken asserts that the digital world "creates a myriad of paths customers can take towards their purchase." As a result, consumers are becoming more empowered and retailers are finding the business landscape more complex. "The key to driving success," Glacken explains, "is for marketers to deeply understand consumers, continually map the journey and leverage this insight to drive messaging, education, promotions and innovation to align with the customers’ needs (met and unmet) and wants." He continues:

"The basis of understanding the customer journey today is mapping it with advanced social intelligence. To adequately accomplish this it must be derived from the 'big data' mining of millions of daily consumer social conversations. Simple in concept, but challenging in technical execution. This is why so many leading brands are turning to advanced, streaming 'big data' solutions to deliver deep market and customer insights on rapid basis. The ability to analyze millions of customers and prospects allows for deep insights and the construction of key strategic journey components, which enables the organization to strategically drive decisions and innovation."

Although Glacken's emphasis on mobile technology is understandable, Ho points out, "Just 22 percent of mobile users complete a purchase directly via their smartphone or tablet, and therefore the full impact of mobile cannot be measured without tracking conversion activities beyond the mobile device." According to Ho, "26 percent of research and decision activity conducted in tablet will drive sales online via PC, while that number for smartphone is just 9 percent. This revelation affects everything from targeting strategy to mobile ad creative." That means that the vast majority of purchases are still made in person and that realization has some profound implications for marketers and retailers. For one thing, smartphones are more likely to be used in a hybrid path to purchase (i.e., digital/in person) than a tablet. Ho explains:

"Consumers are rarely without their smartphones, constantly relying on them to access information while on the go, while tablet usage is typically done at home or work. This difference in typical usage location leads to a different type of user intent related to consumer expectation of distance. ... Nearly 50 percent of smartphone users leverage their device to look up a business locale or directions, and 20 percent look for a business phone number. While, tablet users typically utilize their devices for research or information, generally unrelated to location, including price comparisons, coupons, and reviews. But this doesn't mean that location is irrelevant to the activity conducted by tablet users. One-third of tablet users do indeed look for information related to their location – be it local events, area restaurants, or even a business phone number. While the immediacy of physical location may not be as important to a tablet user, location is an essential contextual element that should drive relevancy in everything from ad creative, to targeting strategy."

Ho concludes, "Your mobile consumers are essentially telling you what they want and how they want it, and by identifying mobile user activity related to device during the length of their path to purchase, you can better engage and guide their split mobile audiences throughout their interaction with your business – and reap the benefits." That statement begs the question: "Who are your mobile consumers?" It should come as no surprise that different generations have embraced digital technologies in very different ways. To over-generalize a bit, the younger the generation the more likely it is to embrace digital technologies. Consumers from the Baby Boom generation and those known as Generation Xers have certainly embraced technology, but not to the same extent as following generations.

Generation Y consumers, also known as "Millennials," are often described as the first generation of "digital natives." Even so, a study by the Urban Land Institute (ULI) determined that many of them still embrace a hybrid path to purchase. An article in Consumer Goods Technology reports, "Despite being far more tech-savvy than previous generations, Generation Y, the 80-million strong cohort of Americans between the ages of 18 and 35, has not forsaken shopping in stores for online purchasing — as long as retailers keep their offerings 'fresh' and interesting." ["New Gen Y Shopping Preferences Revealed," 20 May 2013] The generation following Gen Yers appears to be even more connected to technology.

The Coca-Cola Company certainly believes this and recently announced an all-digital, mobile campaign focused on teenagers. ["Coke Runs First All-Digital Effort, Focusing on Teens and Mobile," by Christopher Heine, AdWeek, 23 April 2013] According to Heine, Pio Schunker, Senior Vice President of integrated marketing communications at Coca-Cola North America, told an online press conference, "This is going to mark the first all-digital campaign by Coca-Cola. And critically, this signals a whole new way in which we've decided to create marketing content. ... Mobile phones are [teens'] lifelines. It's not that they don't watch TV. But mobile is their first screen." Another indicator that mobile technologies (especially smartphones) represent the future of the digital path to purchase is that "Hispanic shoppers are more likely to use their mobile devices to make purchases than general market consumers." ["Mobile Important For Hispanic Shoppers," Marketing Daily, 23 May 2013] Age also plays a role in that demographic. Armand Parra, director of insight and strategy at The Integer Group, told Marketing Daily, "The median age of Hispanics in the U.S. is roughly 10 years younger than the total population. This younger population are adopting technology at a faster rate than the older general population. Secondly, for the majority of Hispanics, mobile is their primary access point to the Web and therefore their whole Web experience is based around a mobile tool set versus the PC-based Web experience."

That description could also apply to much of the developing world. "Mobile phones, which are proliferating and increasingly used to access the Internet in Asia, are becoming a powerful tool for both marketing and market research as they tap into key consumer trends," writes M. Hafidz Mahpar. ["Smartphones seen becoming the 'centre of digital' marketing," The Star, 4 February 2013] James Fergusson, global head of TNS, told Mahpar, "Mobile phones connect location, voice, social, and m-commerce more than any other device. Mobile is becoming the centre of digital. If you are a market research supplier, mobile has to be your primary area of investment. If you don't have a mobile strategy, you're not going to grow."

As the target audience for manufacturers and retailers moves down the generations, the importance of mobile technology and the digital path to purchase is only going to move up.

June 18, 2013

The Digital Path to Purchase, Part 1

"The nature of the digital challenge is evolving rapidly and challenging traditional business models," write McKinsey & Company analysts Francesco Banfi, Paul-Louis Caylar, Ewan Duncan and Ken Kajii. "The initial wave that propelled the dramatic rise in consumer digital usage has already turned into a transformative surge - reshaping the ways consumers buy new products and services." ["E-journey: Digital marketing and the 'path to purchase'," McKinsey & Company Telecom, Media, and High Tech Extranet, 16 January 2013] If the authors are correct and the business landscape is being reshaped, businesses must also change if they are going to traverse the new terrain successfully. "The challenge today is much bigger than simply building an efficient online sales engine," the McKinsey analysts insist. "Instead, it involves creating brand engagement through digital media and platforms, turning that brand engagement into brand preference, and leveraging it to drive sales and loyalty."

What most people call the "digital path to purchase" (DP2P), the McKinsey analysts call "the consumer decision journey (CDJ)." Whatever you call it, it "describes the iterative and circular process shoppers go through today when selecting brands, products, and services." The analysts assert that the CDJ "has four phases: consideration, evaluation, purchase, and post-purchase." Frankly, there is nothing new about that conceptual framework. What's new, the authors report, is that digital media "has made this already complicated process even more complex." They explain:

"The traditional channels that typically influence consumers during the consideration phase include the 'big three' above-the-line (ATL) media - TV, radio, and print advertising - along with word-of-mouth recommendations from friends and family. With the advent of digital channels, the list expands to include brand Web sites, mobile apps, online advertising, social networks, price comparison engines, as well as blogs and forums. ... Pulling this tangled knot of complexity even tighter, digital touch points themselves have spread far beyond traditional personal computer platforms to include those behind smartphones, tablets, gaming devices, TVs, and even smart applications."

Dion Hinchcliffe agrees with the McKinsey analysts that complexity now characterizes the consumer's journey to purchase. "Businesses planning today to improve their connection to customers in digital channels," he writes, "are increasingly looking at the discipline of mapping out what's being called the 'customer journey'." ["The new digital customer journey: Cross-channel, mobile, social, self-service, and engaged," ZD Net, 19 May 2013] He continues:

"Over the last ten years, the fragmentation of customer engagement across dozens of channels has turned into both a highly vexing problem and an increasingly disruptive challenge to businesses that still keep doing what used to work, but are getting sharply falling off results from old touchpoints like TV, phone, and e-mail. This fragmentation of customer touchpoints cuts across marketing, sales, customer service, and even product development. In short, customers have moved to the digital world en masse, and companies have not kept up."

The McKinsey analysts note that not all "customer touch points" are equal. They concluded, "A brand's Web site has by far the highest touch point 'quality' - or the ability to positively influence a customer toward a particular brand." They claim that "mobile apps also show solid potential." What about social media? The analysts conclude that despite the "impressive growth and strong hype" they have received, "social networks do not appear to drive brand preference among consumers." To learn more about the importance of brand engagement, read my posts entitled Targeted Marketing and Brand Engagement, Part 1 and Part 2. Hinchcliffe agrees that now and in the future it is "imperative for companies to shift from transactions to engagement." Traditional marketing solutions, he believes, "have long been channel-centric, instead of customer-centric. While the majority of companies certainly do have modern call centers, social media marketing plans, e-mail campaigns, a mobile app strategy, an SEO policy, and so on, they are frequently unsynchronized and siloed. It's also highly likely these channels are also not monitored well and fall far short of the participation levels required to achieve ROI." Anytime you hear or read the word "silo" associated with business, you can be assured that alignment challenges are lurking nearby.

Keeping marketing strategies aligned across channels is important because consumers are likely to use several digital touch points during their path to purchase. Just as importantly, they are likely to jump in or depart the path to purchase at any time. As the McKinsey analysts conclude:

"The consumer decision journey is not a onetime process. Customers are continuously engaged and in various stages of multiple processes at any given point in time. For an individual brand, the importance here is its ability to keep customers coming back. Active loyalty plays a significant role in driving repeat purchase decisions - particularly in most developed markets. A broad shift in the way consumers are engaged online becomes apparent - even after the purchase. This can positively influence the repeat purchase decision."

Banfi, Caylar, Duncan and Kajii recommend that companies keep three key considerations in mind when establishing a digital path to purchase strategy. First, identify "the most critical touch points across the consumer decision journey for their own products and services." Second, "have a multichannel orientation and [don't] focus on one digital channel exclusively or on digital at the expense of traditional channels." Finally, balance activities between obtaining new customers and retaining old ones. "Since the consumer decision journey applies to both the retention and the acquisition phases," they write, "the relative focus on digital in either of these phases is organization-specific." Hinchcliffe points out that simply identifying critical touch points is not as easy as it may appear. "New digital channels," he asserts, "are accumulating faster than many companies can integrate them into their customer experience." He continues:

"New social networks, mobile devices, app stores, online touchpoints like fan sites, customer communities, Facebook pages, Twitter accounts, etc. seem to emerge on a weekly basis. If that wasn't enough, customers are more ready to interact than ever before, further creating challenges of scale: Even if a company could integrate a new digital engagement channel into their efforts, these new venues are far more two-way. Customers expect a meaningful response to their attempts to connect with the companies they are interested in or otherwise desire involvement. So this is the opportunity and the challenge combined: Engaged customers generate more revenue and stay more involved with the companies that respond in kind. Yet it's very challenging to meet their demands for engagement without fundamentally changing the rules of how companies connect with them."

Because there is such a dizzying array of touch points for consumers to use on their path to purchase, Hinchcliffe recommends that companies look at each of them with three things in mind. First, use them to "solve a problem. Make a pain point go away, such as seamlessly conveying the current status of orders in any desired channel, or providing an innovative new way to participate in the co-design of a new product or service." Second, "make life simpler. Remove the time, effort, and/or friction the customer has in engaging with you." Finally, "engage the customer. The core of the problem companies have in onboarding new digital channels such as social media, is that customers will then expect the company to respond and participate in conversations. And engagement at scale is one of the hardest things for companies to do as they are organized today." He concludes, "The one thing that is ultimately untenable is ignoring customer needs."

Hayden Sutherland believes that the traditional path to purchase is dead. "The path to purchase is dead," he writes, "long live the path to purchase." Just like royal succession, he knows that something is always waiting in the wings to ensure that business moves on. ["The digital path to purchase," Press 2.0, 4 March 2013] He continues:

"Today's shoppers now have a number of tools at their fingertips to help them chose which products to buy and consumer technology has been the facilitator of this. Research from Google Analytics has found that over a two-day period customers now interact with a brand 4.3 times before making a purchase. Furthermore the average U.S. shopper now interfaces with a total of 10.4 traditional and online media sources prior to purchasing. That's a lot of browsing before buying (or looking before booking, if you're in an industry such as online travel). Furthermore, according to further research by Google, online customers are now making the majority of their purchasing decision before engaging with a sales representative. There's no doubt that the traditional way that shoppers purchase has now evolved and the marketing to these people has changed with it."

You might ask yourself, who are "these people" to whom Sutherland refers? That is the subject that I'll address in Part 2 of this series.

June 17, 2013

Smart Cities 37 Years On

"If you work for a young web company," writes Jasmine Gardner, "you probably think your office is pretty cool. Maybe it has a pool table or even a roof terrace. Pah! Give it 37 years and, according to engineering company Arup, our office blocks will contain working farms, produce their own energy, be linked together by suspended green walkways and sections of each floor will be removable, upgradable and replaceable." ["Smart cities: what urban life will be like in 2050," London Evening Standard, 4 March 2013] Sounds exciting doesn't it? Well, don't get too excited. Futurists tend to exaggerate how much things will change in the future and seldom are their predictions fulfilled. The attached 1925 postcard is a good example of two things.Future New York City First, buildings last a long time. You can easily see recognizable New York City buildings that are standing today. Even though a few high profile buildings had been added to NYC skyline by 1962 (37 years after the postcard was printed), people traveling forward in time would still have been able to walk down the streets of New York and recognize where they were. The second point, is that the details are nearly always wrong. Aircraft never looked like those depicted; elevated trains never ran skyward through buildings; and so forth. Obviously, futurists don't get everything wrong; but, if they were given a letter grade for the bulk of their predictions they would probably get a "D."

Having said that, I really don't fault the futurists. Their job is to explore what's possible, not to predict the probable. Everything that the Arup company predicted is possible. Buildings like those described probably will be built, but not a lot of them. I predict that if you travel forward in time 37 years to 2050, you will be able to walk down the streets of any major city that exists today and find your way around using landmark buildings that are already in place. Cities change slowly -- at least established (i.e., brownfield) ones.

There are new cities being built, however, especially in the developing world. How those cities are built will make a huge difference in the lives of millions of future citizens. It is in those cities where one might see the kinds of buildings predicted by Arup. Gardner continues:

"This is the building of the future, imagined in a report released [in March 2013] by Arup’s 'Foresight + Innovation' arm. It is just one example of the elements that will make up our 'smart cities' of the next age. 'Smart cities' is the buzz phrase of the moment. It refers to energy-efficient and spacially economical urban worlds in which we'll live in years to come — all thanks to technology. Smarter cities are now a focus of both big business, such as Shell and IBM, and small entrepreneurs and scientists, such as the Dutch microbiologists who have developed a self-healing concrete. Cracks in the buildings of the future will be filled by calcium carbonate, produced by a bacteria feeding on nutrients, both incorporated into the cement. The bacteria are only activated when rainwater gets into a crack."

Some older buildings will undoubtedly be retrofitted with smart technologies to make them more efficient and others will gain new life by housing urban farms or 3D manufacturing plants; but, only in newly constructed cities are you likely to find "from-the-ground-up" smart technologies built into every building. Nevertheless, it remains important for urban planners and engineers to keep developing concepts. You never know when a true breakthrough will occur that will help us cope with the 9 billion people who will inhabit the planet in 2050 (most of whom will live in cities). Rick Robinson, executive architect for IBM's Smarter Cities initiative, told Gardner, "In the West we've become accustomed to building cities outwards around cars. If more people fall into that lifestyle we're going to exhaust the world's resources very, very quickly." Robinson recognizes that it is the new urban dwellers coming to live in newly constructed cities who will in large measure hold the key to the planet's future.

But city dwellers living in brownfield cities are also going to put pressure on resources and energy. Adam Newton, a project manager for the Strategy and Scenarios team at Shell, told Gardner, "By 2050, between 70 and 80 per cent of the world's population will live in cities. How and where people consume energy will be very important." Frankly, I'm much more concerned about the future of brownfield cities than those that are yet to be constructed. Here are a few reasons why. Old water pipes leak. Sanitation systems are being overwhelmed (if they exist at all). Transportation patterns are mostly established. Inefficient electric grids are already in place. Finally, many of the buildings that will be around in 2050 are already built. Smart people and enlightened politicians are going to have to work with private enterprise if we want things to change for the better in many of these brownfield cities. Gardner continues:

"For IBM, smarter cities mean ones that harness data. It is already creating space on the cloud to share information such as water flow and distribution. 'By managing pressure on a water distribution network, you can serve additional houses without needing to expand the system — allowing you to support a growing population without spending hundreds of millions of pounds on infrastructure,' explains Robinson. Intelligent traffic lights are also high on the agenda. 'In Singapore and California we've used technology that can make predictions that are 85 per cent accurate about how traffic is going to develop over the next hour,' he says. In future the light sequencing might change automatically. For Shell, it's all about energy efficiency. 'The low energy prices that drove cities to sprawl may not exist in 20 years. Fewer roads and better integrated public transport is likely to be the way forward,' says Newton."

But better and more integrated public transportation systems cost money (which cities don't have) and will only be sustainable if the cultural mindset of urbanites and suburbanites changes significantly. Rick Robinson bluntly states, "No-one is going to pay cities to become Smarter." [Sustainable Cities Collective, 30 November 2012] He goes on to discuss "four ways in which money is already spent" and then recommends ways "to harness that spending power to achieve the outcomes that cities need." He concludes, "We should not wait for new, large-scale sources of Smarter City funding to appear before we start to transform our cities - we cannot afford to; and it's simply not going to happen. What we must do is look at the progress that is already being made by cities, entrepreneurs and communities across the world, and follow their example."

The economic sector that is most likely to see deep penetration of smart technologies is the electric utility sector. Gardner reports, for example, that the U.S. energy data analytics company Intelen has begun a project "which monitors energy usage in office blocks with smart meters and has created a social gaming application where employees compete for low-energy scores." That's a great initiative because it addresses both technology and culture. Gardner admits that "these don't seem [like] outlandish developments," unlike those described in the Arup report, and she wonders if the report's "futuristic building [is] just science fiction." Josef Hargrave, the author of Arup's report, acknowledges that some of the predictions made in the report are unrealistic (like the flying robots which remove and re-insert building sections), but he believes that smart electricity grid integration is very realistic. Having said that, Hargrave told Gardner that "all the ideas are based on current prototypes and development — flying robots are actually being developed in Zurich, Switzerland, and can build a six-metre tower out of foam bricks." Gardner concludes:

"Given the exponential rate of technological advancement, it's not hard to fathom that skyscrapers coated with photovoltaic paint will indeed come to pass — and sooner than you think. Indeed, the floors of Arup's building occupied by algae-filled biofuel pods are not unlike a current project by French biochemist Pierre Calleja. He is building algae street lamps that eat up CO2 in the atmosphere. Combine this with another algae lamp that produces its own light using energy created by photosynthesis and you get self-powered, anti-pollution street lamps. Researchers at the University of California Los Angeles (UCLA) just announced their creation of a graphene supercapacitor — essentially a battery but one that charges up to 1,000 times faster than the normal kind, and that can be composted. The future promises instant phone chargers and petrol stations with plugs that can charge cars faster than they currently fill up on unleaded. ... The real smart aspect may come down not to the technology which we know exists but to foresight and willingness to change. 'There have to be new models of collaboration for businesses and decision-makers in cities and government to have the positive impact we know the technology could support,' says [Shell's] Newton. The city that leads in this department may just end up the smartest in the class."

Cities will change over the next 37 years. Change is inevitable. Whether that change is good or bad remains to be seen. The one thing that worries me most is the fact that almost all Smart Cities initiatives address problems associated with connected citizens. Many of today's largest cities have large, unconnected ghettos associated with them. Individuals who live in those ghettos are survivors; but, getting them connected to grids and networks would make their lives much more productive and their futures much brighter. In all the talk about smart cities, we can't forget about them.

June 14, 2013

Numbers in the City, Part 2

In Part 1 of this series, I discussed an emerging movement called quantitative urbanism and how mathematics can be used to better understand life in the city. The goal of the movement is to use this knowledge to make life better for people living in urban environments. The article from which many of the observations were drawn was written by Jerry Adler. ["Life in the City Is Essentially One Giant Math Problem," Smithsonian, May 2013] According to Adler, the movement can trace its origin to a collaboration between Geoffrey West, Luís Bettencourt, and José Lobo. West and Bettencourt head up the "Cities, Scaling, and Sustainability" initiative at the Santa Fe Institute. This "research effort is creating an interdisciplinary approach and quantitative synthesis of organizational and dynamical aspects of human social organizations, with an emphasis on cities. ... A particularly important focus of this research area is to develop theoretical insights about cities that can inform quantitative analyses of their long-term sustainability in terms of the interplay between innovation, resource appropriation, and consumption and the make up of their social and economic activity." The movement is catching on and they have been joined by others. Adler reports:

Numbers in the City"If Bettencourt and West are building a theoretical science of urbanism, then Steven Koonin, the first director of New York University's newly created Center for Urban Science and Progress, intends to be in the forefront of applying it to real-world problems. Koonin, as it happens, is also a physicist, a former Cal Tech professor and assistant secretary of the Department of Energy. He describes his ideal student, when CUSP begins its first academic year this fall, as 'someone who helped find the Higgs boson and now wants to do something with her life that will make society better.' Koonin is a believer in what is sometimes called Big Data, the bigger the better. Only in the past decade has the ability to collect and analyze information about the movement of people begun to catch up to the size and complexity of the modern metropolis itself."

Bettencourt writes, "A science of cities, recognizable across the full spectrum of urban disciplines, from physics and biology to social psychology and sociology is starting to emerge." ["The Kind of Problem a City Is," Santa Fe Institute Working Paper, 2013-03-008] He concludes his Working Paper this way:

"Cities reveal at once the best and the worst aspects of humanity, in terms of our creativity and imagination but also in our tendencies for violence or discrimination. Because of this enormous potential for human development cities should not be seen as systems to be controlled or resisted, but encouraged to evolve spontaneously in the direction of achieving the best open-ended expressions of our collective nature. That then is our challenge. We are living the last few decades of the great urban transition and finally fulfilling our potential as the most social of all species to create something altogether new in Earth's history. We have within sight age-old human aspirations, such as to eliminate extreme poverty, to end most injustice, to gain access to good health for all, and to do all that in balance with the Earth's biosphere. All this will have to happen in cities and it can now happen very quickly. Bigger data and a more scientific approach to cities will certainly help. But the ultimate challenge for all of us involved in influencing and practicing urban planning is to translate, apply and further develop these new ideas to promote urban environments that can encourage and nurture the full potential of our social creativity, targeted at sustainable and open-ended human development."

Koonin agrees with Bettencourt. He told Adler, "We have acquired the technology to know virtually anything that goes on in an urban society, so the question is, how can we leverage that to do good? [How can we] make the city run better, enhance security and safety and promote the private sector?" Driving all of this potential is big data analytics. As we now know, the data comes from all sorts of sources: mobile phones, CCTV systems, the Internet, and so forth. Increasingly, data will come from machine-to-machine networks that help our systems run more efficiently and effectively.

Adler goes on to point out how mathematics can help us understand other things about how a city grows. For example, Glen Whitney, who founded the Museum of Mathematics in New York City, has a theory about the height of skyscrapers in cities (i.e., the better the Gross Regional Product the taller the buildings). Whitney admits that "building heights are constrained by engineering, while there's no limit to how big a pile you can make out of money, so there are two very rich cities whose tallest towers are lower than the formula would predict. They are New York and Tokyo. Also, his equation has no term for 'national pride,' so there are a few outliers in the other direction, cities whose reach toward the sky exceeds their grasp of GDP: Dubai, Kuala Lumpur." Adler notes, "Deep mathematical principles underlie even such seemingly random and historically contingent facts as the distribution of the sizes of cities within a country. There is, typically, one largest city, whose population is twice that of the second-largest, and three times the third-largest, and increasing numbers of smaller cities whose sizes also fall into a predictable pattern. This principle is known as Zipf's law, which applies across a wide range of phenomena." He concludes:

"As West and his colleagues are well aware, this research takes place against the background of a huge demographic shift, the predicted movement of literally billions of people to cities in the developing world over the next half century. Many of them are going to end up in slums — a word that describes, without judgment, informal settlements on the outskirts of cities, generally inhabited by squatters with limited or no government services. 'No one has done a serious scientific study of these communities,' West says. 'How many people live in how many structures of how many square feet? What is their economy? The data we do have, from governments, is often worthless. In the first set we got from China, they reported no murders. So you throw that out, but what are you left with?”

The challenge, of course, is the fact that slums are generally considered to be places that are "off the grid." The ubiquity of mobile phones is making this less true every day; nevertheless, getting data about slums remains difficult. Adler reports that "the Santa Fe Institute, with backing from the Gates Foundation, has begun a partnership with Slum Dwellers International, a network of community organizations based in Cape Town, South Africa," to address challenges associated with off the grid communities. "The plan is to analyze the data gathered from 7,000 settlements in cities such as Mumbai, Nairobi and Bangalore, and begin the work of developing a mathematical model for these places, and a path toward integrating them into the modern economy." Lobo told Adler, "For a long time, policy makers have assumed it's a bad thing for cities to keep getting larger. You hear things like, 'Mexico City has grown like a cancer.' A lot of money and effort has been devoted to stemming this, and by and large it has failed miserably. Mexico City is bigger than it was ten years ago. So we think policy makers should worry instead about making those cities more livable. Without glorifying the conditions in these places, we think they're here to stay and we think they hold opportunities for the people who live there." To read a little more about slums, see my post entitled: Will Cities Save Us?

Adler writes that he hopes Lobo is correct because Michael Batty, who runs the Centre for Advanced Spatial Analysis at University College London, predicts "that by the end of the century, practically the entire population of the world will live in what amounts to 'a completely global entity ... in which it will be impossible to consider any individual city separately from its neighbors ... indeed perhaps from any other city.'" Although I doubt that prediction will come completely true, I certainly believe that urbanization is going to continue unabated. Bettencourt told Adler that we are seeing "the last big wave of urbanization that we will experience on Earth." Adler believes that it will be the math men, like West, Koonin, Batty and their colleagues, who will unlock the formulae that point us down the road to a better urban future.