
February 28, 2013

The Road to Innovation is Paved with Questions and Experiments, Part 2

In Part 1 of this series, I discussed the importance of asking good questions at the beginning of the innovation process. Once good questions have been asked and the problem framed, serious work needs to be done to answer those questions. Although this post focuses on why experimentation and prototyping are important for the innovation process, they are only some of the tools available to answer questions. I agree with Tim Kastelle, who asserts, "I am always suspicious of one-size-fits-all solutions. They are very easy to sell in a book or a blog post, but they rarely work in the real world. There's too much variation." ["There Must Be Forty Ways to Innovate," Innovation for Growth, 5 November 2012] To prove his point, Kastelle offers a list of forty ways to innovate:

Idea Generation

  • get to the edge
  • scratch your own itch
  • be a genius
  • blue sky R&D
  • applied R&D
  • ask your customers
  • watch your customers
  • ask your people
  • brainstorm
  • gamestorm
  • think outside the box
  • think inside the box
  • co-create
  • scenario planning

Selection and Implementation

  • experiment!!
  • R&D
  • stage/gate
  • innovation team
  • innovation coach
  • expert panel
  • minimum viable product
  • iteration
  • gut instinct
  • does it fit with what we’ve always done?
  • do whatever the CEO wants
  • focus groups
  • market testing
  • A/B testing
  • team consensus

Spreading Ideas

  • network
  • traditional distribution
  • viral marketing
  • advertising
  • influentials
  • small seeds
  • word of mouth
  • lead users
  • co-creation
  • pull strategies
  • partnerships

Notice that only one of his forty ways has exclamation points associated with it -- experimentation. Kastelle isn't the only innovation guru who is keen on experimentation. Jim Manzi, chairman of Applied Predictive Technologies, is another proponent of experimentation. He wrote his thoughts in a book entitled Uncontrolled: The Surprising Payoff of Trial-and-Error for Business, Politics, and Society. In a review of the book, Trevor Butterworth writes:

"Imagine that you are the chief executive for a chain of 10,000 convenience stores, 8,000 of them called QwikMart, 2,000 of them called FastMart. Strangely, the FastMarts are bringing in 10% more in sales on average than the QwikMarts, and your instinct is that it may have something to do with customer preference for the FastMart name. How do you find out if your hunch is correct? Jim Manzi ... was once asked to address such a question by the owner of a convenience-store chain—a story he relates [in his book]. Finding the answers was not easy. There were hundreds of variables that could account for the QwikMart-FastMart revenue gap, including distance to nearby highways, number of cash registers, cleanliness of stores and the 'exact position of each product on each shelf.' Worse, all these variables could mate with each other to produce even more variables. Nevertheless, after studying mathematics at the Massachusetts Institute of Technology, Mr. Manzi found his dream job as a strategic consultant trying to solve such puzzles for various companies. His approach began with: What makes experiments in science so good at producing reliable knowledge—and could the same principles and methods be applied to business and even social policy? The answer, according to Mr. Manzi, was a qualified yes."

Butterworth claims that "the hero" of Manzi's book is "the randomized controlled trial or, as Mr. Manzi prefers, the 'randomized field trial' (RFT)." In other words, experimentation. Butterworth writes:

"It should come as no surprise that the most successful companies in information technology—Google, Amazon and eBay—are relentless experimenters. Millions of consumers, for example, can be tested at little cost to find out whether pop-up ads are more effective on the left side or the right side of a computer screen. Google alone, says Mr. Manzi, 'ran about 12,000 randomized experiments in 2009, with about 10% leading to business changes.'"
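The left-versus-right pop-up question Butterworth describes is a classic two-variant randomized experiment. Here is a minimal sketch of how such a test might be evaluated with a two-proportion z-test; the visitor and conversion counts are invented for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value via the standard normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical traffic: 50,000 visitors randomly assigned to each arm
z, p = two_proportion_z_test(conv_a=600, n_a=50_000,   # ads on the left side
                             conv_b=680, n_b=50_000)   # ads on the right side
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented numbers the right-side arm wins at conventional significance levels; with real traffic, the same arithmetic decides whether an observed lift is signal or noise, which is what makes such experiments so cheap to run at web scale.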

Butterworth goes on to write, "It's one thing, for instance, to conduct experiments, but it's another to learn from them." Manzi believes that the best experiments are conducted by someone who is not emotionally attached to the innovation. "A company, Mr. Manzi says, 'is an alliance of individuals, and there are always competing theories, power centers, and knowledge silos within any firm.' Amid all the 'jockeying for control,' the most successful experiments are performed when the experimenters don't have a dog in any strategic fight."

Another proponent of experimentation is marketing technologist Scott Brinker. He wrote that "big testing" is critical in today's marketplace. ["The big data bubble in marketing -- but a bigger future," Chief Marketing Technologist, 21 January 2013] He writes:

"A few months ago, I wrote a post, "We want to test bold, new ideas that always work." It highlighted recent research by the Corporate Executive Board showing that the far majority of Fortune 1000 marketers think their organizations are not effective at test-and-learn experiments. One reason why: at least half didn't think an experiment should ever fail. I believe this is the single biggest obstacle most organizations face: their culture and politics dissuade people from trying experiments because failure of an experiment implies failure of the tester. Who wants to stick their neck out in that environment? So either people don't test anything, or they test in a non-controlled manner so that outcomes are comfortably subject to caveats and interpretations."

Brinker asserts that even companies claiming to embrace testing often conduct only superficial tests that involve little risk. "Perhaps the greatest damage they do," he writes, "is that they give people the illusion of engaging in real experimentation. ... Big testing is qualitatively different." When I think of the term "big testing," two things come to mind: high risk and high reward. Face it, if you are not failing you are not experimenting. Jon Custer notes that one of America's most famous innovators was also famous for the number of failures he experienced during his experiments. ["The Power of Failure," CIPE Development Blog, 30 January 2013] He writes:

"Thomas Edison ... had to conduct thousands of failed experiments before hitting on a successful design for the electric light bulb, and then had to build the infrastructure to power them. In the 21st century, the corporate descendant of the company that Edison founded in 1880 ensures that today's entrepreneurs in and around New York City have reliable electricity to run their businesses. Edison famously said, 'I have not failed. I have just found 10,000 ways that won't work.'"

Scott Anthony, a managing partner at Innosight, writes, "You don't have to be Thomas Edison to be an active experimenter. Think about ... 'everyday experiments' you could run. Change the way in which you commute to work. Alter the order in which you do things in the day. Try eating different portion sizes or different foods at different times of the day. If the professors [Clayton Christensen, Jeffrey Dyer, and Hal Gregersen] are right — and I think they are — the process will wire your brain in a way that makes it better at innovation." ["Innovators: Become Active Experimenters," HBR Blog Network, 29 March 2010]

Tim Kastelle believes that anytime you feel a bit stuck in the innovation process, experimenting is a good way to get the juices flowing again. ["Experiments – the Key to Innovation," Innovation for Growth, 28 March 2010] He writes:

"There must be something to this idea, because I’ve run across three different people saying basically the same thing in the past three days. The first was Dan Ariely:

'They asked me what I thought the best approach was. I told them that I was willing to share my intuition but that intuition is a remarkably bad thing to rely on. Only an experiment gives you the evidence you need. … Companies pay amazing amounts of money to get answers from consultants with overdeveloped confidence in their own intuition. Managers rely on focus groups—a dozen people riffing on something they know little about—to set strategies. And yet, companies won't experiment to find evidence of the right way forward.'

"Unsurprisingly, he goes on to make a case for the value of experimenting. Part of this reluctance is that experimenting leads to short-term losses – if you try several things to find out what works best, you have wasted resources by trying the ideas that end up not working. Or do you? Rita McGrath doesn’t think so:

'If your organization can approach uncertain decisions as experiments and adopt the idea of intelligently failing, so much more can be learned (so much more quickly) than if failures or disappointments are covered up. So ask yourself: are we genuinely reaping the benefit of the investments we've made in learning under uncertain conditions? Do we have mechanisms in place to benefit from our intelligent failures? And, if not, who might be taking advantage of the knowledge we are depriving ourselves of?'

"She includes a list of conditions that can lead to what she's calling 'intelligent failures', the approach that she outlines is both good and practical. Then I ran across this by Bob Sutton:

'The final point that Jeff Pfeffer and I make in Hard Facts is about failure. We emphasize that [it] is impossible to run an organization without making a lot of mistakes. Innovation always entails failure. Most new products and companies don't survive. And if you want creativity without failure, you are living in a fool's paradise. It is also impossible to learn something new without making mistakes. … Failure will never be eliminated, and so the best we can hope for from human beings and organizations is that they learn from their mistakes, that rather than making the same mistakes over and over again, they make new and different mistakes.'

"To be innovative, we have to try out new ideas. Some of these will fail. If we're smart, we'll set up our experiments so that we can learn as much as possible from the ideas that don't work. We face an environment that is filled with uncertainty. This makes planning dangerous. The best possible way to meet this uncertainty is not with intuition and guesswork, but with experimentation."

One type of experimentation that doesn't immediately pop into most people's minds is gaming. Marla M. Capozzi, John Horn, and Ari Kellen, analysts at McKinsey & Company, report that some companies have improved their innovation process "by integrating war games into their innovation activities. By simulating the thoughts, plans, and actions of competitors, these companies are improving their products and services, while gaining a deeper understanding of how their innovation assets compare with those of rivals—insights that help them better identify, shape, and seize opportunities." ["Battle-test your innovation strategy," McKinsey Quarterly, December 2012] They conclude, "War games are a tried-and-true strategic tool, yet relatively few companies use them to innovate. Those that do so effectively can not only avoid the problem of overlooking what the competition might do but also determine how likely their new products and services are to survive in the crucible of the marketplace."

There are all kinds of experimentation that can be done. Don't be locked into any one method. The bottom line is that experimentation and innovation go hand in hand. The better you are at one, the better you will be at the other.

February 27, 2013

The Road to Innovation is Paved with Questions and Experiments, Part 1

"Innovation is a particularly sticky problem because it so often remains undefined," writes Greg Satell. "We treat it as a monolith, as if every innovation is the same, which is why so many expensive programs end up going nowhere." ["Before You Innovate, Ask the Right Questions"] If you have read many of my posts about innovation, you will know that I'm a big believer in the notion that good solutions begin with good questions. Satell is also a true believer in that dictum. He quotes Albert Einstein, who stated (perhaps apocryphally), "If I had 20 days to solve a problem, I would spend 19 days to define it." I'm also a big believer in conducting experiments and using prototypes. Thomas Edison famously tried a thousand filaments before finding one that worked for his light bulb. He didn't see the unsuccessful attempts as failures, but as necessary steps in the process to success. Asking the right questions and being willing to conduct numerous experiments are surer paths to innovation than sitting in a room with a group hoping somebody comes up with a bright idea. In this post, I'll focus on the first of those methods -- asking good questions.

The first set of questions you should consider asking is about your organization's culture. Jack Uldrich, a futurist and best-selling author, believes that "most companies ... only pay lip service to the notions of creativity and innovation." ["25 Questions for the Truly Innovative Company or Organization," School of Unlearning, 1 November 2012] He recommends that companies ask themselves a series of 25 tough questions concerning innovation to determine whether they are made of the right stuff. He asserts, "If you can’t successfully answer a majority of these 25 questions, your company probably isn’t doing enough to create an innovative culture." His 25 questions are:

1. How does your company define innovation?
2. Is your company focused more on "innovative" projects or creating an innovative culture?
3. What is the role of senior leadership and managers? Do managers think of themselves as innovators? If not, why not?
4. Does innovation suffer because senior executives require "ironclad" assurances of success?
5. If asked, which current management practice does the most to stifle or kill creativity and innovation within the organization?
6. If failure is recognized as a necessary component of risk, how does your company deal with failure?
7. Is "innovation" listed in most employees’ job description? If not, why not?
8. How are employees encouraged or incented to be creative and innovative?
9. Who in the company is responsible for focusing on “what the organization doesn’t know?”
10. Who in the organization owns the “white space”?
11. Who in the company is responsible for throwing the organization off-balance?
12. Does your company have a system for challenging deeply held beliefs?
13. How does your company ensure discomforting information isn't ignored?
14. What's the "tomorrow problem" your company needs to begin working on today?
15. What’s the "can't do" that needs to become the "can do"?
16. Where is an area where short-term profitability might need to be sacrificed in order to achieve long-term success?
17. How does your company "sanction the unsanctioned"? Are employees granted free time or "dabble-time" to work on innovative projects of their own choosing?
18. How easy is it for an employee to get "experimental funding"? If there is a procedure for such funding, what is the dollar limit?
19. Are new ideas allowed to openly compete for support? If so, how does this procedure work?
20. Has your company ever conducted a "post-mortem" on a company that failed to innovate fast enough? (e.g. Blockbuster, Borders, Kodak, etc.)
21. Has your company ever been asked to conduct a pre-mortem on itself?
22. Does your company have a regular speaker series?
23. How does the organization ensure outside voices are heard?
24. How easy is it for your customers to contribute ideas?
25. How does the company know it isn’t over-investing in "what is" at the expense of "what could be"?

Those are all excellent questions that will help you determine whether or not your company has an innovative culture. Unlike Uldrich, Satell assumes your company does have the right culture and focuses his questions on better defining a problem that needs to be solved. He writes:

"Defining a managerial approach to innovation starts with developing a better understanding of the problem we need to solve. I've found asking two basic questions can be enormously helpful.

"How well is the problem defined? When Steve Jobs, who was a master at defining a clear product vision, set out to build the iPod, he framed the problem as "1,000 songs in my pocket." That simple phrase defined not only the technical specifications, but the overall approach. Unfortunately, some problems, like how to create a viable alternative to fossil fuels, aren't so easy to delineate. So your innovation strategy will have to adapt significantly depending on how well the problem can be framed.

"Who is best-placed to solve it? Once Jobs defined the iPod problem, it was clear that he needed to find a disk drive manufacturer who could meet his specifications. But, sometimes the proper domain isn't so cut and dried. Once you start asking these questions, you'll find that they clarify the issues quite quickly. Either there is a simple answer, or there isn't."

The answer to Satell's second question determines the "domain" in which the problem should be addressed. If the answer to a challenge isn't simple, then extensive experimentation is probably the next step in the process. More on that topic tomorrow. Satell claims that once you've asked the right "framing questions" you are in a much better position to "determine which approach to innovation makes the most sense." Satell offers a two-by-two matrix (shown below) for helping you select an approach. The approach selected depends on how well a problem has been defined.

[Figure: Satell's innovation matrix]

As you can see from the matrix, the less well defined a problem is the more experimentation is likely to be needed. Satell discusses each of the quadrants beginning with Basic Research. He writes:

"Basic Research: When your aim is to discover something truly new, neither the problem nor the domain is well defined. While some organizations are willing to invest in large-scale research divisions, others try to keep on top of cutting edge discoveries through research grants and academic affiliations. Often, the three approaches are combined into a comprehensive program. ...

"Breakthrough Innovation: Sometimes, although the problem is well-defined, organizations (or even entire fields) can get stuck. For instance, the need to find the structure of DNA was a very well defined problem, but the answer eluded even the most talented chemists. Usually, these types of problems are solved through synthesizing across domains. Watson and Crick solved the DNA problem by combining insights from chemistry, biology, and X-ray crystallography. Many firms are turning to open innovation platforms such as Innocentive which allow outsiders to solve problems that organizations are stuck on. ...

"Sustaining Innovation: Every technology needs to get better. Every year, our cameras get more pixels, computers get more powerful and household products become 'new and improved.' Large organizations tend to be very good at this type of innovation, because conventional R&D labs and outsourcing are well suited for it. ... In essence, great sustaining innovators are great marketers. They see a need where no one else does. ...

"Disruptive Innovation: The most troublesome area is disruptive innovation, which target[s] light or non-consumers of a category and require[s] a new business model, because the value [it creates] isn't immediately clear."
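Satell's two framing questions ("How well is the problem defined?" and "Who is best-placed to solve it?") effectively form a two-by-two lookup over his quadrants. A minimal sketch, treating the two axes as simple booleans (a simplification of my own; the real axes are continuous):

```python
def innovation_quadrant(problem_defined: bool, domain_defined: bool) -> str:
    """Map Satell's two framing questions onto his four quadrants."""
    if not problem_defined and not domain_defined:
        return "Basic Research"            # discovering something truly new
    if problem_defined and not domain_defined:
        return "Breakthrough Innovation"   # well-posed problem, unclear who can solve it
    if problem_defined and domain_defined:
        return "Sustaining Innovation"     # conventional R&D handles it
    return "Disruptive Innovation"         # domain known, problem and business model unclear

# e.g. the structure of DNA: a sharply defined problem with no obvious home domain
print(innovation_quadrant(problem_defined=True, domain_defined=False))
```

The point of casting it this way is only to make the logic of the matrix explicit: the less well defined the problem, the further you move toward open-ended experimentation.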

Satell makes it clear that "while focus is important, no company should limit itself to just one quadrant." He concludes, "It's important to develop an effective innovation portfolio that has one primary area of focus, but also pursues other quadrants of the matrix and builds synergies between varied approaches. Innovation is, above all, about combination." Richard Veryard calls this "challenge-led innovation." ["Challenge-Led Innovation," Demanding Change, 1 December 2012] He offers another way of looking at the same characteristics contained in Satell's framework. It is a framework promoted by the Design Council. They call it the 'double diamond' design process model.


Veryard writes, "The first diamond is devoted to clarifying the problem or requirement, and the second diamond is devoted to solving a well-defined problem. If the challenge-led approach starts from a well-defined problem, then it is just doing the second diamond." Another way of interpreting the model is that the first diamond involves the activities discussed by Satell and the second diamond focuses more on prototyping and experimentation. Regardless of the model you like best, the bottom line remains the same. The better the questions you ask at the beginning of the process the better the solutions are going to be in the end.

February 26, 2013

The Search for Data Scientists

"Big Data can present companies with big challenges," write Elyse Dupré and Melissa Mazza, from Direct Marketing News, "especially when there's a gap between what Big Data is and how it can be applied." ["Infographic: Mind the Gap," 8 February 2013] One of the "gaps" noted in the attached infographic created by Dupré and Mazza is people. There simply aren't enough of them with the right skills. According to Dupré and Mazza, nearly half of all companies that deal with big data report a shortage of people with the right skills.

One of the most sought-after individuals is the data scientist. "Outside of companies like Google that have long made use of rich rosters of PhDs," writes Connie Loizos, "there are nowhere near enough 'data scientists' — graduate-level candidates with backgrounds in machine learning or statistics — to analyze the massive streams of information that are being produced, and that gap is growing by the day." ["As Startups Produce More Data, the Search for Data Scientists Grows Frantic," PE Hub, 7 February 2013] "Data science [is] a crucial facet to the Big Data world," writes Doug Turnbull. That being the case, he asks the obvious question, "What is a data scientist?" He writes:

"The traditional scientist builds an experimental design based on a hypothesis. ... How does a data scientist differ? Well it's still hypothesis driven research. The difference is in the nature of the experimental design. Instead of running an experiment in a sterile lab where we are carefully controlling everything from the humidity to the temperature to the expression on the experimenters face, we instead have a giant mass of data potentially from uncontrolled conditions. The experiment becomes a matter of combing through massive amounts of data after the fact. For example, finding enough cases where rats ran through the maze at a given temperature, light level, and all other variables constant, except for a varying humidity. Because we simply have massive and massive amounts of data, there are enough times of all things being equal except humidity, that we can go back and make a statistically significant assertion about what rats do when the humidity changes. This is particularly useful when there is simply no way to control all the independent variables, and notions of causality are weaker. For example, tracking children through education programs. There is simply no way to setup an experimental design where we force one set of children to undergo one set of circumstances and force another set of children to go through another. Moreover, we can’t ethically create an experiment that ensured each child had the exact same socioeconomic background, home life, cultural background, exercise, environment, and all the other dozens of factors that might influence their education outcome. So our only option is to collect tons and tons of data about kids, and see what shakes out. There may be enough times when certain variables are held steady except for one that a definite outcome could be measured."
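Turnbull's "all other variables constant, except for a varying humidity" procedure is essentially post-hoc stratification: group records on every controlled variable, then compare outcomes across the variable of interest within each group. A toy sketch with invented rat-maze numbers:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical maze records: (temperature, light, humidity, run_time_seconds)
records = [
    (20, "dim", "low", 41.0), (20, "dim", "high", 48.5),
    (20, "dim", "low", 39.5), (20, "dim", "high", 47.0),
    (25, "bright", "low", 36.0), (25, "bright", "high", 44.0),
    (25, "bright", "low", 35.5), (25, "bright", "high", 43.5),
]

# Stratify on everything except humidity, then compare within each stratum
strata = defaultdict(lambda: defaultdict(list))
for temp, light, humidity, run_time in records:
    strata[(temp, light)][humidity].append(run_time)

for stratum, groups in strata.items():
    if {"low", "high"} <= groups.keys():   # only strata observed under both conditions
        diff = mean(groups["high"]) - mean(groups["low"])
        print(f"{stratum}: high-humidity runs slower by {diff:.1f}s")
```

With millions of records instead of eight, the same grouping step is what lets an after-the-fact analysis approximate a controlled experiment, which is exactly the point Turnbull is making.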

Turnbull points out that getting to the hypothesis testing point is not a simple task. "Warehousing, and processing massive amounts of data efficiently to answer the data scientist's questions is in itself a hard problem." He continues:

"At the core of the problem is knowing what data structures are the best tool for the job. At OpenSource Connections, we have a pretty broad understanding of the strengths and weaknesses of various solutions such as distributed search, NoSQL databases, relational databases, and plain flat-file logs in Hadoop using Map Reduce. Matching up hard data problems with the right solution using the right data structure requires crucial collaboration between everyone. ... For example, what are the similarities and differences between natural language processing and search at scale vs collecting raw numeric statistics? Are there things that the two groups could learn from each other when it comes to how data is stored and processed?"
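The flat-file-logs-with-MapReduce option Turnbull mentions follows a pattern simple enough to sketch in miniature; Hadoop's contribution is distributing the same three phases across machines. The log format and store names here are invented (the chains echo Manzi's hypothetical example):

```python
from collections import defaultdict

# Hypothetical flat-file log lines: "store_id,sale_amount"
log_lines = ["QwikMart-17,4.50", "FastMart-03,6.25", "QwikMart-17,2.00",
             "FastMart-03,5.75", "QwikMart-21,3.10"]

def map_phase(line):
    """Emit (key, value) pairs -- here, chain name and sale amount."""
    store, amount = line.split(",")
    yield store.split("-")[0], float(amount)

def reduce_phase(key, values):
    """Aggregate all values sharing a key -- here, total sales per chain."""
    return key, sum(values)

# Shuffle: group mapper output by key, as the framework does between phases
grouped = defaultdict(list)
for line in log_lines:
    for key, value in map_phase(line):
        grouped[key].append(value)

totals = dict(reduce_phase(k, vs) for k, vs in grouped.items())
print(totals)
```

The design choice Turnbull is pointing at is that this shape only pays off for bulk, after-the-fact scans; interactive queries are better served by the search and database options he lists alongside it.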

The answer to Turnbull's last question is a resounding, "Yes." Many of the solutions we are developing at Enterra Solutions involve bringing together two distinct philosophical and technological computing camps (i.e., Mathematical Optimization and Reasoning). At Enterra Solutions, we believe in bringing together smart people from different disciplines to address vexing problems. They are constantly learning from one another and making our solutions better for it. Turnbull asserts that another tool that any good data scientist needs in his or her kit is data visualization. I certainly agree with him. Good insights that aren't easily understood are little better than no insights. The end user of big data analytics must always be kept in mind.

Paige Roberts insists that data scientists play another important role in organizations. She claims they are bridge builders. ["Big Data Scientists Are Bridge Builders," SmartData Collective, 6 February 2013] Roberts came to this conclusion while reading an article written by Kathyrn Kelly in which Kelly argued "that the hype that has been building around the data scientist is over." ["Data Scientists Not Required: Big Data Is About Business Users," SmartData Collective, 5 February 2013] To be fair, Kelly wasn't dissing data scientists per se; she was arguing that the goal of vendors offering big data solutions should be to create systems that "use intuitive, interactive UIs to derive value from big data and avoid the dependency on data scientists." In other words, business people need tools they can use without having to turn to a data scientist every time they want an answer. Roberts agrees with Kelly on that point. Does that mean that data scientists should be placed on the short list of jobs that could be going away? Hardly. Data scientists are critical in the creation of the kind of systems that Kelly envisions. Roberts writes:

"I don't think that the data scientist/data analyst/statistician, whatever they've been called over the years, will disappear. The name 'data scientist' may be new, but the job isn't. And there's a good reason for that. Analytics, over time, becomes more and more accessible to the business person or average end consumer of that information, yes. The goal of most analytics projects and analytics software is to bridge the gap between the person who asks questions and the person who can find answers. Data scientists and the tools they use are the bridge for a time, then as the technology matures, they become the designers and builders and maintainers of the bridge. But once the bridge is built, anyone can walk across. When the person who asks the question and the person who can find the answer are the same person, the bridge is built, that area of questioning is mature."

Since good solutions start with good questions, an infinite number of questions are waiting to be asked. As Roberts put it, "Once we've reached the point where we can find all the answers to the questions we've been asking, [we'll] simply find tougher questions to ask." With each new set of tough questions, she writes, "You're right back where you were, needing a person with greater knowledge of analytics and analytic technology to dive in and find them, and be our bridge to that information." As a result, she concludes, "As long as human beings continue to ask questions, to push the envelope, and to be hungry for more and more information, i.e., as long as they continue to be human, data scientists will have a job."

Not only will they have a job, but they should have a good-paying job. According to Loizos, "Pay for data scientists has rocketed ... even for those straight out of school." Robert F. Morison, from the International Institute for Analytics (IIA), reports that the IIA predicts:

  • "There will continue to be a shortage of data scientists. Companies will compensate by forming small analyst teams.
  • "The mystique of the data scientist will persist, but the lines between data scientists and other analytics professionals will blur." ["Building Data Scientist Capability," International Institute for Analytics, 1 February 2013]

If those predictions are correct, then, writes Morison, companies are faced with a real conundrum. He explains:

"Most businesses have difficulty attracting analytics professionals to begin with, let alone snag people in the top tier who possess multiple components of the data science skill set. Analyst talent supply is short, demand is high and growing, and competition can be fierce. Top data scientists tend to work in the hard sciences – physics, climatology, genomics – fields where they can immerse in new data and build original models. When they work for corporations, it’s usually for technology vendors or Wall Street firms or organizations with oversized data and computational challenges. Most companies must come to grips with the fact that they cannot hire enough data scientist caliber analysts, and that they can't get all the skills they need in one person. So they must focus more on the composition, development, and deployment of analyst teams that combine the needed skills and experience as coherently as possible."

If team building is really the answer to closing the data scientist skills gap, Morison asks, "How might the roles on analyst teams be delineated?" First, he notes that the work most teams will be asked to do "falls into four clusters, each with its own skills, demographic, and psychographic profile." Those clusters are programmers, data preparers, generalists, and business experts. He concludes:

"As people work on teams and learn from one another, roles overlap and lines blur. And as organizations approach data science as a composite capability, not an individual role, we'll probably see more 'data scientist' titles but view the holders as less rarefied. I can't see a day, at least anytime soon, when we have a surplus rather than a shortage of data science and advanced analytics skills. But aggregate capability is building as more professionals migrate to the field. And more people will have the jump start of academic programs. They will not churn out full-fledged data scientists, because many of the key skills come with practice and experience. But current academic programs (and others to follow, no doubt) enlarge the pool and accelerate development of the next generation top-tier analytics professionals. Most of the organizations I work with know that they’re just scratching the surface of the enormous opportunities with big data and advanced analytics. But they can't go deep and capitalize with simply a data scientist hire or two. They need to grab the best analysts they can, line up outside sources for skills they lack, and blend that talent on high-power analytics teams."

If you have the skills discussed by Morison and are looking for an exciting position in a company on the cutting edge of big data analytics, I hope you will consider working for Enterra Solutions. If you are interested in being considered for a position at Enterra Solutions, please submit your current resume in Word format to

February 25, 2013

Big Data is about People and Behavior

With all of the hype surrounding big data, we should remind ourselves that what is most important is how it can be used to help us better understand ourselves, the decisions we make, and the actions we take. Much of the data that is currently being collected comes from mobile devices. "To say mobility is huge – gargantuan, even – would be an understatement," writes Chelsi Nakano. As a result, she notes, a lot of attention has been given to the hardware (i.e., smartphones and tablets). She believes that focus is changing. "At today's intersection of intelligence, computing and massive amounts of ubiquitous and networked data," she writes, "mobility is no longer about the hardware – it's driving an entire shift in human behavior." ["Mobility: It’s About Behavior, Not Devices," Conspire, 4 February 2013] Marketers obviously want to understand that behavior so they can cater to changing tastes. Nakano continues:

"Lisa Weinstein, President, Global Digital, Data and Analytics at Starcom MediaVest Group, had a similar comment at CES this year: ... 'I actually think that we have to take off the channel lens [from] mobile as a device, and think more about the consumer and behavior, and I think when you do that there's some really, really interesting implications about the way that consumers are interacting with different types of experiences in places – whether that be at retail or with friends in a social type of environment through their personal device. And so I very much believe that the device is important because of the personalization, but the behavior that it drives is actually a much deeper opportunity for how brands can intersect with consumers in new and different ways.' ... Because the focus is now on our actions rather than the tools we use to carry them out, we're going to see even more objects – not just phones, tablets or computers – connected to the Internet, further augmenting this behavior and providing even more touch points for data and information exchange."

Nakano believes that all of this connectivity (i.e., the Internet of Things) "will not only cause a widespread demand for better ways to obtain data, but also highlight the importance of getting it to the right people in the right forms." In other words, the Internet of Things will only increase the need for technologies that turn data into actionable intelligence -- once again connecting data to people. Last fall Jim Stikeleather, Executive Strategist for Innovation at Dell Services, also stressed the importance of the human connection with big data. "Machines don't make the essential and important connections among data and they don't create information," he wrote. "Humans do." ["Big Data's Human Component," HBR Blog Network, 17 September 2012] He continued:

"Tools have the power to make work easier and solve problems. A tool is an enabler, facilitator, accelerator and magnifier of human capability, not its replacement or surrogate — though artificial intelligence engines like Watson and WolframAlpha (or more likely their descendants) might someday change that. That's what the software architect Grady Booch had in mind when he uttered that famous phrase: 'A fool with a tool is still a fool.' We often forget about the human component in the excitement over data tools. Consider how we talk about Big Data. We forget that it is not about the data; it is about our customers having a deep, engaging, insightful, meaningful conversation with us — if we only learn how to listen."

Stikeleather offered a few other insights about the human component and its connection to big data. The first insight is that "expertise is more important than the tool. Otherwise the tool will be used incorrectly and generate nonsense (logical, properly processed nonsense, but nonsense nonetheless)." Cliff Cate, Senior VP of Customer Success for GoodData, believes that analytical expertise needs to be employed primarily by firms that offer analytical solutions, not by the companies that use those solutions. "Companies need solutions that enable them to use and customize their data easily," he writes, "because it is the whole team, not just the individual analyst, that knows the business best." ["Data Scientists Not Required: Big Data Is About Business Users," SmartData Collective, 5 February 2013] He continues:

"By offering business users intuitive data solutions, we bypass the need for the data scientist, who works in isolation. In fact, most data scientists are associated with the old school of business intelligence, where systems were so complicated that they needed someone with a data science background to run and get value from them. The new generation of solutions, on the other hand, is making it easy for business users to engage big data. An interdisciplinary team will see and use the visuals provided, and collaborate on the best decisions on a regular basis."

Stikeleather agrees with Cate when it comes to the importance of visualization. He writes:

"Humans are better at seeing the connections than any software is, though humans often need software to help. ... We have eons of evolution generating a biological information processing capability that is different and in ways better than that of our digital servants. We're missing opportunities and risking mistakes if we do not understand and operationalize this ability. Edward Tufte, the former Yale professor and leading thinker on information design and visual literacy, has been pushing this insight for years. He encourages the use of data-rich illustrations with all the available data presented. When examined closely, every data point has value, he says. And when seen overall, trends and patterns can be observed via the human 'intuition' that comes from that biological information processing capability of our brain. We lose opportunities when we fail to take advantage of this human capability. And we make mistakes."

Stikeleather asserts that "there are many other risks in failing to think about Big Data as part of a human-driven discovery and management process." He provides examples of insensitivity and unexpected consequences when big data tools are "over-automated." He believes that keeping the human component in mind is central to turning data into information and insight. He explains:

"Although data does give rise to information and insight, they are not the same. Data's value to business relies on human intelligence, on how well managers and leaders formulate questions and interpret results. More data doesn't mean you will get 'proportionately' more information. In fact, the more data you have, the less information you gain as a proportion of the data (concepts of marginal utility, signal to noise and diminishing returns). Understanding how to use the data we already have is what's going to matter most."
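Stikeleather's diminishing-returns point has a textbook illustration: the uncertainty of an estimated average shrinks only with the square root of the sample size, so each additional batch of data yields proportionately less information than the last. A minimal sketch (the noise level and sample sizes below are arbitrary illustrative values):

```python
import math

def standard_error(sigma, n):
    """Uncertainty of a sample mean computed from n observations
    with per-observation noise sigma (the classic sqrt-n law)."""
    return sigma / math.sqrt(n)

# Quadrupling the data only halves the uncertainty: information
# grows far more slowly than the data itself.
for n in (100, 400, 1600, 6400):
    print(n, standard_error(10.0, n))
```

Quadrupling the sample from 100 to 400 observations halves the uncertainty, but the next halving costs 1,200 more observations -- the "diminishing returns" and "marginal utility" Stikeleather invokes.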

The staff at the Social Media Observatory agrees that the value of data relies on what they call "the human algorithm." ["The Human Algorithm: Redefining the Value of Data," Social Media Observatory, 11 December 2012] The article states, "Everything we share, everywhere we go, everything we say and everyone we follow or connect with, generates valuable information that can be used to improve consumer experiences and ultimately improve products and services." The article continues:

"This 'big' data will help businesses evolve and adapt in a new era of connected consumerism. More importantly, the study and understanding of relevant big data will shift organizations from simply reacting to trends to predicting the next disruption and adapting ahead of competition — thus, marking the shift from rigid to adaptive business models. From business to education to government and everything in between, without studying how the undercurrent of behavior is evolving, organizations cannot effectively adapt to new trends and opportunities. Change, though, cannot be undertaken simply because of pervasive data."

Like other pundits, analysts at the Social Media Observatory emphasize that the goal of gathering and analyzing data is to gain actionable insights -- with emphasis on action. They assert, "Without interpretation, insight and the ability to put knowledge to work, any investment in technology and resources is premature." The article continues:

"The reality is ... that how organizations connected with customers yesterday is not how customers will be served tomorrow. Meaning, the entire infrastructure in how we market, sell, help, and create now requires companies to not only study data and behavior but also change how it thinks about customers. This is a bona fide renaissance and to lead a new era of customer engagement requires knowledge acumen. I refer to the confluence of data and interpretation as the human algorithm—the ability to humanize technology and data to put a face, personality, and voice to the need and chance for change. Data tells a story, it just needs help finding its rhythm and rhyme. ... Much of what we see today is important, but it's measuring activity not translating behavior into creativity or strategy."

According to the article, "The human algorithm is part understanding and part communication. The ability to communicate and apply insights internally and externally is the key to unlocking opportunities to earn relevance." It concludes:

"Beyond research, beyond intelligence, the human algorithm is a function of extracting insights with intention, humanizing trends and possibilities and working with strategists to improve and innovate everything from processes to products to overall experiences. The idea of the human algorithm is to serve as the human counterpart to the abundance of new social intelligence and listening platforms hitting the market every day. Someone has to be on the other side of data to interpret it beyond routine. Someone has to redefine the typical buckets where data is poured. And someone has to redefine the value of data to save important findings from a slow and eventual death by three-ring binders rich with direction and meaning. ... Even though sophisticated tools can help track data points that can lead to these insights, it still takes a human touch to surface them and in turn advocate findings within the organization. It's the difference between insights, actionable insights, and executed insights. ... Those who don’t plug in and invest in technology's human counterparts are in turn making an investment toward potential irrelevance. But remember, data is just the beginning. Data must always tell a story and that takes a human touch to extract data, surface trends, and translate them into actionable insights across the entire organization."

Clearly, the Big Data Era is inextricably connected to technology. We need to remember, however, that big data also is inextricably connected to people.

February 22, 2013

Advances in Artificial General Intelligence Continue

Last December Dario Borghino reported, "Researchers at the University of Waterloo have built what they claim is the most accurate simulation of a functioning brain to date." ["Scientists build the most accurate computer simulation of the brain yet," Gizmag, 6 December 2012] The computer system used by Waterloo researchers has what Borghino describes as a "seemingly unimpressive count of only 2.5 million neurons" (the human brain is estimated to have somewhere nearing 100 billion neurons). Larger neural network computers have been built. In an earlier article, Borghino noted that "IBM has simulated 530 billion neurons and 100 trillion synapses – matching the numbers of the human brain." ["IBM supercomputer used to simulate a typical human brain," Gizmag, 19 November 2012] That's why the Waterloo claim of being the most accurate simulation of a functioning human brain is so impressive. Borghino continues:

"Spaun (Semantic Pointer Architecture Unified Network) is able to process visual inputs, compute answers and write them down using a robotic arm, performing feats of intelligence that up to this point had only been attributed to humans."

Peter Murray is also impressed with what Spaun has accomplished. "Instead of the tour de force processing of Deep Blue or Watson’s four terabytes of facts of questionable utility," he writes, "Spaun attempts to play by the same rules as the human brain to figure things out." ["Scientists Create Artificial Brain with 2.3 million Simulated Neurons," Singularity Hub, 10 December 2012] Murray continues:

"Instead of the logical elegance of a CPU, Spaun's computations are performed by 2.3 million simulated neurons configured in networks that resemble some of the brain's own networks. It was given a series of tasks and performed pretty well, taking a significant step toward the creation of a simulated brain. ... It was given 6 different tasks that tested its ability to recognize digits, recall from memory, add numbers and complete patterns. Its cognitive network simulated the prefrontal cortex to handle working memory and the basal ganglia and thalamus to control movements. Like a human, Spaun can view an image and then give a motor response; that is, it is presented images that it sees through a camera and then gives a response by drawing with a robotic arm. And its performance was similar to that of a human brain. For example, the simplest task, image recognition, Spaun was shown various numbers and asked to draw what it sees. It got 94 percent of the numbers correct. In a working memory task, however, it didn't do as well. It was shown a series of random numbers and then asked to draw them in order. Like us with human brains, Spaun found the pattern recognition task easy, the working memory task not quite as easy."

Researchers admit that other AI computer systems can perform some of the tasks better than Spaun can -- but that's not the point. Murray explains:

"What’s important is that, in Spaun’s case, the task computations were carried out solely by the 2.3 million artificial neurons spiking in the way real neurons spike to carry information from one neuron to another. The visual image, for example, was processed hierarchically, with multiple levels of neurons successively extracting more complex information, just as the brain’s visual system does. Similarly, the motor response mimicked the brain’s strategy of combining many simple movements to produce an optimal, single movement while drawing."
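Spaun's artificial neurons are spiking models of this kind: each integrates its input until it crosses a threshold, fires, and resets. The following is not Spaun's actual code, just a minimal leaky integrate-and-fire neuron (all parameters invented for illustration) showing how a stronger input drives a higher firing rate:

```python
def lif_spike_times(current, tau=0.02, threshold=1.0, dt=0.001, t_end=0.2):
    """Simulate one leaky integrate-and-fire neuron driven by a constant
    input current; return the times (in seconds) at which it spikes."""
    v, spikes = 0.0, []
    for step in range(int(t_end / dt)):
        v += dt * (current - v) / tau  # leak toward 0, integrate input
        if v >= threshold:
            spikes.append(step * dt)   # emit a spike...
            v = 0.0                    # ...and reset the membrane voltage
    return spikes

# A stronger input current produces more spikes in the same time window.
weak, strong = lif_spike_times(1.5), lif_spike_times(3.0)
```

In a network of such units, information travels in spike timing and rate -- the sense in which Spaun's neurons "spike to carry information from one neuron to another."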

In an earlier post (Artificial Brains: The Debate Continues), I cited an article by George Dvorsky in which he writes, "It's important to distinguish between emulation and simulation. Emulation refers to a 1-to-1 model where all relevant properties of a system exist. This doesn't mean re-creating the human brain in exactly the same way as it resides inside our skulls. Rather, it implies the re-creation of all its properties in an alternative substrate, namely a computer system. Moreover, emulation is not simulation. Neuroscientists are not looking to give the appearance of human-equivalent cognition. A simulation implies that not all properties of a model are present. Again, it's a complete 1:1 emulation that they're after." It appears to me that Waterloo researchers are trying to emulate, not simply simulate, brain function. Borghino reminds us that the quest for artificial general intelligence is difficult. He writes:

"Save for a select few areas, our decades-old efforts in creating a true artificial intelligence have mostly come up short: while we're slowly moving toward more accurate speech recognition, better computerized gaming opponents and 'smart' personal assistants on our phones, we're still a very long way from developing a general-purpose artificial intelligence that displays the plasticity and problem-solving capabilities of an actual brain. The 'reverse engineering' approach of attempting to understand the biology of the human brain and then build a computer that models it isn't new; but now, thanks to the promising results of research efforts led by Prof. Chris Eliasmith, the technique could gain even more traction. Using a supercomputer, the researchers modeled the mammalian brain in close detail, capturing its properties, overall structure and connectivity down to the very fine details of each neuron – including which neurotransmitters are used, how voltages are generated in the cell, and how they communicate – into a very large and resource-intensive computer simulation. Then, they hardwired into the system the instructions to perform eight different tasks that involved different forms of high-level cognitive functions, such as abstraction. Tasks included handwriting recognition, answering questions, addition by counting, and even the kind of completion of symbolic patterns that often appears in intelligence tests."

Spaun takes a baby step down the road to AGI. Borghino reports "the model is still affected by some severe limitations. For one, it cannot learn new tasks, and all of its knowledge has to be hardwired beforehand. Also, Spaun's performance isn't exactly breathtaking: it takes the system approximately two and a half hours to produce an output that you and I could carry out in a single second." The following video gives a brief overview about how Spaun works.

Borghino noted that "the team's findings appear in a paper ... published in the journal Science. An open-access version of the paper is available here (PDF)." University of Waterloo researchers aren't the only scientists involved in the hunt for algorithms that can help computers think like humans. "Hiroyuki Akama at the Graduate School of Decision Science and Technology, Tokyo Institute of Technology, together with co-workers in Yokohama, the USA, Italy and the UK, have completed a study using fMRI datasets to train a computer to predict the semantic category of an image originally viewed by five different people." ["Training computers to understand the human brain," Medical Xpress, 8 October 2012] The article explains, "Understanding how the human brain categorizes information through signs and language is a key part of developing computers that can 'think' and 'see' in the same way as humans." The article notes that, even if the experiments don't lead to artificial general intelligence, "future application of experiments such as this could be the development of real-time brain-computer-interfaces. Such devices could allow patients with communication impairments to speak through a computer simply by thinking about what they want to say." Even if AGI is a long way off (or is never achieved), research efforts attempting to reach that goal will result in significant benefits in areas from healthcare to marketing.
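The decoding step in studies like Akama's amounts to learning which pattern of brain activity accompanies which semantic category, then assigning new patterns to the closest learned category. A toy sketch of that idea (the two-dimensional "voxel" vectors below are invented stand-ins for real fMRI data) using a nearest-centroid classifier:

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled):
    """labeled: {category: [feature vectors]} -> {category: centroid}."""
    return {cat: centroid(vs) for cat, vs in labeled.items()}

def predict(model, x):
    """Assign x to the category whose centroid is nearest
    (squared Euclidean distance)."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda cat: dist(model[cat], x))

# Invented 'voxel' patterns: animal images light up one region,
# tool images another.
model = train({
    "animal": [[0.9, 0.1], [0.8, 0.2]],
    "tool":   [[0.1, 0.9], [0.2, 0.8]],
})
```

Real decoders operate in tens of thousands of dimensions with heavy regularization, but the train-then-classify structure is the same.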

February 21, 2013

Electronic Health Records: The Best is Yet to Come

A number of articles have been published recently noting that the savings predicted to flow from the conversion to electronic medical records (EMR) -- sometimes called electronic health records (EHR) -- have yet to be realized. Back in 2005, RAND Corporation analysts predicted, "EMR implementation and networking could eventually save more than $81 billion annually—by improving health care efficiency and safety—and that HIT [health information technology]-enabled prevention and management of chronic disease could eventually double those savings while increasing health and other social benefits." ["Can Electronic Medical Record Systems Transform Health Care? Potential Health Benefits, Savings, And Costs," by Richard Hillestad, James Bigelow, Anthony Bower, Federico Girosi, Robin Meili, Richard Scoville and Roger Taylor, Health Affairs, 28 September 2005] As Reed Abelson and Julie Creswell report, "The conversion to electronic health records has failed so far to produce the hoped-for savings in health care costs and has had mixed results, at best, in improving efficiency and patient care." ["In Second Look, Few Savings From Digital Health Records," New York Times, 10 January 2013]

The "second look" referred to in the headline of Abelson's and Creswell's article was a reassessment conducted by the RAND Corporation. In an article published in Health Affairs, Dr. Arthur L. Kellermann, one of the authors of the reassessment, stated, "We've not achieved the productivity and quality benefits that are unquestionably there for the taking." The most important takeaway from that statement is not disappointment but hope. Dr. David Blumenthal, who helped oversee the federal push for the adoption of electronic records, told Abelson and Creswell that technology "is only a tool. Like any tool, it can be used well or poorly." The popular opinion seems to be that the tool is currently being used poorly. Dr. David J. Brailer, who worked in the Bush Administration, agrees with Blumenthal that "tens of billions of dollars could eventually be squeezed out of the health care system through the use of electronic records." He believes, however, that the Obama Administration committed a "colossal strategic error" when it included a huge amount of money for record conversion in the stimulus package. "The vast sum of stimulus money flowing into health information technology created a 'race to adopt' mentality," he told Abelson and Creswell, "buy the systems today to get government handouts, but figure out how to make them work tomorrow."

In the latest Health Affairs article, Kellermann and his colleague, Spencer Jones, claim there are several reasons that "electronic medical records didn’t play out as expected." ["Why electronic health records failed," by Sarah Kliff, Washington Post Wonkblog, 11 January 2013]

"To start, doctors didn't adopt electronic records as widely as expected. RAND researchers expected that about 90 percent of doctors would digitize their records, hitting levels seen in the United Kingdom and the Netherlands. But doctors have adopted at a much slower rate in the United States: Most recent research suggests about half of doctors have fully digitized their records. For hospitals, the number hovers below 30 percent. Even for the doctors that did adopt electronic records, many have run into a huge stumbling block with interoperability. The records they use at their office may not connect to a different software used at a hospital. Seventy percent of doctors cited the lack of interoperability as a frustration with electronic records in a recent Bipartisan Policy Center survey. 'As a result, the current generation of electronic health records functions less as "ATM cards," allowing a patient or provider to access needed health information anywhere at any time, than as "frequent flier cards" intended to enforce brand loyalty to a particular health care system,' the RAND authors write."

That's a huge problem. Standardization is critical if the predicted savings are to be realized. One man who is trying to rectify that situation is a radiologist named Michael Zalis. He got so tired of "logging in and out of separate electronic health records at Massachusetts General Hospital" that he "co-founded and wrote the code for QPID, which stands for Queriable Patient Inference Dossier, a natural language search tool that extracts relevant clinical information from an EHR." ["Making Electronic Health Records More Efficient," by Zina Moukheiber, Forbes, 14 February 2013]

Moukheiber writes, "Tools that help piece together a complete portrait of patients are increasingly in demand, as clinicians try to sort out and analyze data stored in a digital health record, with the goal of delivering better care." To highlight that fact, Moukheiber reports, "QPID has grown by word of mouth, and is now deployed across Boston-based Partners HealthCare, which includes Brigham and Women’s Hospital. It is used by 5,000 clinicians in 15 departments, such as gastroenterology, anesthesia, and surgery. QPID registers an average 5 million search requests a month. ... With QPID, an anesthesiologist or a nurse at Mass General can pull in less than two minutes, says Zalis, key information, such as allergies, heart problems, and medications that could be buried in a report or a lab panel." The bottom line is that with standardization, productivity and patient care both improve.
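QPID's internals are not public, but the general idea -- surfacing structured facts such as allergies and medications from free-text notes -- can be caricatured with a naive keyword scan (the vocabularies and note text below are invented; a real system would use curated clinical ontologies and far more sophisticated natural language processing):

```python
import re

# Hypothetical term lists standing in for real clinical vocabularies.
ALLERGIES = {"penicillin", "latex", "peanuts"}
MEDICATIONS = {"warfarin", "metformin", "lisinopril"}

def extract_facts(note):
    """Scan a free-text clinical note and bucket any recognized terms."""
    words = set(re.findall(r"[a-z]+", note.lower()))
    return {
        "allergies": sorted(words & ALLERGIES),
        "medications": sorted(words & MEDICATIONS),
    }

note = "Pt reports penicillin allergy. Currently on warfarin 5mg daily."
facts = extract_facts(note)
```

The point of the sketch is the payoff Zalis describes: the clinician queries one structured summary instead of reading every report and lab panel.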

Before getting too excited and predicting that QPID is going to solve the problems currently being experienced, remember that Kellermann and Jones pointed out that some very large companies actually like the current arrangement because it "enforce(s) brand loyalty to a particular health care system." That shouldn't be too shocking since the money involved is enormous. For example, Julie Creswell reports that one of the large companies involved in EMR services, Allscripts, has more than doubled its annual revenue "from $548 million in 2009 to an estimated $1.44 billion last year." ["A Digital Shift on Health Data Swells Profits in an Industry," New York Times, 19 February 2013] According to Creswell, other large companies in the sector have seen similar increases in revenue. Clearly, those companies don't want to see those revenues fall. As a result, Creswell concludes:

"As doctors and hospitals struggle to make new records systems work, the clear winners are big companies like Allscripts that lobbied for that legislation and pushed aside smaller competitors. ... Current and former industry executives say that big digital records companies like Cerner, Allscripts and Epic Systems of Verona, Wis., have reaped enormous rewards because of the legislation they pushed for. ... Executives at smaller records companies say the legislation cemented the established companies' leading positions in the field, making it difficult for others to break into the business and innovate."

If the Obama Administration is serious about making healthcare affordable, it must find a way to get the industry to agree on standards and open the door for smaller, innovative companies offering better products and services. Bill Bithoney, from Truven Health Analytics, believes that Meaningful Use (MU) incentives could be the tool to create better standards. ["Digital Health Records: Lower costs, better quality – eventually," Healthcare Blog, 22 January 2013] He writes:

"Meaningful Use I, and later MU II, should be viewed as laying the groundwork for electronic health record interventions which may ultimately result in improved health care. ... True advances in patient health outcomes, quality of care, and cost can only be achieved when electronic health records are used to share information across the entire care continuum from hospitals and nursing homes to rehab facilities to primary care physician offices. ... When most medical professionals can do things like supply discharge summary data and other pertinent medical information in electronic formats across the care continuum, we'll begin to realize health and well-being benefits of the impending healthcare digital revolution."
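The cross-continuum sharing Bithoney describes ultimately requires records to travel in an agreed, machine-readable format that any receiving system can parse. A minimal sketch of a portable discharge summary (field names invented; real interchange standards such as HL7 and FHIR are far richer):

```python
import json

def discharge_summary(patient_id, diagnosis, medications, follow_up):
    """Package key discharge facts as a plain JSON document that any
    system across the care continuum could parse."""
    return json.dumps({
        "record_type": "discharge_summary",
        "patient_id": patient_id,
        "diagnosis": diagnosis,
        "medications": medications,
        "follow_up": follow_up,
    }, sort_keys=True)

doc = discharge_summary("p-001", "pneumonia", ["amoxicillin"], "PCP in 7 days")
parsed = json.loads(doc)  # a receiving system round-trips it losslessly
```

The hard part is not the serialization but getting hospitals, nursing homes, rehab facilities, and physician offices to agree on the field names and vocabularies -- which is exactly what the Meaningful Use incentives are meant to encourage.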

I'm naturally an optimist. I believe that a decent system will eventually be worked out and anticipated improvements in productivity and patient care realized along with desperately needed healthcare savings. The reason I'm optimistic is that I believe that healthcare providers will eventually force companies to make it easier to share records so that mistakes are minimized. At that point, if a company can't play nicely with others it will be asked to leave the sandbox. Clearly, it won't be easy -- but it can be done.

February 20, 2013

Secure Information Sharing Has Never Been More Important

If you are concerned about cybersecurity, you probably read the New York Times' article about the Chinese military carrying out hacking operations. ["Chinese Army Unit Is Seen as Tied to Hacking Against U.S.," by David E. Sanger, David Barboza, and Nicole Perlroth, 19 February 2013]. Sanger, Barboza, and Perlroth report:

"Mandiant, an American computer security firm, [has tracked] for the first time individual members of the most sophisticated of the Chinese hacking groups — known to many of its victims in the United States as 'Comment Crew' or 'Shanghai Group' — to the doorstep of the military unit's headquarters. The firm was not able to place the hackers inside the 12-story building, but makes a case there is no other plausible explanation for why so many attacks come out of one comparatively small area. 'Either they are coming from inside Unit 61398,' said Kevin Mandia, the founder and chief executive of Mandiant, in an interview last week, 'or the people who run the most-controlled, most-monitored Internet networks in the world are clueless about thousands of people generating attacks from this one neighborhood.' Other security firms that have tracked 'Comment Crew' say they also believe the group is state-sponsored, and a recent classified National Intelligence Estimate issued as a consensus document for all 16 of the United States intelligence agencies, makes a strong case that many of these hacking groups are either run by army officers or are contractors working for commands like Unit 61398, according to officials with knowledge of its classified content."

If you assume that the Chinese military is only going after government and military secrets, you would be wrong. The writers report that the "Comment Crew has drained terabytes of data from companies like Coca-Cola." Of even greater concern is the fact that the Comment Crew and other such groups are increasingly focused "on companies involved in the critical infrastructure of the United States — its electrical power grid, gas lines and waterworks." They note that "one target was a company with remote access to more than 60 percent of oil and gas pipelines in North America." More on that story below.

Not surprisingly, Chinese government spokesmen deny that China is engaged in cyber espionage. The truth, of course, is that China and many other governments (including the U.S.) have active cybersecurity units. To make that point, Sanger, Barboza, and Perlroth report, "Working with Israel, the United States has used malicious software called Stuxnet to disrupt Iran’s uranium enrichment program. But government officials insist they operate under strict, if classified, rules that bar using offensive weapons for nonmilitary purposes or stealing corporate data." Plausible deniability has always been an important tool in any government's kit. The Mandiant report makes such deniability a lot harder for the Chinese government to sell. The article reads like a best-selling "whodunit" novel whose prime suspects are named UglyGorilla and DOTA. The tale takes readers down dark cyber alleyways that lead to the streets of Shanghai and to murky ties with the Chinese military.

As noted above, however, military and state secrets are not the only targets of Chinese hackers. Coca-Cola databases were attacked as its "executives were negotiating what would have been the largest foreign purchase of a Chinese company." During that time, the "Comment Crew was busy rummaging through [the company's] computers in an apparent effort to learn more about Coca-Cola’s negotiation strategy." The tale continues:

"The attack on Coca-Cola began, like hundreds before it, with a seemingly innocuous e-mail to an executive that was, in fact, a spearphishing attack. When the executive clicked on a malicious link in the e-mail, it gave the attackers a foothold inside Coca-Cola’s network. From inside, they sent confidential company files through a maze of computers back to Shanghai, on a weekly basis, unnoticed."
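One common spearphishing tell is a link whose visible text names one domain while the underlying href actually points somewhere else. A toy filter for that single mismatch (all domains invented) might look like this:

```python
from urllib.parse import urlparse

def suspicious_link(display_text, href):
    """Flag a link whose visible text looks like a domain name but
    differs from the domain the href actually resolves to."""
    shown = display_text.strip().lower()
    target = (urlparse(href).hostname or "").lower()
    # Only judge links whose visible text itself resembles a domain.
    if "." not in shown or " " in shown:
        return False
    return shown != target and not target.endswith("." + shown)

# Visible text claims one site, href goes to another: flag it.
flagged = suspicious_link("coca-cola.com", "http://evil.example.net/login")
```

Real mail defenses combine many such signals (sender reputation, attachment analysis, URL blocklists); this sketch checks just one, which is why a single well-crafted e-mail to the right executive can still get through.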

As a result of all of this activity, "Obama administration officials say they are planning to tell China’s new leaders in coming weeks that the volume and sophistication of the attacks have become so intense that they threaten the fundamental relationship between Washington and Beijing." That's not a good thing. When the world's two largest trade elephants start charging one another, the earth is going to shake. As Sanger, Barboza, and Perlroth write, "Mr. Obama faces a vexing choice: In a sprawling, vital relationship with China, is it worth a major confrontation between the world's largest and second largest economy over computer hacking?" The answer is likely to be in the affirmative given that Comment Crew has attacked organizations holding vital information about critical infrastructure. Sanger, Barboza, and Perlroth explain:

"The most troubling attack to date, security experts say, was a successful invasion of the Canadian arm of Telvent. The company, now owned by Schneider Electric, designs software that gives oil and gas pipeline companies and power grid operators remote access to valves, switches and security systems. Telvent keeps detailed blueprints on more than half of all the oil and gas pipelines in North and South America, and has access to their systems. In September [2012], Telvent Canada told customers that attackers had broken into its systems and taken project files. That access was immediately cut, so that the intruders could not take command of the systems. ... Security researchers who studied the malware used in the attack ... confirmed that the perpetrators were the Comment Crew. 'This is terrifying because — forget about the country — if someone hired me and told me they wanted to have the offensive capability to take out as many critical systems as possible, I would be going after the vendors and do things like what happened to Telvent,' [Dale]. Peterson of Digital Bond said. 'It's the holy grail.'"

The New York Times is not the only media outlet that has been writing about Chinese hackers. Bloomberg Business published an interesting video and article on the subject as well. That's really the point of this post: No organization is completely safe from serious hacking efforts. "In an ever-increasingly digital world," writes John Casaretto, "many have become immune to the news of ongoing threats that persist on the internet, breaches, privacy, attacks happen every day and once in a while one of them is significant enough to hit the news." ["One World Labs Takes on Data Leaks in the Dark Web," SiliconANGLE, 18 February 2013] He continues:

"Sadly, even organizations fall under this false sense of security, feeling their risk and their security is solid, that all that is going on with their information is known and secured. The bare truth is that the breaches and data leaks we hear about form only the tip of the iceberg of what is really going on out there. The 'Dark Web' is probably the best way to describe this, the places where search engines do not go and things you can only find if you are looking for it; it includes botnets, anonymous networking, C&C networks - in places all over the world, including the U.S."

Casaretto warns that "once your data is out there, you are at risk." He explains:

"For anyone that thinks that their four-walls are secure, think again. No amount of egress security, DLP, predictive security models if you were even that far ahead of the pack can account for everything given so many variables for data leakage. Among the many vectors are smartphones, web browsing, social engineering, the risk of leaked information may even come from your own IT staff. As it turns out that time and time again, information is accidentally exposed even in the most innocent of circumstances."

Technology can help. Casaretto reports that "One World Labs (OWL) has developed a one-of-a-kind software engine that seeks, indexes and collects information on a platform called Open Source Intelligence Gathering (OSINT)." He continues:

"With 1% of the deep web accessible to the common person, it goes where no one else can, indexing deep into the nether reaches of the net. These are the kind of places where information is incoming, largely never even seen, much less shared by Google and typically unbeknownst to the company whose information has leaked. From countless forums, file-sharing sites, listservs, IRC channels, ftp servers and more there is a constant, nefarious publishing and sharing of information that could have your name on it. ... OSINT is a fascinating engine, comprised of analytics, semantics, and Big Data elements. The platform is built on a distributed clustered framework. As you can imagine, the index itself is distributed, encrypted and built with the utmost security layers throughout. Access to the system is tight. Network design comes into play as many, many sites are not appreciative of an index that could be scraping their information. The operation engages its tasks through a complicated and non-static web of anonymous networking and thousands of running proxy configurations designed to avoid detection and maintain access to sometimes super-secretive environments. If a company were to assign a human to engage in this type of discovery, they could quickly after some training, find and discover at a rate of about 1 page per minute. The OSINT engine scrapes 75 pages per second and is tuned to detect across the semantics of the most prevalent languages, complete with variations."

Another technology that can help involves Secure Information Sharing (SIS). Enterra’s Secure Information Sharing technology can be utilized to facilitate the sharing of information across partner communities. This framework is founded on an automated rules management process that monitors source policies and the environment where they are applied (i.e., situational awareness) and an Attribute Based Access Control (ABAC) model that coordinates information/data sharing across a federated group environment. The SIS framework applies policy-driven automated rules to ensure individuals are able to quickly and securely access their data resources to effectively perform their missions. It strengthens existing layered security solutions, which alone often are not enough protection against cyber attacks, especially those orchestrated and/or aided by foreign governments like Russia and China.
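To give a flavor of the Attribute Based Access Control model mentioned above, here is a minimal sketch. This is not Enterra's implementation; the attribute names and the policy rules are hypothetical, chosen only to show the general shape of attribute-driven access decisions:

```python
# Minimal Attribute Based Access Control (ABAC) sketch.
# A policy is a list of attribute predicates; access is granted
# only if every predicate holds for the subject, the resource,
# and the environment (i.e., situational awareness).

def evaluate(policy, subject, resource, environment):
    """Grant access only if every rule in the policy is satisfied."""
    context = {"subject": subject, "resource": resource, "environment": environment}
    return all(rule(context) for rule in policy)

# Hypothetical policy: partners and employees may read shared
# documents, but only while a mission is active.
policy = [
    lambda c: c["subject"]["role"] in ("partner", "employee"),
    lambda c: c["resource"]["classification"] == "shared",
    lambda c: c["environment"]["mission_active"],
]

granted = evaluate(
    policy,
    subject={"role": "partner"},
    resource={"classification": "shared"},
    environment={"mission_active": True},
)
print(granted)  # True
```

The key design point is that the decision depends on attributes of the request and its context rather than on a fixed access-control list, which is what allows policies to adapt automatically as the environment changes.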

February 19, 2013

Going Long with Big Data

Applied mathematician and network scientist Samuel Arbesman, a senior scholar at the Ewing Marion Kauffman Foundation and a fellow at the Institute for Quantitative Social Science at Harvard University, offers a unique take on the subject of big data. He claims that we should stop admiring how much new data we can accumulate and concentrate more of our attention on data sets that go back further in time. He wants us to appreciate history. "Our species can't seem to escape big data," he writes. "We have more data inputs, storage, and computing resources than ever, so Homo sapiens naturally does what it has always done when given new tools: It goes even bigger, higher, and bolder. We did it in buildings and now we’re doing it in data." ["Stop Hyping Big Data and Start Paying Attention to ‘Long Data’," Wired, 29 January 2013] It's hard to argue with Arbesman that we humans have a tendency to super-size things. If you don't believe it, read my post entitled The Supersized Supply Chain.

Arbesman isn't arguing that we should downsize. In fact, he acknowledges, "Big data is a powerful lens — some would even argue a liberating one — for looking at our world. Despite its limitations and requirements, crunching big numbers can help us learn a lot about ourselves." His concern is that we seem to be accepting the notion that big data represents all data. It doesn't. "No matter how big that data is or what insights we glean from it," he writes, "it is still just a snapshot: a moment in time. That’s why I think we need to stop getting stuck only on big data and start thinking about long data." He explains:

"By 'long' data, I mean datasets that have massive historical sweep — taking you from the dawn of civilization to the present day. The kinds of datasets you see in Michael Kremer's 'Population growth and technological change: one million BC to 1990,' which provides an economic model tied to the world's population data for a million years; or in Tertius Chandler’s Four Thousand Years of Urban Growth, which contains an exhaustive dataset of city populations over millennia. These datasets can humble us and inspire wonder, but they also hold tremendous potential for learning about ourselves."

In that respect, Arbesman agrees with futurist Peter Schwartz, who argues that we need to master "the art of the long view." Whereas Schwartz insists that companies need to contemplate events that could take place well into the future, Arbesman argues that we can learn from things that have occurred well in the past. He writes, "Because as beautiful as a snapshot is, how much richer is a moving picture, one that allows us to see how processes and interactions unfold over time?" Arbesman believes that the past is a prelude to the future. He explains:

"We're a species that evolves over ages — not just short hype cycles — so we can't ignore datasets of long timescale. They offer us much more information than traditional datasets of big data that only span several years or even shorter time periods. Why does the time dimension matter if we’re only interested in current or future phenomena? Because many of the things that affect us today and will affect us tomorrow have changed slowly over time: sometimes over the course of a single lifetime, and sometimes over generations or even eons. Datasets of long timescales not only help us understand how the world is changing, but how we, as humans, are changing it — without this awareness, we fall victim to shifting baseline syndrome. This is the tendency to shift our 'baseline,' or what is considered 'normal' — blinding us to shifts that occur across generations (since the generation we are born into is taken to be the norm)."

Before continuing with Arbesman's arguments about why we need to spend more time analyzing long data, I need to point out the obvious. Some of the insights that businesses want to gain from big data have little to no history to draw upon (like insights that can be obtained from social media databases). Arbesman undoubtedly agrees that such insights are useful for business purposes, but he is concerned about larger social and environmental issues that could have dramatic impact on the world as a whole. He continues:

"Shifting baselines have been cited, for example, as the reason why cod vanished off the coast of the Newfoundland: overfishing fishermen failed to see the slow, multi-generational loss of cod since the population decrease was too slow to notice in isolation. 'It is blindness, stupidity, intergeneration data obliviousness,' Paul Kedrosky, writing for Edge, argued, further noting that our 'data inadequacy … provides dangerous cover for missing important longer-term changes in the world around us.' So we need to add long data to our big data toolkit. But don't assume that long data is solely for analyzing 'slow' changes. Fast changes should be seen through this lens, too — because long data provides context. Of course, big datasets provide some context too. We know for example if something is an aberration or is expected only after we understand the frequency distribution; doing that analysis well requires massive numbers of datapoints. Big data puts slices of knowledge in context. But to really understand the big picture, we need to place a phenomenon in its longer, more historical context."

Even in a business setting, long data has its place. "Want to understand how the population of cities has changed?" Arbesman asks. "Use city population ranks over history along with some long datasets." With the world becoming more urbanized each and every year, businesses know that their consumer base is going to be found in and around cities. That means that businesses need to know more about cities: how they grow, how ethnic groups tend to congregate, how trading systems between cities work, and so on. Arbesman continues:

"The general idea of long data is not really new. Fields such as geology and astronomy or evolutionary biology — where data spans millions of years — rely on long timescales to explain the world today. History itself is being given the long data treatment, with scientists attempting to use a quantitative framework to understand social processes through cliodynamics, as part of digital history. Examples range from understanding the lifespans of empires (does the U.S. as an 'empire' have a time limit that policy makers should be aware of?) to mathematical equations of how religions spread (it's not that different from how non-religious ideas spread today)."

It doesn't take much imagination to grasp that it is important for marketing purposes to understand how ideas spread. Arbesman argues that we are so focused on change that we lose sight of the fact that there may be some "constants we can rely on for longer stretches of time." He also argues that taking the long view allows us to make educated decisions about "what efforts to invest in if we care about our future." Arbesman then gets a bit more technical. He writes:

"If we're going to move beyond long data as a mindset, however — and treat it as a serious application — we need to connect ... intellectual approaches across fields. We need to connect professional and academic disciplines, ranging from data scientists and researchers to business leaders and policy makers. We also need to build better tools. Just as big data scientists require skills and tools like Hadoop, long data scientists will need special skillsets. Statistics are essential, but so are subtle, even seemingly arbitrary pieces of knowledge such as how our calendar has changed over time. Depending on the dataset, one might need to know when different countries adopted the Gregorian calendar over the older Julian calendar. England for example adopted the Gregorian calendar nearly two hundred years after other parts of Europe did."

In other words, there is still a need for liberal arts in a technologically-advanced world. In fact, the art and design crowd is making a concerted effort to get those subjects back on the priority list of educators and policymakers. "In this current moment of economic uncertainty, America is once again turning to innovation as the way to ensure a prosperous future," states a website dedicated to the subject. "Yet innovation remains tightly coupled with Science, Technology, Engineering, and Math – the STEM subjects. Art + Design are poised to transform our economy in the 21st century just as Science and Technology did in the last century. We need to add Art + Design to the equation — to transform STEM into STEAM." If historians, sociologists, and political scientists want to get in on the movement they could argue that you just need to place liberal arts in front of STEM to create LA STEM (but that may be too French for American tastes!). Arbesman hopes that we don't lose sight of history as we position ourselves for the future. He concludes:

"Long data shows us how our species has changed, revealing especially its youth and recency. Want data on the number of countries every half-century since the fall of the Roman Empire? That’s only about thirty data points. But insights from long data can also be brought to bear today — on everything from how markets change to how our current policies can affect the world over the really long term. Big data may tell us what we need to know for hype cycles today. But long data can reach into our past … and help us lay a path to our future."

Long data is the polar opposite of real-time data. Both types of data are valuable and have their place. The key to gaining insights from them is knowing when, where, and how to analyze different datasets.

February 18, 2013

Scott Brinker's Epicenters of Targeted Marketing

In two recent posts (The Future of Big Data, Part 2 and Part 3), I discussed some observations about the connections between big data and marketing made by marketing technologist Scott Brinker. In a new post, Brinker discusses "another way to visualize the relationship between big data and the other innovations happening in the marketing department." ["3 epicenters of innovation in modern marketing," Chief Marketing Technologist, 28 January 2013] His new visualization places the customer at the center of three activities: customer analytics; customer experiences; and customer communications.


Although the customer is not shown at the nexus of these three activities, he or she is clearly present. Brinker's activities surround the customer with personalization, targeting, and a broader community. It's clear from his writing that Brinker is a very smart guy. It's also clear that he believes targeted marketing is going to be one of the defining characteristics of the modern business landscape. He starts his discussion about his three epicenters with customer communication. He writes:

"The first [epicenter] is customer communications and the revolution brought about by social media. Marketing communications has, rather quickly, moved from being a one-way broadcast to a more personal, two-way interaction with customers, prospects, and influencers, all interconnected together. Organizations have had to embrace operational and cultural changes to adapt — many are still wrestling with those changes. But while technology certainly triggered this wave of innovation in customer communications, the software in this space is probably the least technically challenging for marketers to adopt. There is great innovation in this sphere, but it's not primarily technical in nature."

Since customer communications can have such a significant impact on a company, Brinker is justified in making it one of his marketing epicenters. Although he appears to be a bit dismissive about the technologies involved in customer communications, it should be remembered that he is only writing about the technologies involved in the communications themselves, not those used to collect and analyze the resulting data. Dealing with that kind of unstructured data is not a trivial challenge, which is why customer analytics is the next epicenter he discusses. He writes:

"The second [epicenter] is customer analytics, using analytics and big data to better understand and measure customer opportunities. There is unquestionably a lot of value to be unlocked here, which is why the 'data revolution' in marketing is so big right now. However, these innovations are generally much more technical in nature. While there are some easy first steps here — high-level dashboards, basic web analytics, and data visualization tools such as Tableau — the technical water quickly gets deep. I picture it like a 'continental shelf' of shallow data analytics. Once you drop off the edge into the deep data ocean, the software is considerably more complicated and isn't nearly as plug-and-play. To really master customer analytics requires harder skills, such as statistics, modeling, predictive analytics, programming, data mining, etc. Acquiring these skills and integrating them into the marketing team will likely take more time and effort than the adoption curve associated with social media."

To highlight how challenging it is to deal with unstructured data, Amazon just awarded its grand prize in innovation to a Utah-based start-up company named ContactPoint. Last year the company "launched LogMyCalls, software that records incoming phone calls from customers and generates data that companies can analyze to update their marketing, develop sales leads, chart close rates and improve customer service." ["Utah's ContactPoint wins Amazon grand prize for innovation," by Paul Beebe, Salt Lake Tribune, 30 January 2013] The final epicenter discussed by Brinker is customer experience. He writes:

"The third [epicenter] is customer experience, delivering remarkable customer experiences at every touchpoint in the customer lifecycle. In our digitally-malleable world, marketers can now wield technology to craft customer experiences that are powerfully differentiated — on a scale that was impossible to conceive not too many years ago. I believe that the scale of this 'experience revolution' will dwarf the other two."

Providing an extraordinary customer experience is important whether that experience takes place online or in a traditional store setting. A Deloitte report entitled The Next Evolution: Store 3.0 concludes that, in order to keep brick-and-mortar stores relevant and inviting for consumers, retailers must "deliver a tailored experience" for shoppers. The report states:

"It is an experience that begins before customers enter the physical store and continues long after they leave. Through the lens of the desired future customer experience, retailers should step back and ask themselves hard questions about where they are and where they need to go."

One of the reasons that experience is important is that brick-and-mortar stores have been suffering from a malady called "showrooming," which involves consumers going into stores specifically to examine products that they later purchase online. Allen Weiner, Research Vice President with Gartner, believes that content marketing is one way that traditional stores can provide shoppers with experiences and counter the showrooming trend. ["Can Content Marketing Beat Showrooming?," Gartner, 25 January 2013] He writes:

"For those businesses with adequate capital budgets, solutions such as Nearbuy Systems allow you to turn your Wi-Fi network into a proprietary content delivery mechanism which offers customers a personalized in-store shopping experience (like Barnes and Noble offers for customers who bring their Nooks to the store) via their mobile devices. Even better is the ability to track a customer’s in-store behavior such as how long he spent in a particular aisle."

Brinker believes that customer experience is so critical that the remainder of his article focuses primarily on that subject. He writes:

"There's already a growing customer experience movement in the marketing department — with the embrace of user experience and design (not just art/graphic design) professionals. But ultimately code is the clay from which most of these experiences are sculpted. It's with a mix of pre-packaged marketing software, custom developed applications, technical configuration, and plenty of 'script' glue to tie everything together that these experiences are actually built and delivered — leveraging the state-of-the-art in hardware technologies (e.g., the latest smartphones and tablets) and major web services (e.g., the latest capabilities in Facebook). Understanding how these technologies can be synthesized into compelling customer experiences — and having the technical capabilities to execute on those visions — presents a significant challenge for most marketing teams. But the opportunity is huge, strategically elevating marketing from communications to experiences. Marketing is still the champion of the brand. But the brand is now fully a direct function of customer experience."

Analysts at TIBCO Software agree with Brinker that technology lies at the heart of targeted marketing and the experiences it can provide. "Today," they write, "customer intent clues are widely available via comments on social networks, previous browsing behavior, email open rates, call center and sales department interactions and past receptiveness to offers and promotions. But this big data gold mine is spread across multiple channels that often fall outside a company's internal network landscape. And that means determining customer intent requires data analysis to mine for the insight that can bolster sales." ["4 Ways to Use Data Analysis to Pinpoint Customer Intent," Trends and Outliers, 30 January 2013] TIBCO analysts recommend four steps to help identify customer intent using data analysis. They are:

  1. "Identify where the customer journey begins. 'Is it on the search engine or email message, or is it on the landing page? Often, the path to the source prior to the website visit is crucial to understand intent,' notes the ClickZ post. 'Were they on a competitor's site? Were they reading an industry article? Were they searching for free shipping offers?'
  2. Collect all data from customer actions. Capture data from kiosks, call centers, mobile apps and web data. Hone in on product views, shopping basket additions, comments, searches, video views and help requests.
  3. Study patterns. Determine which customers are reading or writing product reviews. When do they look at shipping information? What products are being compared or which product bundles are studied? After analytics uncovers patterns, companies can then tailor offers.
  4. Use data as research in lieu of expensive surveys. 'Use your web data to assess what people actually do. Look for patterns in segments as well as feedback behavior – do reviews feature certain product characteristics? Highlight those in your marketing and up-selling,' according to ClickZ. 'Web data can also improve your segmentation strategy, because it lets you segment on how (and potentially why) customers shop, not where they shop or who they are.'"
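Steps 2 and 3 above can be sketched in code. This is a hypothetical, minimal version assuming a simple event log; the event names and the intent rule are invented for illustration, and a real implementation would pull from the channels TIBCO lists (kiosks, call centers, mobile apps, web):

```python
# Minimal sketch of steps 2-3: collect customer actions across
# channels, then study the patterns to infer purchase intent.
# Event names and the intent rule are hypothetical.
from collections import Counter

events = [
    {"customer": "c1", "channel": "web",    "action": "view_product"},
    {"customer": "c1", "channel": "web",    "action": "read_reviews"},
    {"customer": "c1", "channel": "mobile", "action": "check_shipping"},
    {"customer": "c2", "channel": "web",    "action": "view_product"},
]

def actions_by_customer(events):
    """Step 2: aggregate every action per customer, across all channels."""
    per_customer = {}
    for e in events:
        per_customer.setdefault(e["customer"], Counter())[e["action"]] += 1
    return per_customer

def likely_buyers(per_customer):
    """Step 3: a toy pattern rule -- customers who both read reviews
    and check shipping are treated as showing purchase intent."""
    return [c for c, acts in per_customer.items()
            if acts["read_reviews"] and acts["check_shipping"]]

profiles = actions_by_customer(events)
print(likely_buyers(profiles))  # ['c1']
```

Once a rule like this surfaces a pattern, step 3's payoff follows: the company can tailor offers to the customers the pattern identifies, rather than surveying everyone.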

Brinker believes the end result (i.e., the "mission") is the "big experience." He concludes:

"Big experience is nearly unlimited in the innovations that lie ahead. They're not all technical in nature, but a significant portion of them are rooted in technology. To pursue these opportunities, marketing must expand its technical capabilities beyond data analytics. This is why marketing technologists, not just data scientists, should be a part of every marketing team's growth plans moving forward. Of course, some of the most fascinating areas in marketing are at the intersections of these three domains. For instance, the intersection between customer analytics and customer experience, where personalization and big testing are sprouting. Fundamentally, modern marketing is about combining all these innovations into a cohesive strategy and operational structure. That is why, although big data is big, it shouldn't distract marketing leaders from constructing an organization that is capable of orchestrating all of these innovations together into a powerful and differentiated brand for the 21st century."

In previous posts about innovation, I have made the point that a lot of (if not most) innovation takes place at the intersections of disciplines. Along these boundaries is where you find differing perspectives and new ways of combining existing ideas and technologies to create something entirely new -- like a big experience.

February 15, 2013

Urbanization, Innovation, and Wealth

Richard Florida, Co-Founder and Editor at Large at The Atlantic Cities, recently penned an article that focuses on two of my favorite topics: innovation and cities. ["Innovation and the Wealth of Cities," The Atlantic, 1 February 2013] He begins his article by asking, "Is America losing its innovative edge?" It's a fair question because it doesn't have an easy answer. Smart people offer very different answers to it. Florida first lists a few smart people who do believe the U.S. is becoming less innovative. He writes:

"Tyler Cowen, a professor of economics at George Mason University and the author of The Great Stagnation, and Robert Gordon author of influential recent study which asks "Is US Economic Growth Over?" both argue that innovation has plateaued; that the great life-changing inventions — automobiles, airplanes, electric lights, antibiotics, refrigeration — are in the past, and that the economic effects of today's technological breakthroughs will ultimately be incremental. Venture capitalist Peter Thiel was recently quoted in The Economist declaring that innovation in America is 'somewhere between dire straits and dead.'"

Those pundits obviously believe they have the evidence to back up their claims. On the other hand, Florida notes that there is "a growing group" of people who believe just the opposite. He writes:

"Others - including the Daily's Beast's Daniel Gross, Irving Wladawsky-Berger, former vice-president of technical strategy and innovation at IBM, W. Brian Arthur of the Santa Fe Institute, and Philip Auerswald, author of The Coming Prosperity - counter that America is poised for a new era of innovation and growth. Reviewing the current evidence, The Economist recently concluded that: 'The idea that innovation and new technology have stopped driving growth is getting increasing attention. But it is not well founded.' This in line with the detailed empirical research by the influential late innovation economist, Christopher Freeman, who found that innovation slows during the highly speculative times leading up to great economic crises, only to surge forward as the crisis turns toward recovery."

In other words, both groups could be right. Innovation might have slowed or stalled during the Great Recession, but there is little reason to believe that innovation is dead or dying. In fact, Florida insists, "Detailed evidence from which to evaluate just how dead or lively America's innovation system is has been hard to come by." Florida introduces urban areas into the discussion of innovation by noting that the Brookings Institution recently released a study entitled Patenting Prosperity: Invention and Economic Performance in the United States and its Metropolitan Areas. Florida calls it "eye-opening." He writes:

"Authored by Brookings's Jonathan Rothwell and Mark Muro, Deborah Strumsky of the University of North Carolina at Charlotte, and José Lobo of Arizona State University, the report's conclusions provide more reasons for optimism than concern. The study tracks America's rate of innovation (measured by patenting) over the long sweep of U.S. history, from 1790 to 2011. Despite their recognized weaknesses, patents are the most commonly used measure for innovation, and the report takes great pains to make sure the patent data they use is accurate and reliable."

Included in Florida's article is a graph from the study (shown below) that "outlines the broad cycles of U.S. innovation. It shows the incredible uptick in the rate of patented inventions which occurred in the late 19th and early 20th century alongside the rise of a more science and technology based capitalism powered by the growth of industrial R&D. The rate of patented innovation leveled off in the early 20th century, as the graph shows, only to decline precipitously from the Great Depression to the 1990s. But over the past several decades the rate of patenting has picked up considerably. This might seem to suggest that the rate of invention is increasing as opposed to stagnating."

[Graph: U.S. patents per capita, 1790–2011]

Florida's optimism about the future lies in the fact "that patent quality (measured as citations), as well as quantity, ... has been on the uptick [since] 2010." He continues:

"That said, it's important to point out that this recent period has also seen the globalization of science and technology and of U.S. patenting. Nearly half of U.S. patents are awarded to foreign inventors. I asked Deborah Strumsky, one of the coauthors of the report and a leading international expert on patent data how the study accounted for this, and she told me that: 'Our counts are similar to U.S. GDP, it is about what happens within the U.S. border regardless of the company headquarters that produces it.' Adding that: 'These are all US filed patents. If the inventor was living outside the U.S., then they are not included.' The study and chart thus include domestic patents granted to U.S. both native- and foreign-born inventors. From where I sit, this ability of the U.S, to attract foreign companies and foreign-born inventors to its shores acts to broaden its range of innovative capabilities and bolster its technological and economic advantage overall."

Politicians are beginning to recognize this fact as well, and it is one of the driving forces behind the latest debate about immigration reform. But Florida's post isn't about immigration and innovation; it's about innovation and urbanization. He reports, "The study delves deeply into the geography of innovation across U.S. metros. Like so much else in our increasingly spiky world [PDF], the landscape of American innovation is clustered, concentrated and uneven." The greatest spikes can be found in metropolitan regions or in university towns. "The report ranks all 366 U.S. metro regions on patent levels and growth." The report concluded:

"The 100 largest metro areas were home to 65 percent of the U.S. population in 2010, but they are home to 80 percent of all U.S. inventors of granted patents since 1976 and 82 percent since 2005. Few patents are invented outside of metro areas. Just 20 metros, accounting for roughly a third of the U.S. population, generate almost two-thirds (63 percent) of patents, and just the five most patent-intensive metro areas account for about 30 percent of total U.S. innovation. The San Jose metro — home to Silicon Valley — leads both in terms of total patents and its rate of patenting per capita. San Francisco, New York, Los Angeles, Seattle, Boston, Chicago, San Diego, Minneapolis-St. Paul, and Detroit make up the top 10 in terms of total patents, while university towns like Burlington, Vermont; Rochester, Minnesota; Corvallis, Oregon; Boulder, Colorado; Ann Arbor, Michigan; Austin, Texas; Santa Cruz, California; and Poughkeepsie, New York (home to IBM) round out the top 10 in terms of patents per capita. The report substantiates the view that research universities increasingly act as anchors or hubs of the innovative, knowledge-based economy."

This latest study adds to the evidence that cities can be catalysts for growth and prosperity. Since the world is becoming more urbanized, this should be welcomed as good news. For more on this subject, read my post entitled Tapping the Economic Power of Mega-Cities. Florida reports that the study "finds a connection between patents and regional development (measured as economic output per worker)." The report concludes:

"The results clearly show that patenting is associated with higher metropolitan area productivity. The analysis cannot rule out that the link is caused by some missing variable or reverse causality, but given the control variables and the fact that patents were lagged ten years in the analysis, the most likely explanation is that patents cause growth."
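The report's design can be made concrete with a small sketch: because the patent counts are lagged ten years behind the productivity outcome, the explanatory variable predates the result it is supposed to explain, which is what makes reverse causality less plausible. The sketch below uses entirely made-up data and NumPy's least-squares solver; it is an illustration of the lagged-regression idea, not the report's actual model or data.

```python
import numpy as np

# Illustration only: all figures are synthetic, not from the report.
rng = np.random.default_rng(0)
n_metros, years, lag = 50, 20, 10

# Hypothetical per-metro patent counts over twenty years.
patents = rng.poisson(100, size=(n_metros, years)).astype(float)

# Synthetic productivity that (by construction) depends on patent
# activity ten years earlier, plus noise.
productivity = 0.5 * patents[:, :years - lag] + rng.normal(0, 5, (n_metros, years - lag))

# Pair each productivity observation with the patent count from a
# decade before, then fit ordinary least squares with an intercept.
x = patents[:, :years - lag].ravel()
y = productivity.ravel()
design = np.column_stack([np.ones_like(x), x])
(intercept, slope), *_ = np.linalg.lstsq(design, y, rcond=None)

# The estimated slope should recover the planted effect of ~0.5.
print(round(float(slope), 2))
```

Because the lagged patents were generated before the productivity values they predict, a nonzero slope here cannot be productivity "causing" patents; the report's argument for causality rests on the same timing logic, plus its control variables.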

Although Florida is not convinced that "patents cause growth" per se, he is confident that "the pattern is fairly consistent over time." He continues:

"My take remains that only a relatively small subset of metros have the robust innovation ecosystems that enable them to effectively turn local innovations into economic growth. In many more, the fruits of local innovation flow away to other places. To be truly effective, cities and metros need to do more than produce innovations; they must also have the 'absorptive capacity' to productively make use of them."

What Florida is arguing for is a real plan to make cities more productive and attractive for skilled workers. Alan Berube, a senior fellow and deputy director of the Brookings Institution's Metropolitan Policy Program, believes that one way to do that is to return cities to their historical position as centers of trade. ["The Return of the Trading City," Project Syndicate, 25 January 2013] He writes:

"Cities unite people who seek common space to exchange goods, services, and information. ... In the nineteenth century, the English economist Alfred Marshall described how cities are really 'agglomeration economies' that gather the infrastructure, workers, and information needed to promote innovation and trade. And, in 2008, Paul Krugman received the Nobel Prize for his work explaining how, amid increasing capital and labor mobility, metropolitan areas remain crucial nodes for trade. In short, cities make trade possible."

Berube believes that "the United States and other advanced economies" have failed to recognize this critical role for cities and, as a result, adopted the wrong trade policies. Good trade policies, he insists, would "support the distinct comparative advantages of cities and regions in global markets." He continues:

"Furthermore, local policymakers often forget that trade increases city residents' prosperity by bringing in new wealth, in turn contributing to job creation and bolstering demand for services in the local economy. In recent decades, too many American cities have relied on vanity projects – such as stadiums, casinos, convention centers, and shopping malls – to stimulate economic growth. But, while such projects may attract limited out-of-town revenue, they are more likely to recirculate local money. At the same time, they fail to capitalize on rising demand in global markets – for which the growth of emerging-market cities is largely responsible."

Economists routinely point out that the future of the global economy is shifting towards emerging market countries and will depend on the strength of the growing global middle class. Cities in developed countries will only benefit from a growing global economy and increased prosperity if they can find a way to tap into the emerging market money flow. Berube concludes:

"Global trade is not pleasant; it is fiercely competitive, and policymakers must address the short-term costs that it routinely imposes on people and places. But global trade also provides a route to long-term prosperity – one that runs squarely through cities. Two millennia after the opening of the Silk Road, a global network of trading cities is beginning to reemerge. Local and national trade policy should aim to advance this process."

Clearly, Florida and Berube believe that cities hold the key to America's economic future (as well as to the future prosperity of other nations). Whether policymakers come to a similar conclusion and act to strengthen the economic attraction of cities remains to be seen.