
July 31, 2013

Analytics 2.0: Big Data, Big Testing, and Big Experiences -- Part 2

In Part 1 of this two-part series, I discussed a Harvard Business Review article written by Wes Nichols, cofounder and the CEO of MarketShare, a global predictive-analytics company headquartered in Los Angeles. ["Advertising Analytics 2.0," March 2013] In that article, Nichols argues that old analytic techniques are simply too limited to meet the complexities of today's marketplace. He asserts that analytics 2.0 involves three broad activities: attribution, optimization, and allocation.

[Image source: Scott Brinker]

Also in that post, I discussed Scott Brinker's view of how big data should be used. He stated that marketing's future involves using big data and big testing to provide the consumer with a big experience. ["The big data bubble in marketing -- but a bigger future," Chief Marketing Technologist, 21 January 2013] As I stated in that post, only analytics 2.0 is capable of dealing with all three "big" activities. If you don't believe that "big experiences" are important, Lisa Arthur might convince you. She reports, "Poor customer experiences result in an estimated $83 billion loss by US enterprises each year because of defections and abandoned purchases." ["Four Ways To Improve The Buyer Experience Starting Today," Forbes, 26 June 2013] "Unfortunately," she continues, "most marketers remain unsure about how to improve relationships with today's empowered consumers. What 'exactly' should you be doing? How can you start creating a better customer experience?" In an interview, Bob Thompson, CEO of CustomerThink Corp., told Arthur that "the first step towards creating a better customer experience is to put the customer at the center of all you do." His first recommendation for doing that involves "thinking in terms of Buyer Experience Management." Arthur writes:

"As Bob explained, Buyer Experience Management (BXM) means understanding how buyers perceive their interactions with a brand, and then delivering value during those interactions so buyers (and non-buyers) become brand advocates. Thinking in terms of BXM helps tear down silos, especially the ones between sales and marketing. 'Keep in mind that one of the big problems in B2B marketing/sales is the silo mentality,' Bob told me. 'Marketing generates leads, sales closes deals. Each has their own set of goals and processes. Left out is an appreciation for what buyers are going through as they navigate from marketing to inside sales to field sales.' A derivative of Customer Experience Management (CEM), BXM focuses on the marketing/sales function. It's a way of looking at the buyers' complete journey as they perceive it. 'The core issue is that marketing is still viewed by most as pushing a message and/or generating leads. It's part of the "CRM" mentality which is really company-centric – designed to extract value, not add value,' Bob said. 'BXM is about taking a customer-centric view of the buyer’s journey, and asking how the buying experience is adding value and creating loyalty, even with prospects that don’t end up buying.'"

Clearly, in order to understand how buyers perceive their interactions with a brand, companies need to gather and analyze big data. Getting to know your customer (i.e., taking a walk in your customer's shoes) is the next recommendation Thompson related to Arthur. She continues:

"Bob estimates that less than 10% of B2B firms truly understand what experience buyers receive, even though virtually all agree that experience is important to revenue performance. Granted, B2C firms may not be quite as infected, with 'silo-itis,' but Bob suspects similar problems exist for B2C marketers, as well. ... 'Good customer/buyer research is essential to figuring out what the target market really values, and what they are experiencing on their journey. You can’t be "value adding" unless you know what customers think is valuable!' I wholeheartedly agree. It's time to stop walking the talk."

Thompson told Arthur that, in order to achieve the desired goal of understanding the customer's path to purchase, companies need to "eliminate touchpoint amnesia." Arthur explains:

"A truly satisfying omnichannel experience requires an integrated approach. According to Bob, a lack of channel integration clashes with consumer expectations –and it negatively impacts sales. 'Companies have automated channels bit by bit, so they can claim to be multi-channel. Yet customers find (about 80% of the time, in my research) that a multi-touch experience is not remembered. I call this 'touchpoint amnesia and found that it has significantly reduced customer loyalty and propensity to buy,' he told me. 'Omni-channel experiences should make it easy to customers to navigate channels as they wish, and not lose information the customer has already provided.'"

Data integration is not easy, but the kind of integration Thompson is talking about involves data that is more structured than most and should be relatively straightforward to accomplish. Customers certainly believe it should be, which is why they find it frustrating when they have to provide the same information time and again. Thompson's final recommendation for companies is to "keep learning." Persistent learning can only be achieved through the collection and analysis of big data. This subject interests me because many of my company's offerings use the Enterra Cognitive Reasoning Platform™, which ingests structured and unstructured data, understands the nature of the data, learns from known and discovered relationships, and takes actions within decision cycles to obtain desired outcomes. The Platform addresses all four of Big Data's dimensions: Volume, Velocity, Variety, and Veracity.

  • Volume: This dimension addresses the size of data. In today's world, that volume is enormous and getting larger.

  • Velocity: This dimension addresses the timeliness of information. There was a time when quarterly reports were fast enough to keep up with the clock speed of businesses. Today, for truly time-sensitive data, two minutes may be too long.

  • Variety: This dimension addresses the fact that useful data can be messy. It is no longer found solely in neat rows and columns of spreadsheets. Most new data is unstructured and must be filtered and understood to be of value.

  • Veracity: This dimension addresses the trustworthiness of data. If you can't trust the data, you can't confidently act upon the insights obtained from it. As the number of data sources grows, the challenges associated with the veracity of that data also increase.

Some analysts add two more "Vs" to that list: Visualization and Value. If information is presented in a way that is not easily understood, it is little better than having no information at all. Value is not so much a dimension as it is an outcome of the other "Vs" discussed above. Enterra's Cognitive Reasoning Platform brings together two distinct philosophical and technological computing camps (i.e., Mathematical Optimization and Reasoning). This melding of methods allows the Platform to perform rapid computations as well as discover and explore new relationships.

Enterra's Cognitive Reasoning Platform addresses marketplace needs for today's computer systems to be able to Sense, Think/Learn, and Act™ about the environment (industry/domain) in which they operate. Even at the level of so-called "Small Data," the amount of data can be overwhelming for a human to process. At the level of Big Data, it is impossible to manage without some kind of artificial intelligence help. Therefore, computers need to evolve from executing tactical instructions to thinking and making sense of the data in a way more similar to humans. And they can't take excessively long periods of time to conduct the analysis. Enterra's Platform attempts to do this by pairing an ontology and rules engine with the muscle of dedicated analytic data processors to perform rapid computations.
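To make the "Sense, Think/Learn, and Act" pattern concrete, here is a toy sketch of what pairing declarative rules with incoming data can look like. It is illustrative only -- it is not Enterra's platform, and the rules, facts, and field names are invented for the example:

```python
# Toy "sense, think, act" loop: each rule is a (condition, action) pair
# evaluated against the latest facts; matching actions are returned.
# Purely illustrative -- not Enterra's Cognitive Reasoning Platform.

RULES = [
    (lambda f: f.get("inventory", 0) < f.get("reorder_point", 0),
     "issue replenishment order"),
    (lambda f: f.get("sentiment") == "negative",
     "alert customer-care team"),
]

def sense_think_act(facts):
    """Return the actions triggered by the current set of facts."""
    return [action for condition, action in RULES if condition(facts)]

print(sense_think_act({"inventory": 40, "reorder_point": 100, "sentiment": "negative"}))
# ['issue replenishment order', 'alert customer-care team']
```

A real cognitive platform would, of course, learn and revise such rules from data rather than hard-code them; the sketch only shows the basic shape of rule-driven action.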

As Wes Nichols made clear in his article about analytics 2.0, it's easier to talk about doing it than actually being able to accomplish it. As he puts it, "The opportunity is clear, but so is the challenge." At Enterra, we are excited about helping develop some of the technologies that can rise to the challenge and help consumers have a big experience by better understanding them.

July 30, 2013

Supply Chain Trends: Are Things Going as Predicted?

Recently, Adrian Gonzalez penned an article about supply chain trends. ["What Are Your Top 10 Supply Chain Trends?" Logistics Viewpoints, 10 July 2013] The inspiration for Gonzalez' article was a presentation given by Tom Linton, Chief Procurement and Supply Chain Officer at Flextronics, at the Crossroads 2013: Supply Chain as Future Enabler conference at MIT earlier this year. The presentation was entitled A Vision of Supply Chain Evolution. During that presentation, Linton offered his top 10 list of supply chain trends. Since we have passed the midpoint of 2013, I thought it might be interesting to compare Linton's list with some of the predictions made about the supply chain at the end of last year. First, let's look at Linton's list as summarized by Gonzalez, with added commentary about some of the predictions made last year. I'm drawing on a 4-part series I wrote entitled The Future of Supply Chain Management (Part 1, Part 2, Part 3, Part 4) for those predictions.

#10: Cloud Computing: Low cost and reliable cloud solutions for global supply chains are starting to emerge; “apps” will transform supply chains.

Last year, Gonzalez predicted that "big data, social media, cloud computing, and mobile technologies will continue to dominate the headlines" and that user interfaces for supply chain apps would get a social makeover. Those predictions have certainly proven to be true. Of all of the topics mentioned by Gonzalez, big data and mobile technologies have probably garnered the most attention.

#9: Business Process Convergence: Instead of automating inefficient processes, companies will eliminate them — e.g., replace inter-company business documents such as purchase orders with sense-and-response systems; companies will more seamlessly integrate inter-company functions.

Last year, Kerry McCracken, vice president of business solutions for the Integrated Network Solutions Segment of Flextronics, predicted that companies are going to get wiser in the area of business-to-business electronic commerce. They have to, she insists, because so many "industry giants" currently find themselves "lagging in the area." She stated, "Some folks can't even figure out how to set up an order electronically." She lamented that some "companies are still struggling to embrace the possibilities offered by modern technology." McCracken believed that many young companies were in a better position than large ones because they are more agile and not overburdened with legacy infrastructure.

#8: Global Labor Costs Equalize: Labor arbitrage is in decline as labor costs in Mexico match China's, and India, Ukraine, and Indonesia become more cost competitive.

McCracken asserted that the business world should "expect a 'rearranging of the deck chairs' by companies moving production around the globe to take advantage of the lowest-cost sources." As labor costs equalize, one would expect to see less, not more, rearranging of deck chairs.

#7: Raw Material Scarcity is driving innovation in materials, with companies replacing copper with aluminum, gold with copper, and steel with resin in certain cases. Companies will also need to better manage conflict minerals and rare earth metals.

The National Intelligence Council believes that resource scarcity is going to be a major shaper of the future. Analysts at KPMG agree. In a report published early last year, KPMG analysts identified "ten sustainability megaforces that will impact each and every business over the next 20 years." Resource scarcity was among the sustainability megaforces they discussed.

#6: Skill Specialization: New supply chain skills will emerge, such as managing supply chains in the cloud, and undergraduate and MBA programs will become more specialized.

Gonzalez predicted there would be "more programs and partnerships to address the talent shortage problem." This may be one of Gonzalez' safest predictions. A number of supply chain analysts, including Lora Cecere, have lamented that not enough is being done to attract, train, and retain talented people. For more on that subject, see my post entitled More Supply Chain Talent Needed.

#5: Regional & Local Sourcing will expand and supply ecosystems will emerge as economies grow; "Made in USA" and "locally-sourced" will drive sourcing.

This is not a new trend but it is a growing one. For more on the subject, read my posts entitled Regional, Local, and Sustainable Sourcing and Local Sourcing Gaining Ground.

#4: Emergence of Control Towers as supply chains become more virtual (few or no factories). Supply chain winners will have a global footprint and be transparent, reliable, and flexible.

For more on this topic, read my post entitled Have You Heard about Supply Chain Control Towers? Control towers are all about transparency and connectivity. As a result, they play an important role in Linton's #1 trend, non-zero supply chains (see below).

#3: Predictable Unpredictability: Predictability becomes a competitive advantage; supply chains break through barriers to become faster, more cost efficient, and safer.

Supply chain analyst Bob Ferrari agrees. Last year he stated, "The world economy continues to provide an environment of high uncertainty and 2013 will undoubtedly provide more reinforcement."

#2: Corporate Social Responsibility Becomes Fundamental: It won't be an option any more, policies expand globally, emerging country laws catch up, and foreign corporations will follow global norms.

Last year, Mark Buck, a global supply chain and procurement leader with Bio-Rad Laboratories, predicted that we would "see producers taking greater responsibility for launching and complying with green initiatives." Gonzalez predicted "increased adoption of alternative fuel vehicles." I anticipate that the trend towards sustainability will not only continue but accelerate.

#1: "Non-Zero" Supply Chains Win: In other words, supply chains focused on greater collaboration between everyone in the ecosystem will win. This will result in end-to-end supply chain solutions that will create new value for customers.

Collaboration is easier said than done. For more on this topic, read my posts entitled Supply Chain Collaboration and Dynamic Collaboration: Inside and Out.

Some of the predictions made by pundits at the end of 2012 weren't covered by Linton, including:

Cybersecurity

Craig Cuffie, vice president of supply chain and chief procurement officer with Clearwire Communications, predicted that "Cyber-security will be a growing concern in the coming year." Cuffie was right in both anticipated and surprising ways. Earlier this year, there was a lot of discussion about hacking being pursued by the Chinese military. More recently, leaks by Edward Snowden concerning NSA monitoring have grabbed a lot of attention.

Digital Path to Purchase

Devin Fidler, research director of the Technology Horizons Program, predicted that "within five to 10 years, the internet will become more of a tool for 'disrupting commerce.'" Robert J. Bowman, managing editor of SupplyChainBrain explained what Fidler meant: "Businesses will address the 'last-mile' problem by routing more product directly to customers. Even the U.S. Postal Service is getting into the act, with the introduction of same-day delivery in select cities."

Stagflation

Buck predicted the coming year would be an economist's nightmare, "a stagflation kind of year." Bowman explained: "The battle between the Obama Administration and Republicans over the federal budget, tax policy and the self-created 'fiscal cliff' will have a dampening effect on the economy. Prices will flatten or decline. 'No one's going to be investing cash,' Buck said. 'You're going to see things start to crumble. Maybe things will start to price up at the end of the year.'" Things haven't gone quite as badly as Buck feared, but the term stagflation is probably still a good descriptor.

Reshoring

Cuffie also predicted that there would only be "a 'limited' amount of re-shoring of manufacturing from China back to the U.S." It's fair to say that reshoring has been limited.

Robotics

Fidler predicted that "robotics will play a huge role in transforming physical distribution, especially in the warehouse. ... [He] likened this moment in history to the transition from the Arpanet to the internet." Frankly, I'm surprised that this didn't make Linton's top ten list.

Additive Manufacturing

Jim Miller, vice president of worldwide operations with Google, predicted that increased use of additive manufacturing (or 3D printing) would change the business landscape. Bowman expanded: "Miller cited the science of 3D printing, in which digital technology makes possible the layering of materials to create solid, three-dimensional objects. Applications include circuit boards, apparel, medical equipment, automotive, engineering and construction. The implications are huge for manufacturers, who could turn out precisely tailored products for customers in extremely small production runs – even one or two items. The technology's full impact might not be felt in the next three to five years, said Miller, 'but there’s more evidence that 3D printers are going to [be] a pretty disruptive force.'"

All in all there appear to be few surprises when it comes to supply chain trends. Most of the trends that have emerged over the past several years are gaining momentum. Companies that haven't adjusted to these trends (believing that they are simply passing fads) should reconsider their positions.

July 29, 2013

Reaching Your Target Audience

"Wanting to reach your target audience is one of the things that can help business owners grow their business," writes Kim Beasley. "Keep in mind that just marketing anywhere and everywhere won't always bring your target customers to you." ["How to Determine Your Market and Reach Your Target Customers," Technorati, 26 February 2013] Sounds like a good plan, right? Kevin W. McCarthy claims that too many companies believe that the terms "target market" and "target audience" are the same thing. As a result, he believes, too many marketing messages fall short of their potential. ["Do You Know Your Target Audience?" 14 February 2013] He provides a brief description of the problem in the following video.

McCarthy may have a point and Beasley's article provides a good example. Even though Beasley talks about reaching a "target audience," she goes on to discuss how to determine and connect with a "target market." Clearly, Beasley understands the difference because she discusses the importance of establishing "a marketing plan to reach those target customers in your target market." Nevertheless, using the terms interchangeably can cause confusion. McCarthy is correct that companies need to understand the difference. He asserts, "Defining your target audience's needs, wants, hopes, and aspirations helps offer assurance that you understand them and how to help solve their problem. Their comfort that you can identify their specific problem draws them to a conclusion that you are more appropriate and capable of caring for them." Beasley recommends asking four questions to begin the process of developing a targeted marketing strategy. They are:

  • Who will benefit from my products or services?
  • Where can I find these target customers?
  • What geographic locations do I want to focus on?
  • What industry or occupation would benefit from my business?

The last question is the first one that needs to be asked. The other questions begin to peel the onion after that question has been satisfactorily answered. Adrianne Glowski offers some additional tips about how to peel the onion to find your target audience. Her tips all rely on your doing your homework. "Broadly think about who might be interested and who may benefit from having access to what you offer," she writes. "Figuring out your selling point is the first step in identifying your ideal target audience. Next, think about what information you need to know and why. What do you need to know about your potential customers in order to reach them?" That's just the beginning of the research you need to do. She explains:

"Start with secondary research. There are a lot of existing sources that can help you pull together information about your industry, the market, your competition, and the broad potential customer you have already identified. The best part is that someone has already done the work and, in many cases, the information won’t cost you anything. The downside is that the information may not be focused in a manner that is 100% useful for your purposes. Nevertheless, it's always a good idea to do some searching. You never know—the research you need may indeed exist."

Glowski writes that the next step is "to create a customer profile. This is more than a brief statement; it's an in-depth description of who your typical customer may be and includes demographic and psychographic information." That information could include things like:

  • "Demographic information: This may include age, gender, location, ethnic background, marital status, income, and more.

  • "Psychographic information: This type of information goes beyond the 'external' and identifies more about a customer's psychology, interests, hobbies, values, attitudes behaviors, lifestyle, and more."

Glowski insists that "both types of information are essential for developing your customer profile." She explains:

"Demographic information will help you identify the type of person who will potentially buy your products and services. Psychographic information goes one step further and nails down why that potential customer may buy."

The next step, she writes, involves virtual location. "Find out which websites they visit and which social networks they most frequently check. Are they glued to their email? Are they addicted to apps? The information you put together for your customer profile, combined with knowing where your audience hangs out online or how they use technology, will facilitate the delivery of your message." Her final recommended step involves keeping engaged by monitoring activity and adjusting to emerging trends. Mickael Bentz, Product Marketing Manager at Neolane, believes that much of the data needed to fill out the customer profiles recommended by Glowski can be obtained from Facebook. "In addition to serving as a channel to develop brand awareness and improve engagement with customers," he writes, "Facebook is a personal information goldmine." ["Facebook Likes: Marketers' Secret Weapon for High-Quality Qualification," The Cross-Channel Conversation, 27 June 2013] He continues:

"This information can be captured through what we call the 'social opt-in.' The social opt-in occurs when Facebook users accept applications requesting personal information on facebook.com or use Facebook to log in to third-party websites. A lot of information can be requested, including email, pages liked and declared interests. In two Neolane studies of websites with Facebook Login and Facebook applications, we realized that the 'like' capture is not very popular among marketers for the moment. About 25% of Facebook applications analyzed require users' likes, while only 17% of websites using Facebook Login collect the likes."

Companies are obviously eager to collect and analyze any data that helps them better reach their targeted audience, but, as explained below, Facebook may be losing its cachet. Jeff Fraser asserts, "Consumers feel like they're being drowned with irrelevant messages, and they're willing to share their data if it means getting fewer, better-targeted ads." ["INFOGRAPHIC: 'You Can Have My Data, But Stop With the Emails', Consumers Say," Marketing, 26 June 2013] Fraser reports that the real message is fewer ads. He explains:

"According to a report by loyalty management agency Aimia, 46% of 6,000 consumers surveyed in Canada, the U.S. and U.K. think they receive too many emails from marketers. Most respondents said more than 20 emails from marketers in one week is too much, and they'd prefer to receive email from a company at most once a week."

What I found most intriguing about Fraser's article was the following:

"In spite of feeling they receive too much advertising (or maybe because they feel that way) consumers are intrigued by the possibility of better-targeted marketing through data collection, the survey says. More than half of global respondents said they would be happy or very happy with receiving product recommendations based on lifestyle data, and 70% said they would like to receive offers on products they buy regularly. Asked what information they would be comfortable sharing, fewer than 15% of respondents were unhappy about marketers knowing their name, email, hobbies, occupation, or address. Feelings were more mixed about income and purchasing history, and most were uncomfortable about their browsing history or mobile number being collected by marketers."

Fraser goes on to report, "Evidence from the survey suggests sentiment is already turning against big data providers who are perceived as disingenuous about data privacy." He mentioned two such companies: Facebook and Google. Martin Hayward, Aimia's vice-president of global digital strategy and lead author on the report, told Fraser that "although consumers may feel comfortable sharing data right now, that may change if companies don't do a better job communicating with consumers about how they use it." Fraser continues:

"If consumers don't feel a sense of control over their own data, he said, they may refuse to share it altogether, either by employing third-party data privacy tools or pushing government to regulate data collection. 'We're on the cusp of having all this wonderful information,” [Hayward] said. 'The challenge for all of us is to use it responsibly, and not kill the goose that lays the golden eggs.'"

Fraser concludes his article with an infographic that shows some of the other results from the survey. They are very enlightening. The infographic shows, for example, how many consumers regret "liking" a brand or signing up online. I agree with Hayward that companies need to be much more careful about how they handle consumer data. If the "goose" dies, both companies and consumers will be the losers.

July 26, 2013

Institutional Innovation

Each day, dozens of articles are written about the importance of innovation in the business world. Most companies believe that their future survival depends on their ability to be innovative. For example, Danny Baer and Luc Charbonneau write, "Innovation is the primary force that can catapult a company to market leadership and keep it ahead of its rivals." ["Ernst & Young Insights: The innovation engine: Your company’s success may depend on it," Financial Post, 1 February 2013] Vinnie Mirchandani, however, believes that talking about "innovative companies" is a silly idea. ["Companies do not innovate. People do." Deal Architect, 30 January 2013] He writes:

Corporate Innovation"Apple is Apple because Steve Jobs brought together an amazing team with Tim Cook, an operational genius, Jonathan Ive, a design genius, Ron Johnson, a retail genius, Philip Schiller, a marketing genius and many more. Apple is Apple because it leveraged the innovations created by people at Corning, Foxconn and countless other suppliers. Innovation happens at a cellular level [rather] than a 'company' [level]."

Although it may seem that Baer and Charbonneau's position is incompatible with the one expressed by Mirchandani, I don't believe it is. During the last U.S. Presidential race, Mitt Romney became infamous for declaring, "Corporations are people." He was ridiculed for the remark, but his point was that people are at the heart of corporate activities, including innovation. If, in fact, you equate a company with its people, then both positions (i.e., that companies and people can be innovative) can be reconciled. That really seems to be what Baer and Charbonneau are saying. They explain:

"As a business grows, you need to keep the spirit of creativity alive — it's too easy to snuff out the creative spark with a stifling layer of process and bureaucracy. Successful companies focus on more than just growth, profit and the bottom line. They build in new capabilities, functions or even departments that centre on creative, disruptive and sustainable ventures."

Obviously, the spirit of creativity and the creative spark can only be kept alive in people. The fact that they discuss new capabilities, functions, or even departments also aligns well with Mirchandani's point that innovation happens at the cellular level. If you buy into the concept that, when talking about institutionalizing innovation, you are really talking about how to create the environment, culture, and processes that will help make people more innovative, then you should have no trouble accepting the notion that innovation can involve a structured, repeatable process. Robert Brands, founder and president of Brands & Co. LLC, believes "a structured process must be put into place." ["Can Innovation Be a Structured, Repeatable Process?" IndustryWeek, 23 July 2012] He writes:

"Although it sounds counterintuitive to say 'structure' and 'ideation' in the same sentence, organizations need to conduct at least two ideation sessions each year in order to foster continued growth. A good innovation leader has the foresight to schedule regular ideation sessions year after year, and not just when sales are dwindling. Ideation, or idea management, is part of a long-term innovation effort that, if facilitated intelligently, leads to successful new products or services. Even if a small percentage of concepts make it through the process, the payoff could be significant for the company."

Brands also provides some recommendations about how a company can best structure those ideation sessions. He writes:

"Here are some tips for hosting ideation sessions that will lead to the best possible outcomes.

  • Break up teams into people who know each other but are not 'that friendly' with each other in order to minimize groupthink.
  • Vary the format as well as locations and times of ideation sessions. Predictability can kill ideation. Mix it up to get people out of their comfort zones.
  • Accept ALL ideas and get them written down on the board. You never know when a concept can be recycled for future use.
  • Build a database of ideas from which new combinations and solutions can be derived.

"By holding regular ideation sessions, your organization is adopting a proactive strategy in the new product development process."

I have two concerns with Brands' approach. First, it may lead employees to believe that new ideas are only wanted a couple of times a year and only in formal ideation sessions. Leaders need to ensure that their subordinates understand that there is no bad time to bring up a good idea. Second, his emphasis is on ideation (i.e., coming up with ideas), which isn't where most companies fail. Tim Kastelle notes that, of the hundreds of organizations he has had his students assess, only about 5 percent have had any difficulty coming up with new and good ideas. So coming up with ideas is rarely the problem. Kastelle agrees with Brands, however, that "it’s much better to think of innovation as a process than to think of it as an event." ["How to Manage Innovation as a Process," Innovation for Growth, 6 November 2012] Like Brands, he thinks of innovation as idea management. He explains:

"In order to innovate effectively, you not only have to generate great ideas, but you have to select the ones that you want to invest in, then execute them, figure out how to keep people inside the organisation committed as you go through the process, then get the new ideas to spread out in the world. And if one part of that process goes wrong, then your innovation efforts will likely fail. That's kind of scary."

Kastelle indicates that he uses an "innovation value chain" to help companies assess where they are in relation to innovation. He also uses an innovation matrix, as depicted in the graphic below. ["Tools Don’t Solve Problems, People Do," Innovation for Growth, 16 February 2011] From the title of Kastelle's post, it's easy to see that he agrees with Mirchandani that innovation is all about people. In fact, Kastelle calls it "the last people-centric process" in the corporate world.

[Image: Innovation matrix]

Kastelle asserts that the innovation matrix "is useful – because tools and skills are two separate things. You can increase one without affecting the other." He offers three lessons learned from using the matrix.

  • "Innovation tools and innovation skills are two separate things: people often think that they can solve their innovation problems simply by finding the right tool. This is rarely true. In general, to improve innovation you have to improve skills and capabilities. Tools can be used to facilitate this process, but they can’t do it on their own.

  • "One of the biggest obstacles to innovation is lack of time: if innovation is important, people need the time and space to work on developing, testing, and spreading new ideas. If you are a manager and you want your people to be more innovative, you have to give them the time needed to do this.

  • "Tools don't solve problems, people do: this is why innovation is still people-centric. It's more important to remove obstacles to innovation than it is to give people tools.

Oana-Maria Pop, an Associate Editor at InnovationManagement.se, also believes that innovation must have a systematic approach. ["Systematic Innovation and the Journey Towards a Unified Innovation Management Standard," 19 November 2012] She indicates that there are four "key insights arguing in favour of a systematic approach to innovation." They are:

1. The Concept of Innovation is Becoming Broader and More Complex

... Innovation is undergoing a major shift. It can now start in emerging markets and not only in mature ones; it has become open and collaborative, allowing more and more areas of the organization to be involved. Paradoxically, broader involvement is not entirely good news. More participants in the innovation process mean more complex decision-making – a genuine burden when there is little guidance available.

2. Innovation Strategy and Culture Matter Enormously

In order to thrive, organizations need to have the relevant cultural component in place and this component involves securing the right attitude towards innovation. ... Companies need to ensure the correct and continuous integration of innovation in the overall company strategy.

3. Innovation does not just 'happen'

... With customer demand and competitive pressures increasing and operation excellence making entities leaner and leaner, the odds of innovation happening by chance have decreased. Evidence points towards a more proactive approach to developing new products, services and business models.

4. Confusion and its lasting reign

The final insight to consider is the puzzlement among entities – especially regarding what innovation is, what it can do and why it should be formally managed. Even companies at the forefront of new product or service development struggle to maintain a robust model that sustains innovation. In addition, organizations agree that they need to innovate more and also acknowledge the existence of a knowledge gap in terms of a systematic approach to their innovation processes.

Even if all the pundits agree that innovation is a people-centric process and that it is required if businesses are to thrive, there are no silver bullet solutions about how to institutionalize innovation. Baer and Charbonneau correctly assert that the innovation process must be tailored to the company (and that may mean you need to tailor it to the people in the company).

July 25, 2013

The Future of Urban Transportation: Moving Goods

In an earlier post (The Future of Urban Transportation: Moving People), I discussed some of the solutions that researchers are considering to help overcome traffic congestion challenges in urban areas. Moving people around in urban environments, however, is only half of the challenge. "In the grand scheme of urban mobility," writes Eric Jaffe, "it's easy to lose track of commercial freight movement." ["The Forgotten Urban Transportation Problem We Should Be Trying to Fix," The Atlantic Cities, 22 May 2013] No matter where people live they require food, clothing, and other consumer goods; however, not many people think about how those goods are delivered into the city. The more congested the city, the more difficult the challenge. Jaffe continues:

[Image source: Delivering Tomorrow]
"Commuters are the primary source of traffic coming into and out of the city, and parking causes much of the street-to-street congestion within it. Fact is, says transport scholar Genevieve Giuliano of the University of Southern California, it's so easy to forget about freight that metropolitan areas have done so for years — at their own peril. 'Any of us who live in cities and metropolitan areas are very dependent on urban freight, because that's how all of the goods and services we purchase get here,' says Giuliano. 'It's fascinating to me that it's never been a part of city planning.' The consequence of this historical oversight is that handling cargo has become the 'newest urban transportation problem,' according to Giuliano. While cities have been places of trade and exchange for as long as they've existed, planners have only recently begun to give freight its due consideration. Even the new wave of smart growth strategies — with its emphasis on reduced road capacity as well as mixed-use development — has created some unintended complications for commercial movement. 'The more that you follow these types of strategies without thinking about how freight actually gets delivered, the more problems you're going to generate,' Giuliano says."

In the previous post on urban transportation mentioned above, I noted that one of the problems with getting people to use public transportation is the so-called "first and last mile" challenge, which has yet to be solved. The "first and last mile" challenge for freight can be even more problematic. In fact, Jaffe says that challenge is the first of three significant categories of problems that plague cities. Giuliano calls it the "metro core" problem. Jaffe explains, "Essentially, [the problem involves] the congestion and double-parking that occurs in city centers when trucks aren't well-managed during the first and last mile of delivery." The second category of challenges identified by Giuliano involves "the environmental impact of moving freight through the metro area." She labels the final challenge, "the hub dilemma — the additional layer of commercial traffic that accrues at international nodes like Los Angeles (for port shipping) or Chicago (for rail freight)."

Jaffe reports that a survey conducted by Giuliano and some colleagues concludes that cities outside of the United States handle urban freight management better than American cities. The abstract for the survey states:

"The authors use three categories to describe urban freight strategies: last mile/first mile deliveries and pickups, environmental mitigation, and trade node strategies. The authors find that there are many possibilities for better managing urban freight and its impacts including labeling and certification programs, incentive-based voluntary emissions reductions programs, local land use and parking policies, and more stringent national fuel efficiency and emissions standards for heavy duty trucks. More research is needed on intra-metropolitan freight movements and on the effectiveness of existing policies and strategies."

As you can sense from that abstract, a great deal of emphasis appears to be on the environmental impacts of freight management in urban areas. Jaffe reports:

"London ... recently established a low-emissions zone in the metro area. The zone targeted the worst environmental offenders, including heavy diesel trucks, and the early results are at least a little encouraging. One new study found a measurable change in fleet quality as well as a small improvement in air quality."

The video found below (which was created by Oliver O'Brien, a researcher at University College London) demonstrates why controlling emissions in London is critical. London is a magnet for workers who commute in and out of the city each day. The video "is an animation of Oyster Card (commuter smartcard) taps in and out of London's tube and rail stations. Taps are recorded in 10-minute intervals, and red represents flow into the system, while green indicates exiting a station." ["Get Lost in These 19 Fascinating Maps," by Lauren Drell, Mashable, 24 April 2013] Road traffic (both private and commercial) only adds to the congestion.
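For readers curious how such an animation is assembled, the underlying aggregation step is straightforward: tap events are grouped into 10-minute bins per station and direction, which is essentially what O'Brien's visualization plots. The event format below is an assumption for illustration, not the actual Oyster data schema:

```python
# Sketch: bin tap-in/tap-out events into 10-minute intervals per station.
from collections import Counter
from datetime import datetime

def bin_taps(events, minutes=10):
    """Count entries and exits per station per time bin."""
    counts = Counter()
    for ts, station, direction in events:   # direction is "in" or "out"
        t = datetime.fromisoformat(ts)
        bucket = t.replace(minute=(t.minute // minutes) * minutes,
                           second=0, microsecond=0)
        counts[(bucket, station, direction)] += 1
    return counts

events = [("2013-07-24T08:03:00", "Victoria", "in"),
          ("2013-07-24T08:07:00", "Victoria", "in"),
          ("2013-07-24T08:09:00", "Bank", "out")]
print(bin_taps(events))
```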

Jaffe reports that Paris "is way ahead of the curve when it comes to experimenting with potential solutions to freight congestion." He admits, however, that Paris' scheme requires additional handling of goods and increases costs. He explains:

"The city's most ambitious program may be its model of consolidating shipments outside the metro area then shipping them into the city center for redistribution. The plan isn't perfect — for one thing, handling goods an extra time increases costs — but it does address the classic urban freight problem of partly full trucks taking up space on city roads."

Frankly, I'm a bit surprised that solutions to the "first and last mile" challenge haven't progressed any further than they have. A couple of years ago I published two posts entitled "Surmounting the Last Mile Delivery Challenge in Urban Areas." In the first of those posts (Part 1: Pipe Dreams), I discussed some ideas that use pipes (either new ones or existing underground systems) to move packages at rapid speeds from centralized warehouses situated outside of cities into delivery centers within the city. The beauty of these kinds of systems is that they don't congest city streets. The drawbacks to such systems include increased costs as well as the limited size and quantity of things that can be transported in this manner. In the second post (Part 2: Small and Clean Vehicles), I discussed some of the new zero-emission vehicle designs that are generally smaller than the trucks and lorries used to make deliveries today. Many of those vehicles are already in use around the globe.

Because cities are at the bottom of the legal pecking order (i.e., federal and state laws take precedence), Jaffe reports that "Giuliano believes the most promising approach to freight problems in U.S. cities will be pacts negotiated directly with companies and operators." I expect to see a lot more public/private cooperation in the years ahead. Companies that opt out of collaborating with cities may, in a very real sense, find themselves on the outside. This will be especially true if those public/private partnerships involve the construction and operation of transways (e.g., roads, rails, canals, tunnels, pipes, etc.). Giuliano told Jaffe, "As states we can't impose regulations because of protection, so the next best thing is to have these negotiations to see what we can accomplish by providing incentives. The models we see in Europe, they're always initiated by government, but essentially they're partnerships: 'We have a problem, let's figure out how we're going to solve it.'"

I would expect automobile/truck manufacturers, trucking/delivery firms, and railroads to play a major role in helping figure out solutions to the three major categories of challenges Giuliano noted above; but trucking/delivery firms will probably play the largest role. Typically about 80 percent of freight with a local destination is carried by truck.

To be of most use, these solutions will have to be integrated and that means that Big Data will play an essential role in helping make the delivery of goods in urban areas as efficient and effective as possible. There are currently experiments ongoing in Europe to demonstrate how "automatic data capturing and information sharing will make it possible to harmonize the urban transport to achieve environmental and economic benefits." ["Project Demonstrations," STRAIGHTSOL, 3 January 2013] In the post about moving people around in urban areas, researchers concluded that there is no silver bullet solution to the challenge. The same is certainly true when it comes to the movement of goods. A combination of strategies will have to be employed if progress is to be made. City planners will continue to ignore the movement of urban freight at their own peril. Eiichi Taniguchi writes, "The need is urgent for more efficient and effective freight transport systems that not only address costs but also fully tackle environmental issues such as noise, air pollution, vibration and visual intrusion. ... It’s time to create real visions for City Logistics." ["The Future of City Logistics," 29 October 2012] He agrees with Jaffe that "logistics providers have an important role to play in all of this." He concludes, "In the end we need to see a change in attitude among all stakeholders if we are to facilitate City Logistics. They need to recognize the importance of working together in the initial planning stages. If they do, everyone benefits."

July 24, 2013

The Future of Urban Transportation: Moving People

In a post entitled Smart Cities and Traffic, I discussed how big data analytics are helping to reduce traffic challenges in urban areas. I also indicated that I would discuss future transportation concepts that could help reduce or eliminate traditional automobile traffic in crowded urban environments. I'm going to split that discussion into two parts. In this post, I'll discuss some concepts that have been proposed concerning the movement of people. In a second post, I will discuss concepts relating to the movement of goods.

Tiffany Dovey Fishman, a manager with Deloitte Research at Deloitte Services LP, wrote a report entitled Digital-Age Transportation: The Future of Urban Mobility in which she asserts there are no silver-bullet solutions to urban transportation gridlock. All, however, is not lost. She claims that "next generation urban transport systems will connect transportation modes, services, and technologies together in innovative new ways that pragmatically address a seemingly intractable problem." She depicted a number of those ways in the following graphic.

[Image: Future of urban transportation]

You will notice that automobiles still play a prominent role in Fishman's integrated concept. In the post referred to above, I stated that the automobile industry can feel the earth moving beneath its feet and smart automobile makers are going to evolve their business model so that they can successfully traverse this new landscape. Audi is one of these automakers. In collaboration with researchers from Columbia University, Audi worked on the "Extreme Cities Project 2050: seeing cities as an opportunity for mobility." ["Audi forecasts a kinder, gentler, more collaborative urban future," by Danny King, Green Car News, 8 May 2013] For this project, five hypotheses were explored. One of those hypotheses was called Asymmetric Mobility. An Audi press release discussing the project described the hypothesis this way (remember they are talking about the year 2050):

"'Getting from A to B' used to mean taking a clear decision. Does it make more sense to go by train or by car to an evening event – or is it better to call it off and spend the evening at home, because getting into the city simply takes too much time. Today it can already be observed that asymmetric patterns of mobility are continually on the increase, which means it is no longer necessary to take decisions. People use various means of transport to get around day by day and also to carry out the tasks of their daily lives. While sitting in a train they can attend to their emails by smartphone or take part in a video conference linked to the other side of the world using a headset and camera. The asymmetric mobility hypothesis underlines the fact that mobility will be much more flexible in the year 2050. Changing between different modes of transport could be made much simpler and more efficient, and be more of an experience, in the future."

Several factors must come together to make Asymmetric Mobility a reality. The first -- mobile technologies -- is already in the works. I suspect by 2050 the penetration of mobile technologies will be nearly 100 percent. Second, a variety of transportation options need to be available. Otherwise the options will be too limited for the hypothesis to work. Third, analytic insights powered by artificial intelligence systems and connected to the cloud must be available. Finally, those transportation options need to be affordable. If you have any doubts about why big data analytics is required to make this work, watch the following video that depicts 12 hours of commuting on public transportation throughout New York City's five boroughs. The New York City MTA transports approximately seven million people every day.

General Electric shares a vision of the future similar to the one hypothesized by Audi. "Imagine urban transportation as a seamless whole, an integrated system of shared transportation options powered by renewable energy. Such a system would replace disconnected modes of transport, such as cars, bikes, buses and trains, with an integrated system that incorporates all of them." ["The Future of Urban Transport: Multi-Modal System," by GE Look ahead, The Economist, 26 April 2013] The article reports that Deutsche Bahn, Germany's largest train operator, "is leading a demonstration project in Berlin known as BeMobility. The project involves more than 30 public and private stakeholders who together are demonstrating how electric vehicle sharing can be integrated into the public transport system."

General Motors would like to see an electric vehicle sharing scheme succeed, because it thinks it has the perfect vehicle for it -- the Electric Networked Vehicle (EN-V). ["Wired Wheels: Taking a Spin in the Future of Urban Transportation," by Larry Greenemeier, Scientific American, 10 January 2011] You can watch a video about the vehicle by clicking on this link. Nissan's candidate in this field was revealed in 2009 and is called the Land Glider. ["Nissan Land Glider Concept," by Tony Swan, Car and Driver, October 2009] SmartCar has also entered the field with a new "Smart Electric Drive" model. ["Have You Seen the All-New Electric-Drive smart?" AskPatty.com, 20 June 2013] Another interesting candidate is the AIRPod, a car being developed in France that uses compressed air instead of electricity. ["The AIRPod: 'The future of urban transportation'," @TBD, 16 February 2012] The article is accompanied by an interesting video. Even though shared electric or compressed air vehicles would cut down on pollution (and presumably the number of vehicles travelling on city streets), they are still basically personal automobiles. Most urban planners would like to see more public transportation built and other forms of transportation used.

Paul Schilperoord notes that the vision of multi-modal transportation has been around for decades, but not enough transportation options have been available to make it convenient, nor has there been a good way of integrating the various modes of transportation; those shortcomings, however, aren't due to a lack of ideas. ["Future Tech: Urban Transport," Dark Roasted Blend] He writes:

"Governments have tried to get people out of their cars and onto public transport such as trains, trams, buses and metros, as well as using non-polluting bicycles. Unfortunately, this has mostly proven to be unsuccessful. At least partly to blame here are poor connections between different modes of transport, longer traveling times and delays, as well as a lack of comfort and privacy. Designers and engineers are now exploring new public transport concepts, which are more capable of competing with the car by offering quicker, and in some cases individualized, transportation, as well as introducing a new range of personal city vehicles."

Schilperoord's article is packed with pictures of some of the concepts designers have come up with, including flying trains, maglev personal monorail pods, electric vehicles that remind you of a ride at Disneyland, and new kinds of buses. Ian J. Faulks, Julia Irwin, Richard Howitt, and Robyn Dowling believe that too much attention is being given to the middle part of public transportation and not enough to the beginnings and ends. They note that one shortcoming of most public transportation schemes is that "people are reluctant to walk the 'first and last mile'." ["Electric unicycles, minifarthings and the future of urban transport," Climate Spectator, 8 May 2013] They're correct. If you can't conveniently get people to and from public transportation, they won't use it. That's why the authors conclude, "Personal mobility devices, or PMDs, are worth a fresh look as a solution to urban travel." They describe a number of one-, two-, and three-wheel PMDs in their article. Fortunately for inner-city dwellers, convenience isn't as great a challenge because most embark/debark locations for public transportation are within easy walking distance. The farther a resident lives from the inner city, the more acute the problem becomes.

I agree with Fishman that it will be a combination of ideas that helps solve the urban transportation challenge in the future. It will be interesting to see which concepts will prove to be winners. The one thing that almost all urban planners agree upon is the fact that the car culture, as currently understood, is not a sustainable alternative for urban transportation in the decades ahead.

July 23, 2013

Big Data and Ethical Corporate Behavior

"We can now gather, correlate, and analyze information in ways that were unthinkable in the past," writes Timo Elliott. "However," he continues, "with great power comes great responsibility. Analytics is a very powerful weapon, and weapons can be abused." ["The Ethics of Big Data: Vendors Should Take A Stand," Business Analytics, 12 June 2013] Ethically responsible data collection and analysis skyrocketed to the top of current news topics after former Booz Allen employee Edward Snowden leaked classified information revealing that the U.S. National Security Agency has been accessing telephone records and data from technology companies. Reactions to these revelations reaffirm the fact that privacy remains a very sensitive issue.

John Gapper reminds us that people are sensitive about privacy regardless of who is collecting and analyzing their data -- governments or companies. He writes, "Companies that hold rapidly expanding amounts of personal information are using new kinds of data analysis and artificial intelligence to shape products and services, and to predict what customers will want." ["Big data has to show that it’s not like Big Brother," Financial Times, 12 June 2013] Elliott sees nothing sinister in this. "I have been working in analytics for over twenty years," he writes, "and have witnessed first hand how these technologies have made the world a better place. I've seen thousands of examples, from every type of corporate efficiency imaginable, to improving customer satisfaction at theme parks and making better use of limited blood supplies. Yet we've only seen the tip of the iceberg when it comes to 'big data analytics'." But the potential for abuse, he explains, is very real.

"The past clearly shows that without proper controls, there can be irresistible temptations for companies and governments to combine data in ways that threaten personal liberties. Misuse of every previous data gathering technology has eventually come to light, sometimes only decades after the facts, leading to new laws re-establishing privacy limits. Modern technology makes the potential threat much greater than in the past. Combining 'metadata' from online activities, mobile devices, payment systems, surveillance cameras, medical histories, and social networks can reveal every nuance of our online and offline lives."

Frankly, the data collection genie is out of the bottle and its release can't be reversed. Patrick Tucker writes, "Modern data science is finding that nearly any type of data can be used, much like a fingerprint, to identify the person who created it: your choice of movies on Netflix, the location signals emitted by your cell phone, even your pattern of walking as recorded by a surveillance camera. In effect, the more data there is, the less any of it can be said to be private, since the richness of that data makes pinpointing people 'algorithmically possible,' says Princeton University computer scientist Arvind Narayanan." ["Has Big Data Made Anonymity Impossible?," MIT Technology Review, 7 May 2013] The bottom line: Big data is out there and it can be abused. Since big data analytics is predicted to alter the business landscape forever (and companies want to take advantage of the insights it can offer), it is essential that companies handle that data ethically and responsibly. Elliott writes:

"Analytics is, at best, a wonderful opportunity to shine light into the dark, to reveal what was previously concealed, and make it better. People and governments must be in the forefront of establishing clear, transparent guidelines that make the right tradeoffs between the public good and citizen's rights. We should not wait for abuses to come to light before acting."

Unfortunately, in the United States, expecting action from Washington is much like expecting to win the lottery. Chances aren't good. That means that companies should assume responsibility for the secure and ethical handling of the data they collect and analyze. Bill Franks writes, "When it comes to deciding how your organization will develop privacy policies for big data, there are at least three distinct sets of guidelines to consider. Without consideration for all three of these areas, you will put your organization at risk." ["Helpful or Creepy? Avoid Crossing The Line With Big Data," International Institute for Analytics, 7 May 2013] The three sets of guidelines to which Franks alludes involve answers to three different questions. They are:

  • What is legal?
  • What is ethical?
  • What will the public find acceptable?

"In an ideal world," Franks writes, "these three considerations would lead to the same result. In practice, however, the three are often not in sync and can in fact, point to totally different decisions. It will be important for your organization to decide how you want to balance the results to guide your actions when the three criteria diverge." And you thought it was going to be easy to determine the best way to handle big data. Franks points out that largest gray area involves behavior that might not be illegal but nevertheless could be unethical. In such circumstances, he asserts, "it is important to consider what is right and ethical, not just what is legal. If you're the first to ponder a new type of analysis, you need to think through these considerations before you even start down the path."

Franks goes on to point out that even if a company utilizes a clearly legal and ethical analytic methodology, customers might still have strong reactions. "What the public finds acceptable," he writes, "can often be even more stringent than what is legal and ethical." He concludes:

"My belief is that an organization will be well served to routinely sit down and explicitly discuss the legal, ethical, and consumer perception of its analytic policies in detail. After examining the legal, ethical, and consumer perspectives, I recommend defaulting to pursuing strategies that fall within bounds of the most restrictive of the three considerations. Given the rapid change of the legal environment and consumer acceptance of the use of their data, you can expect your decisions to be fluid and changing over time. What seems ok today may not be ok a year from now. While it may not be the most exciting process, keeping on top of your privacy policies will help avoid much bigger issues, such as legal problems and PR fiascos, down the road."

Elliott believes that "vendors of analytics software" also have a major role to play in ensuring that data is handled legally and ethically. He believes they should "help encourage data safety and transparency, and provide the technology features that make it easy for organizations to support these initiatives." Companies are constantly trying to strengthen customer loyalty. One of the best ways to do that is by strengthening the trust that consumers have in how the company handles their personal data. Despite claims that they anonymize data, consumers now know that true anonymization may no longer be possible. David Meyer explains:

"When it comes to protecting privacy in the digital age, anonymization is a terrifically important concept. In the context of the location data collected by so many mobile apps these days, it generally refers to the decoupling of the location data from identifiers such as the user’s name or phone number. Used in this way, anonymization is supposed to allow the collection of huge amounts of information for business purposes while minimizing the risks if, for example, someone were to hack the developer's database. Except, according to research published in Scientific Reports, ... people's day-to-day movement is usually so predictable that even anonymized location data can be linked to individuals with relative ease if correlated with a piece of outside information. Why? Because our movement patterns give us away. ["Why the collision of big data and privacy will require a new realpolitik," Gigaom, 25 March 2013]

Meyer writes that we need to be realistic when it comes to data collection and analysis. "We are not going to stop all this data collection," he writes, "so we need to develop workable guidelines for protecting people." He agrees with Elliott that vendors pushing analytics solutions have a critical role to play. "Those developing data-centric products," he writes, "have to start thinking responsibly – and so do the privacy brigade. Neither camp will entirely get its way: there will be greater regulation of data privacy, one way or another, but the masses will also not be rising up against the data barons anytime soon." He doesn't believe the masses will rise up because he thinks only some kind of catastrophic event would motivate such an uprising. Meyer penned his article shortly before Snowden's leaks were published, but those disclosures probably don't represent the catastrophe Meyer envisioned. He concludes, "I suspect the really useful regulation will come some way down the line, as a reactive measure. I just shudder to think what event will necessitate it." Whatever the event, your company doesn't want to be a part of it. That much I know for sure.

July 22, 2013

The Internet of Things Looks Like Big Business

The world changed forever when people started connecting over the Internet. The world is going to change again as billions of devices and machines start connecting over what is being called "The Internet of Things" (IoT). Brian Proffitt writes, "The rise of the Internet of Things means billions of physical objects will soon generate massive amounts of data 24 hours a day. Not only will this make traditional search methods nearly impossible to use, it will also create an environment where instead of looking for things in the world, those things will be seeking us out to give us all sorts of information that will help us fix, use or buy them." ["How The Internet Of Things Will Revolutionize Search," readwrite, 26 April 2013] Proffitt continues:

"When talking about the Internet of Things, it is important to get past the hype and explain exactly what it is: vast numbers of automated physical devices and objects connected to the Internet. These devices are usually routers, switches, phones … but increasingly devices like security cameras and remote climate sensors are being added -- and over time we can expect everything from cars to refrigerators to join the party."

Kenton Williston notes that as "embedded systems grow increasingly interconnected, fragmentation is becoming a major problem." He reports that Intel is trying to solve the problem "with a set of interoperable solutions that can scale across applications. The framework brings together hardware, OSs, and software for connectivity, security, manageability." ["Intel Intelligent Systems Framework Simplifies 'Internet of Things'," Intel Embedded Community, 11 September 2012] Intel sees the Internet of Things as depicted in the following graphic.

Intel IoT

Although the Internet of Things seems to be the name with the greatest degree of stickiness, it has been referred to by other names as well. Early this century, for example, my friend Thomas P.M. Barnett predicted its rise and called it the "Evernet." Others prefer the term "Internet of Everything" (IoE). General Electric (GE) prefers the term "Industrial Internet" since it will primarily connect machines. For more background, read my post entitled Machine-to-Machine Communication. GE also believes that the Industrial Internet is going to be good for business. "GE believes the Industrial Internet will spur accelerated economic productivity, potentially boosting GDP by an average of $4,600 to $7,000 per person in the U.S." ["'Industrial Internet' Could Boost GDP $2 Trillion by 2020," by Patrick Brogan, USTelecom, 14 June 2013] Brogan reports that a study released by GE last November concluded, "The 'Industrial Internet,' the growing network of machines and sensors across all sectors of the economy linked through Internet communications networks, could add an estimated $1.5 trillion to $2.3 trillion to annual U.S. Gross Domestic Product (GDP) by 2020."

General Electric isn't the only company that sees a big future for the Internet of Things. "The value at stake for the 'Internet of Everything' is $14.4 trillion that businesses and customers can capture in the next decade, according to Cisco. In other terms, Cisco is projecting that the Internet of Everything has the potential to grow global corporate profits by 21 percent in aggregate by 2022." ["Cisco: 'Internet of Everything' to yield $14.4 trillion in value," by Rachel King, CNET, 13 March 2013] At a Cisco-sponsored conference, Rob Lloyd, the company's president of sales and development, told the audience that "99 percent of electronics in the world today still aren't connected to the Internet." King continues:

"The next step, therefore, is the Internet of Everything in which those devices will be brought online. 'If you look at those business imperatives and think of them in the context of those major technology trends, there is an entirely new role of IT coming out,' Lloyd said. 'The role of the network is critical to unlock these major market trends.' For the Internet of Everything, Lloyd said that means taking people, process, data -- all the things done so far in connecting the first 10 billion connected devices -- to unlock capabilities we haven't seen yet."

Brogan points out that most of the technologies required to make the Internet of Things a reality already exist. They include radio frequency identification (RFID) sensors, real-time data analytics, cloud computing, machine-to-machine communications, mobility, and data visualization. King reports that Cisco believes the Internet of Things will probably be implemented vertically before being connected horizontally. She writes:

"Cisco believes that there are at least seven verticals that will move more quickly, starting with manufacturing followed by the public sector, energy/utilities, health care, finance/insurance, transportation, and wholesale/distribution. But Lloyd stipulated that the Internet of Everything will be driven by business funding -- not IT funding -- as we embrace consumer devices (aka bring your own device, or BYOD), the cloud, and data analytics to drive insights. Cisco's chief strategy and technical officer, Padmasree Warrior, explained further, asserting that the next decade of work will be about making all of these processes more efficient. For the network, Warrior outlined some of the technology implications, asserting that networks need to be more programmable, automated, dynamic, aware, agile, and secure in the face of a growing 'app-based economy'. 'The network needs to be much more orchestrated rather than just being configurable,' Warrior added. Warrior also reflected that the discussion -- at least around big data -- is finally moving from data collection to data usage."

Once a robust Internet of Things is developed, pundits imagine all sorts of things will become possible. Bob Violino writes, "The most common examples are smart cars, IP-addressable washing machines and Internet-connected nanny cams." ["The Internet of Things: Coming to a network near you," Network World, 22 April 2013] Tom Soroka, Vice President for Engineering and Technology at USTelecom, sees huge potential in the industrial sector. "When we marry the power of a global Internet with the power of global industry," he writes, "one can just imagine the massive potential of an industrial-grade network built just for the purpose of developing, manufacturing, ordering, delivering and operating commerce around the world." ["Explaining the Industrial Internet," USTelecom Newsletter, subscription required] Proffitt envisions the day when the IoT will schedule maintenance work and direct customers to RFID-tagged products. He writes:

"This world is not far off. Smartphones and other mobile devices can already tap into public search engines to discover more about the world around them. You can use augmented reality to see results displayed graphically on device screens. As more and more objects join the Internet, they'll create information that will be added to the potential data you can receive, raising the level of information available by orders of magnitude. This will be both a boon (more data to help make decisions) and a curse (so much data you could drown)."

Analysts at ABI Research agree that M2M networks and services are going to prove to be "golden eggs" of profitability for those who master the domain. However, they have found "porous security is exposing vulnerabilities in a large number of use-case scenarios" and that vulnerability threatens to slow the growth of the Internet of Things. ["Machine-to-Machine Application Market Grows, But Poor Security Is Major Issue, Report Finds," SupplyChainBrain, 13 February 2013] The analysts argue, "The horizontal evolution of M2M will require full end-to-end security. Significant efforts need to be invested into M2M application security in order for the M2M market to fully evolve. Whether this is through open source initiatives or standards development, the demand for increased M2M application security will have to be answered, and sooner rather than later."

Most analysts, however, are convinced that the Internet of Things will come about and that it will be big business. Earlier this year, Cisco CEO and chairman John Chambers stated, "I believe that businesses and industries that quickly harness the benefits of the Internet of Everything will be rewarded with a larger share of that increased profitability. This will happen at the expense of those that wait or don't adapt effectively. That's why the value is 'at stake' – it's truly up for grabs." ["Economic impact of the 'Internet of Everything' will be US$14trn – Cisco's Chambers," by John Kennedy, Silicon Republic, 19 February 2013]

The U.S. Telecom Association expects its members to get a share of the trillions of dollars at stake. Brogan explains, "Central to this operation are the broadband networks that link machines and sensors together, connecting data centers hosting computers that collect data, process it into actionable information, and display the information in usable formats to end users or connected machines. To keep this system running, fiber-optic networks and systems will need to be built, and large quantities of routers and switches will need to be deployed." Obviously, there are still hurdles to be jumped and challenges to be met before a full-fledged Internet of Things emerges. The biggest players, however, are already involved and it won't be long before the Internet of Things is much larger than the Internet/World Wide Web used to connect humans.

July 19, 2013

Happy Birthday Barcode

"Back in 1973," the staff at Marketing informs us, "the captains of industry and commerce selected the single standard for product identification that the world now knows as the GS1 Barcode. The decision created a global language of business that allows visibility in the entire supply chain across all industry sectors." ["Turning 40, the humble barcode clocks up 6bn scans per day," 3 July 2013] The article continues:

Barcode birthday"For 40 years, the GS1 barcode has revolutionised the way we do business and still remains the most widely used identification system and supply-chain standard in the world. A year after the barcode was selected, a pack of Wrigley's gum became the first product to be scanned with a GS1 Barcode, on 26 June 1974, at 8:01am to be precise. Today, more than five billion products are scanned globally each day."

In the fast-paced technology world, 40 years is a long lifespan. Other inventions introduced in 1973 that have enjoyed long lives include Ethernet, gene splicing, and the disposable lighter. That same year, a patent application for the "radio telephone system" (which we now call the mobile phone) was filed. Yet, the lowly barcode barely gets a mention on most lists of important inventions. One rare exception to that trend is Lilith eZine, which ranks the barcode at number 6 on its list of "101 Inventions That Changed the World." That article notes:

"Barcodes were conceived as a kind of visual Morse code by a Philadelphia student in 1952, but retailers were slow to take up the technology, which could be unreliable. That changed in the early 1970s when the same student, Norman Woodland, then employed by IBM, devised the Universal Product Code. Since then, black stripes have appeared on almost everything we buy, a ubiquity fueled by their price – it costs about a tenth of a penny to slap on a barcode."

Mary Bellis reminds us that Woodland wasn't alone in conceiving the notion of pattern recognition technology. She writes:

"In 1948, Bernard Silver was a graduate student at Drexel Institute of Technology in Philadelphia. A local food chain store owner had made an inquiry to the Drexel Institute asking about research into a method of automatically reading product information during checkout. Bernard Silver joined together with fellow graduate student Norman Joseph Woodland to work on a solution. Woodland's first idea was to use ultraviolet light sensitive ink. The team built a working prototype but decided that the system was too unstable and expensive. They went back to the drawing board. On October 20, 1949, Woodland and Silver filed their patent application for the 'Classifying Apparatus and Method', describing their invention as 'article classification ... through the medium of identifying patterns'." ["Bar Codes," About.com]

Woodland and Silver's design wasn't the familiar barcode you see today. Their initial patent application featured a bull's-eye design (as shown in the attached image from their patent application). Valerie J. Nelson reminds us that much of the credit for selecting today's familiar barcode design belongs to Alan L. Haberman. ["He led bid to pick design for bar code," Los Angeles Times, 22 June 2011] She explains:

"Haberman chaired the industry committee that settled on the bar-code symbol in 1973. ... When Haberman was asked to help modernize grocery-store technology in 1971, he was the chief executive of First National Stores, a New England supermarket chain. The dozen members of his committee were 'young, intense, brilliant,' [Stephen A. Brown wrote in the 1997 book 'Revolution at the Checkout Counter.'] ... The group spent more than two years deciding on the format for the bar code, invented in 1949 by two engineers, Norman Joseph Woodland and Bernard Silver, who favored a bull's-eye design. The bull's-eye ended up losing to one by IBM -- the now-familiar vertical black lines anchored by a series of numbers. The widely used version of the bar code is known as the universal product code. ... 'The proof that this is successful,' Haberman told the Associated Press in 1999, 'is that everybody takes it for granted'."

John LaVacca, vice president at IBM Global Business Services, told the Marketing staff, "40 years after the invention of barcodes they continue to make a positive contribution to supply-chain efficiency, global trade, and consumers' lives." The staff reports, "From traceability to automatic restocking of store shelves to faster and more efficient export and import, the barcode has made a hyper-efficient supply chain possible through a global system of standards." Other benefits identified by the staff include "the eradication of data-entry errors and waiting time at point of sale. It is also beneficial to business owners as it increases sales, delivers better customer service and better tracking of stocks and store flows."

Because of its ubiquity and low cost, some people believe the barcode will live to see 80 and beyond. For example, Ian Dunn, senior business manager at Woolworths, told the Marketing staff, "The last 40 years of GS1 Barcodes in the retail, food and grocery sectors has been the foundation of efficiency and accuracy in the supply chain from manufacturing and distribution through to the seamless delivery of product to consumers on supermarket shelves. The next 40 years will be a great opportunity for other sectors to get involved to realise the many benefits and potential of GS1 standards. Having one global standard that is accepted and adopted by every player will undoubtedly protect the global supply-chain market."

Although the barcode is getting on in years, it has undergone some changes over its lifetime. For example, Sarah Nassauer reports, "Package design has become so artful, it has come to this: Even the barcode, the style runt of product labeling, is getting gussied up." ["Art in Aisle 5: Barcodes Enter Expressionist Period," Wall Street Journal, 22 June 2011] Nassauer continues:

"Beer, granola, juice and olives are sporting barcodes that integrate famous buildings, blades of wheat and bubbles into the ubiquitous black and white rectangle of lines and numbers. Consumer-goods companies hope these vanity barcodes will better connect with customers. The trend is popular with smaller companies, and even one of the world's largest food companies, Nestle SA, is trying out vanity barcodes on its smaller brands. ... A handful of companies that specialize in making vanity barcodes have cropped up in recent years, though some companies create them in-house. Some vanity-barcode designs aim to be elegant, others quirky."

Nassauer reports that some vanity barcodes aren't functional and aren't meant to be. They have simply become part of the packaging. In fact, the barcode as art has taken many forms, including expensive fine art. But its principal value is making supply chains more effective. Tim Piper, Victorian director of the Australian Industry Group, told the Marketing staff, "With the focus now on the barcode technologies of the future, we need to harness business activities in parallel with barcode technology and make alignments to improve productivity and increase effectiveness to assist with business profitability."

In the age of big data, gathering and analyzing as much data as possible is going to be important. A barcode, however, can only carry a limited amount of information, and those limits may be what eventually causes it to fall out of favor. But, for the moment, the barcode still reigns supreme. Happy 40th birthday.

July 18, 2013

Has the Age of Quantum Computing Finally Arrived?

In a post entitled Quantum Computing: Is the Future Here?, I noted that Lockheed Martin had purchased a quantum computer from the Canadian firm D-Wave. Now Google and NASA have teamed up to purchase another D-Wave machine that will be the centerpiece of a new Quantum Artificial Intelligence Lab hosted at NASA's Ames Research Center. ["Google, NASA Open New Lab to Kick Tires on Quantum Computer," by Robert McMillan, Wired, 15 May 2013] According to McMillan, the computer is a D-Wave Two, the same model purchased by Lockheed Martin. On its corporate website, D-Wave describes its latest offering this way:

"The D-Wave TwoTM system is a high performance computing system designed for industrial problems encountered by Fortune 500 companies, government and academia. Our latest superconducting 512-qubit processor chip is housed inside a cryogenics system within a 10 square meter shielded room."

The D-Wave Two's 512-qubit processor has four times as many qubits as the company's first quantum computer, which housed a 128-qubit chip. The following video shows how the components in a 128-qubit Rainier processor fit together.

Katherine Foley reports that the Google/NASA computer cost $15 million and is capable of producing "unheard-of calculation speeds 3600 times faster than those of conventional computers." ["NASA Google Quantum Computer: The World's Most Expensive Computer Thinks Like a Human," Policymic, June 2013] Foley continues:

"The Canadian D-Wave-Two is the first commercially available computational system that supposedly utilizes quantum tunneling to solve complex mathematical equations. This process represents a complete overhaul of the way computer scientists have thought about processing."

Foley's use of the modifier "supposedly" reflects the fact that the D-Wave system still has its detractors. McMillan explains:

"D-Wave has ... met some skepticism from the quantum computing community. In part, it's because D-Wave is taking a different approach to quantum computing. But it's also because it hasn’t produced the kind of peer reviewed research on its systems that academics require."

Clearly, D-Wave is confident enough in its system that it has been able to convince several large organizations, whose ranks are filled with some pretty smart people, that its quantum computer works. McMillan writes, "Although it's still in the early days of experimentation, quantum computing could herald a new era of number-crunching." He explains:

"That’s because it uses quantum physics to break computer processing out of the binary computing paradigm that has dominated for the past half-century. Instead of binary bits, these computers measure qubits, which can simultaneously represent many more values."

In a blog post announcing the establishment of the Quantum Artificial Intelligence Lab, Hartmut Neven, Google's Director of Engineering, wrote that the company's interest in obtaining the D-Wave Two was to help it advance its research in machine learning. ["Launching the Quantum Artificial Intelligence Lab," Research Blog, 16 May 2013] He wrote:

Tut-hardware-system-assembly-big
Source: D-Wave

"We believe quantum computing may help solve some of the most challenging computer science problems, particularly in machine learning. Machine learning is all about building better models of the world to make more accurate predictions. If we want to cure diseases, we need better models of how they develop. If we want to create effective environmental policies, we need better models of what's happening to our climate. And if we want to build a more useful search engine, we need to better understand spoken questions and what’s on the web so you get the best answer. ... Machine learning is highly difficult. It's what mathematicians call an 'NP-hard' problem. That's because building a good model is really a creative act. As an analogy, consider what it takes to architect a house. You're balancing lots of constraints -- budget, usage requirements, space limitations, etc. -- but still trying to create the most beautiful house you can. A creative architect will find a great solution. Mathematically speaking the architect is solving an optimization problem and creativity can be thought of as the ability to come up with a good solution given an objective and constraints. Classical computers aren't well suited to these types of creative problems. Solving such problems can be imagined as trying to find the lowest point on a surface covered in hills and valleys. Classical computing might use what's called 'gradient descent': start at a random spot on the surface, look around for a lower spot to walk down to, and repeat until you can't walk downhill anymore. But all too often that gets you stuck in a 'local minimum' -- a valley that isn't the very lowest point on the surface. That's where quantum computing comes in. It lets you cheat a little, giving you some chance to 'tunnel' through a ridge to see if there’s a lower valley hidden beyond it. This gives you a much better shot at finding the true lowest point -- the optimal solution."

Neven indicates that some of the theoretical mathematics has already been developed by Google and it's been waiting for a computer powerful enough to execute the calculations. He explains:

"We've already developed some quantum machine learning algorithms. One produces very compact, efficient recognizers -- very useful when you’re short on power, as on a mobile device. Another can handle highly polluted training data, where a high percentage of the examples are mislabeled, as they often are in the real world. And we’ve learned some useful principles: e.g., you get the best results not with pure quantum computing, but by mixing quantum and classical computing. Can we move these ideas from theory to practice, building real solutions on quantum hardware? Answering this question is what the Quantum Artificial Intelligence Lab is for. We hope it helps researchers construct more efficient and more accurate models for everything from speech recognition, to web search, to protein folding. We actually think quantum machine learning may provide the most creative problem-solving process under the known laws of physics."

McMillan states, "The trick is getting these systems far enough along to solve real-world problems." He goes on to report, however, that there is good news regarding the D-Wave Two.

"Researchers at Simon Fraser University and Amherst College presented a paper studying the D-Wave chip's performance. They found that it worked pretty well on certain computing tasks."

In more good news, Cade Metz reports that "researchers at the University of Southern California published a paper that comes ... much closer to showing the D-Wave is indeed a quantum computer." ["Google’s Quantum Computer Proven To Be Real Thing (Almost)," Wired, 28 June 2013] Metz continues:

"When those in the scientific community hear the term [quantum computer], they tend to think of a 'universal quantum computer,' a quantum computer that can handle any task. The D-Wave doesn't work that way — it's geared to particular calculations — but according to [Daniel Lidar, a professor of electrical engineering, chemistry, and physics at USC], the concepts behind it could be used, in theory, to build a universal quantum computer. Whatever you call it, the D-Wave is useful, helping to solve what are known as combinatorial optimization problems, which turn up in everything from genome sequence analysis and protein folding to risk analysis."

The real value of the D-Wave computer will only be proven once results start coming in. Clearly, Lockheed, Google, and NASA believe there will be results and fairly soon.