
April 30, 2012

Manufacturing and the Skills Gap

For several years now analysts have been predicting that some manufacturing now done abroad is going to return to U.S. soil. The reasons for these predictions have been fairly straightforward: international wages (especially in China) are rising; oil prices continue to be high (making the transportation of goods more expensive); and natural and climate-related disasters are disrupting global supply chains at an increasing rate. Nobody predicts that all overseas manufacturing will come home. If it makes sense to make things closer to consumers in the U.S., it also makes sense to make things closer to consumers overseas. This post, however, is not about the chances of manufacturing returning from overseas; it is about the workforce and skills that will be needed if manufacturing does return.

In previous posts, I've pointed out that, even if manufacturing does return to U.S. soil in some significant way, the jobs that return won't be like those that were lost or involve the same number of workers. A year ago, the editorial staff at Supply Chain Digest wrote about trends involving manufacturing jobs ["US Manufacturing - is the Glass Half Full or Half Empty?" 24 March 2011] The article reported:

"As Rex Nutting of MarketWatch recently noted, at one point, during World War II, more than a third of all American workers were employed in a factory. That share has been falling steadily ever since, and now only 9% of workers have a factory job. Factory employment peaked in 1979 at 19.6 million; it’s now down to 11.6 million, even as output has soared since that time. As just a comparison, consider what happened in the agricultural economy. The US produces far more agricultural output today than it did 100 years ago, when farming jobs represented almost 40% of the labor force. Today, just 2.6% of US workers are found in farming. The same thing has been happening in factories. According to the Bureau of Labor Statistics, American factories are producing 57% more output by value today than they did in 1987, but they are doing so with 33% fewer workers today. The average US factory worker on average is connected to more than $180,000 of annual manufacturing output, triple the $60,000 in output per worker in 1972, using constant dollars. That trend shows no sign of letting up in the long term. Manufacturing productivity rose by 6.7% in the US last year, though since the output growth was even higher, 112,000 manufacturing jobs were added in the US last year."

New manufacturing plants are highly automated, and the people required to operate them must be highly skilled. Technology is designed to save labor. That's the rub. As Emily Maltby reports:

"[According to a study conducted by Catherine L. Mann, professor of global finance at Brandeis], among the technology-intensive manufacturing firms, ... jobs contracted 34.3% in the 2001 to 2009 period. [Michael] Dell, [founder of Dell Inc.], acknowledges that technology can eliminate the need for some jobs. He points to the transformation of the agriculture industry in the U.S., which was once labor-intensive and is now much more efficient despite having fewer employees. 'Every time a certain machine is created, there is job displacement,' he says. 'There's no question jobs have to evolve. We have to march forward and advance skills.'"

A couple of months after the staff at Supply Chain Digest wrote its article, Peter Coy wrote, "The only hope for American factory workers to save their jobs is to be so skilled and productive that they can justify the pay multiple they earn vs. their counterparts in South Korea or China or Mexico." ["The Case for Making It in the USA," Bloomberg BusinessWeek, 5 May 2011] Interestingly, Coy pointed out that "roughly half of GE's manufacturing jobs, as well as overall revenues, are outside the U.S." Earlier this month, however, General Electric announced that it was investing a billion dollars "to bring back to Louisville, Kentucky, hundreds of jobs that had been outsourced to Mexico and China." ["GE takes $1bn risk in bringing jobs home," by Ed Crooks, Financial Times, 2 April 2012] Crooks reports:

"'Reshoring' production is a strategy being tried by many American manufacturers, as rapid wage growth in emerging economies and sluggish pay in the US erodes the labour cost advantage of offshore plants. The US has added 429,000 factory jobs in the past two years, replacing almost a fifth of the losses during the recession."

GE is not alone in its decision to reshore jobs. Coy reports that "blue chips such as Caterpillar, Ford, and NCR have all announced that they are returning some manufacturing from abroad, even as they continue to expand abroad." He reiterates, however, that "a renaissance in U.S. manufacturing output does not necessarily imply a hiring binge; efficient companies might just do more with fewer people." Given the trend towards reshoring, Stephen Gold, president and CEO of Manufacturers Alliance for Productivity and Innovation (MAPI), an executive-education and business-research organization in Arlington, Va., writes, "The question is, will the employees be ready for manufacturing? According to a majority of top-level executives, as of right now the answer is a resounding 'no.'" ["Providing 21st-Century Skills for 21st-Century Manufacturing," IndustryWeek, 14 March 2012] Gold continues:

"As the latest skills-gap survey by the Manufacturing Institute and Deloitte shows, despite continued high national unemployment rates, more than 80% of manufacturers say they are experiencing a moderate to severe shortage of skilled production workers. By some estimates, 600,000 manufacturing positions remain unfilled due to companies' inability to find people with the right skills. That's because the changes in manufacturing processes over the past decade have been dramatic. The typical shop floor today has more computers than people. Those computers are hooked up to machinery requiring a level of technical sophistication that would leave even the most ardent teenage video-game enthusiast in the dust. In fact, the employees who operate this advanced equipment are more technologically savvy and typically know more trigonometry and calculus than most American citizens, including the policymakers who are stumping on their behalf."

By most accounts, the biggest obstacle to increasing manufacturing in the U.S. is the skills gap that Gold discusses. He asks, "Given the challenge of not enough workers entering the workforce with sufficient knowledge and skills, what are manufacturers doing to bridge the skills gap with the employees they're currently hiring?" He goes on to discuss some of "the ways that American manufacturers are training their employees." They include:

  • In-person classes taught by internal instructors as well as in-person classes taught by outside instructors.
  • More emphasis on enhancing skills for working as part of a team. In addition, companies are changing their training programs to place more emphasis on enhancing the capacity to think through the logic of a process.
  • More emphasis on having engineers gain a capacity to understand the needs of customers.
  • More emphasis on having production employees obtain multiple course certificates at a technical school or community college.
  • Paying for bachelor's degree programs for production employees.

Gold reports that with companies now realizing the value of human capital "there is a significant evolution occurring in how American manufacturers recruit and train employees of various levels. With STEM requirements in the business world outpacing the education system, companies are developing innovative ways to compete against each other and the rest of the world to maintain a competitive workforce." Having read the Manufacturing Institute and Deloitte study, Derek Singleton, an ERP Analyst for Software Advice, adds his two cents about what can be done to decrease the skills gap. ["Three Ways to Overcome the Manufacturing Skills Gap," 6 March 2012] He writes:

"Much of the recent coverage around the manufacturing skills gap has focused on its root causes, which are by now familiar: baby boomers are retiring, shop floor automation is increasing the technical skills required in manufacturing jobs, and youth are disinterested in pursuing a manufacturing career. Whatever the causes, we now need to work together as a nation to overcome the skills deficit."

He suggests the following three ways to shrink the gap:

  • Strengthen educational partnerships;
  • Invest in corporate in-house training programs; and,
  • Energize the workforce of tomorrow.

He discusses each of his recommendations, beginning with strengthening educational partnerships.

    "Technical colleges (and other parts of academia) are perfectly positioned to equip a new manufacturing workforce with the right skills. There is already an extensive network of schools that partner with manufacturers to teach relevant skills. These partnerships need to be strengthened. One such partnership is the Society of Manufacturing Engineers' collaboration with Tooling U – an online training program that provides curricula for everything from CNC machining to welding. Tooling U partners with colleges, trade associations, media groups and industry to develop training programs that align with the skills manufacturers need. Since its inception, Tooling U has helped 100,000 individuals revamp their skill set to find jobs at roughly 1,200 companies. Partnerships like those developed at Tooling U need to grow in number and size because they are proven models for workforce development that can have an immediate impact on the skills deficit."

President Theodore Roosevelt once stated, "Far and away the best prize that life has to offer is the chance to work hard at work worth doing." There is no work more worth doing than creating something for others to use. Around the world, including in the U.S., we see young people with college degrees but no employable skills. We need to stop emphasizing degrees and start emphasizing skills. Singleton next discusses investing in in-house training programs.

    "The success of programs such as Tooling U prove that manufacturers can make a difference when they get involved in workforce training. Manufacturers that are serious about hiring the right people should implement their own skills training programs. We have a model that shows that training in-house is highly effective: the Training Within Industry program. Hugh Alley, President of First Line Training, pointed out in a recent conversation that this program helped train two million women and eight million men after WWII. According to Alley, firms that use this program usually achieve close to a 25 percent reduction in the time it takes to train an employee. Over the last three decades, however, in-house training and apprenticeship programs have steadily declined across the industry. Many of these programs were cut for budgetary reasons. A recent study of UK manufacturers suggests that domestic manufacturers should bring these programs back. Semta – a UK manufacturing association–analyzed the value of apprenticeship programs to manufacturers. Roughly 80 percent of surveyed UK manufacturers said that their apprenticeship program makes them more productive. Furthermore, 83 percent stated that they will rely on apprenticeships to fill future work needs. While it may be difficult to find workers with the exact skills to match job openings, manufacturers can train people with the right aptitude. Investing in a talented individual can limit staffing problems and pay substantial dividends for manufacturing productivity."

Historically, training programs are the first things to be axed during economically difficult times. That is a very short-sighted action with long-term consequences. Singleton's final discussion involves energizing the workforce of tomorrow.

    "Solving the workforce needs of today does little good if the next generation is disinterested in working in manufacturing. In the longer-term, manufacturers will need to get youth interested in manufacturing by exposing them to it in a fun, engaging way. One example of this is a Tampa Bay program called STEM Goes to Work. The program takes students on manufacturing facility tours. While there, students get to talk with manufacturing employees, management and CEOs. They learn about manufacturing careers and what it takes to land one of those jobs. According to Janet Bryant, Director of Corporate Development at iDatix, the tours also incorporate a fun element. For instance, when students visited a gear manufacturer, they were given a challenge to build workable gears out of Styrofoam. Here in Austin, National Instruments gets young people interested in manufacturing and engineering through their Lego Mindstorms project. Lego Mindstorms features a combination of lessons and competitions where students are tasked to build simple robotics. While these kinds of projects don't develop manufacturing-specific skills directly, Reut Schwartz-Hebron of Key Change Institute notes that they 'help foster critical thinking ability, which ultimately makes it much easier to learn manufacturing skills later in life.'"

    We need to re-instill in rising generations the honor they should feel about holding any meaningful job. Of course, they should also expect to receive adequate compensation for their skills and effort. President Bill Clinton once stated, "I do not believe we can repair the basic fabric of society until people who are willing to work have work. Work organizes life. It gives structure and discipline to life." Even people who are willing to work must have the skills they need to fill the jobs that are available. Thomas Carlyle once wrote, "A man willing to work, and unable to find work, is perhaps the saddest sight that fortune's inequality exhibits under this sun." The days are long past when willingness to work alone was sufficient. Today Carlyle would probably write, "A man with the right skills who is willing to work, and unable to find work, is perhaps the saddest sight that fortune's inequality exhibits under this sun."

April 27, 2012

Developing Supply Chain Talents and Skills

The folks at SCM-Operations note, "Logistics and Supply Chain Management is a relatively new discipline. It's the crossroads of diverse subjects, covering various branches of business, management and engineering." ["History of Logistics and Supply Chain Management [INFOGRAPHIC]," March 2012] Because supply chain management involves so many diverse subjects, pundits routinely write about what kind of education, talents, and skills current and future supply chain professionals require. The "infographic" below depicts how supply chain management has evolved over the past century and why maintaining a changing portfolio of skills is important to remain at the top of one's game.

[Infographic: History of Logistics and Supply Chain Management]

    Bob Ferrari commented on a Wall Street Journal article that reported that the management development strategy for General Electric was changing from one where management personnel were exposed to a broad range of diverse businesses to a strategy that keeps them in one business longer so that they can gain deeper expertise in specific areas. Ferrari writes, "The reasons for this shift are those that many in our community have likely observed. The pace of global business requires a much more intimate knowledge of the aspects of customer needs, product development, supply chain tradeoffs and go-to-market strategies." ["Supply Chain Management Competencies: Broad vs. Deep?" Supply Chain Matters, 26 March 2012] He continues:

    "In our view, this trend is also a reflection on accountability, staying in a leadership position for the time to make longer-term initiatives successful and avoiding the constant 'parachuting' into and out of programs without a consistency in leadership and follow-through to initially targeted results. Broad initiatives directed at implementing a company-wide S&OP process, implementing advanced technology or shepherding a multi-year supply chain transformation effort can often lose momentum or perspective from frequent changes in leadership."

    Ferrari believes that it is time for the "supply chain community to reflect on the functional and leadership skills that are required in this new era of dynamic business change, globally extended supply chains and risk exposures." He goes on to discuss what he believes are "required skills [that] reflect broad functional supply chain skills and deep business and program management skills." He continues:

    "Regarding functional knowledge, not everyone can effectively contribute within this new and faster clock speed of business without broader supply chain functional knowledge. That is why current certification programs offered by either APICS or CSCMP test on broad based functional knowledge in areas such as customer relationship management, procurement, planning, transportation and logistics, among other areas. The goal of certification is to reflect a fundamental baseline knowledge of the processes involved in the supply chain, and we would add, the newest price of admission into the function. Beyond acquiring certification are years of actual experience working within and across many supply chain functional areas in implementing business and functional program needs. Thus, broad supply chain horizontal skills and practical knowledge remain extremely pertinent to success."

    Speaking of certification, a new certification program targeted for non-degreed supply management professionals went into effect last December. The Certified in Supply Management (CSM) credential is offered by the Institute for Supply Management (ISM). ["New ISM Supply Management Certification for Non-Degreed Professionals Goes Live," Supply Chain Digest, 6 December 2011] I agree with Ferrari that a balance of subject matter knowledge and managerial experience is the ideal. He concludes:

    "At the management level, we submit that deep understanding of the business, effective communication to senior management, coupled with demonstrated leadership at implementing needed strategic, tactical and operational change are clearly new stakes for global supply chain leadership. It may be no secret that some current managers within individual functional domains have risen to leadership roles because of their deeper functional and tactical leadership skills vs. broader understanding of either supply chain multi-functional requirements or needs to directly integrate supply chain business process and information technology initiatives with required longer-term business outcomes. This is often where initiatives for 'taking cost out of the supply chain' conflict with 'providing enhanced services' for innovative new products. Tomorrow's supply chains require leaders who can articulate how supply chain capabilities impact a required business outcome or desired metric of performance. They are leaders who build their resume on facilitating timely strategic and tactical change vs. multiple assignments implementing short-term objectives."

    Paul Teague insists that we need to worry about the skills of today's supply chain managers as well as those of tomorrow's managers. ["Talent development starts with you," Procurement Leaders, 28 November 2011] Although he is writing specifically about procurement specialists, I think his advice is applicable across the supply chain sector. He writes:

    "There has been a lot of talk ... about the skills and knowledge that the next generation of procurement professionals will need to succeed, and how CPOs can attract and develop that future talent. But what about the skills and knowledge that current procurement managers need? CPOs need to address that issue too. And they can start by looking in the mirror."

    As an anonymous pundit once wrote, "Your schooling may be over, but remember that your education still continues." I think that sentiment is at the heart of Teague's message. His primary focus, however, is on developing people skills (i.e., the ability to make those around you better). He writes:

    "As important as negotiation skills and financial knowledge are, pure people-management skills are essential. You need them to build a team and get team members to work together to achieve well-thought-out goals. Your managers do too. But, often, those management/leadership skills are missing at middle-and-top-management levels, sometimes in procurement, sometimes elsewhere. You don’t need to know the history of such infamous managers as Sunbeam's 'Chainsaw' Al Dunlop to recognize the problem. ... Even The Harvard Business Review, has said that management is often the least efficient activity in an organization. Don't believe that? Just think of some of the characters you might have worked for yourself early in your career. Hopefully, most were supportive and role models. No doubt, though, some were idiots. As the Gallup polling organization has reported, people leave managers, not jobs."

    That's a great quote! There is a reason that so much is written about leadership and management. People matter and yet too many executives seem to forget that truth. Teague concludes:

    "In his excellent new book, Next Level Supply Management Excellence, Bob Rudzki, president of Greybeard Advisors and former procurement executive at Bayer Corp. and Bethlehem Steel, says that good managers successfully cope with complexity, but good leaders successfully cope with change. I agree, and add that you can't be a good manager without being a good leader. In fact, I like to think of the word 'leadership' as an acronym for a series of management activities:

    Listening to goals, concerns, and ideas of others, staff members and stakeholders alike

    Empowering others to think and act creatively

    Attacking the right supply chain problems

    Defining clear objectives

    Engaging in the detail to be sure you understand issues and to set an example

    Revising and regrouping when reality collides with theory

    Saluting those who perform well

    Helping those who don’t so they can improve

    Institutionalizing a collaborative mindset

    Persuading everyone to believe that their job is the most important one in the company

    "Management and leadership go hand in hand. So, besides looking for good future managers/leaders, develop the ones you have now, starting with yourself."

    Tony Pittman, director of global procurement at Hewlett Packard, told the editorial staff at SupplyChainBrain, "From sourcing to delivery, the supply chain is now the most horizontal function in the enterprise. And any role that has such wide-ranging impact, from how money is spent to affecting the customer's experience, needs highly trained managers and workers." ["Skills for Supply Chain 2.0," 3 April 2012] The article continues:

    "Most people have at least a general idea of what's done by folks in finance, accounting, sales and marketing. But supply chain? Not so much, Pittman acknowledges, but that has been changing over the last 10 or more years. Whereas people may have thought of warehousing and transportation to some extent, now there is a better understanding of supply chain, if only because it touches so many areas of the enterprise. 'The supply chain function has the capability to reach all parts of the enterprise, all customers, and all supply networks. It may be the most horizontal function in the enterprise today.'"

The fact that supply chain management is the most horizontal function in an enterprise affects the debate over the importance of "broad" knowledge versus "deep" knowledge. The scales still tip toward deep knowledge, but not by much. The article concludes:

    "[The supply chain] 'function' and its leaders are better positioned than ever before to have a strategic impact today. Companies that have risen to the forefront have done so, Pittman believes, in large part due to innovation, investment and capabilities in supply chain. How key is technology to supply chain success? 'I really feel that while technology is important, sometimes the importance can be overdone,' says Pittman. 'Supply chain is a function that truly requires you to have good people, processes and technology. The most overlooked part of that equation is the people. Technology is very important and will always be, but is having a world-class supply chain hinging solely on technology? You must have the skills that people bring to the table, and the processes that your company can build on that. You can go with second- or third-tier technology and still rise to the top with the right people and processes.' At the same time, it's vitally important to develop skills in those employees."

One might be surprised that an executive from a technology company would reiterate the importance of processes and people as well as technology, but he is absolutely on target. Companies that forget that their people are really their most important asset are almost certainly not as good as they could be if they paid more attention to how to improve and use their human capital.

April 26, 2012

Big Data Analytics: Technology, Processes, and People All Matter

    In this post, I'll be discussing three articles from SupplyChainBrain that all touch on why processes, technology, and people matter when discussing Big Data analytics. In the first article, Mark Kornbluth, Managing Director of Client Technology at Kroll Associates, writes, "Former U.S. Secretary of Defense Donald Rumsfeld famously addressed the absence of evidence of weapons of mass destruction in Iraq with a statement that was oddly prophetic for today’s global business: 'There are known knowns; there are things we know we know. We also know there are unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don’t know.'" ["Spotting Unknowns in a Sea of Data & Separating Critical Compliance Risks from the Noise," SupplyChainBrain, 11 November 2011] Kornbluth continues:

    "Fast-forward nine years and the vast majority of multinational corporations are now fighting their own battle with unknown unknowns lurking in their global supply chains. The phenomenon is the result of dueling trends: as more firms expand into high-risk emerging markets, governments have ratcheted up their enforcement of anti-bribery laws."

    I start this discussion by mentioning bribery laws because "Wal-Mart Stores Inc. faces significant legal risks after it disclosed that it is investigating its operations in Mexico for possible violations of the U.S. law that prohibits bribery in foreign countries, legal experts said." ["Wal-Mart Faces Risk in Mexican Bribe Probe," Wall Street Journal, 22 April 2012] It wasn't technology or processes that placed Wal-Mart in a compromising position; it was people. Too often when we think about analytics we forget the people part of the equation and that can be a mistake. Jessica Wohl reports, "Allegations that Wal-Mart Stores Inc stymied an internal investigation into extensive bribery at its Mexican subsidiary are likely to lead to years of regulatory scrutiny and could eventually cost some executives their jobs." ["Wal-Mart probe could cost some executives their jobs," Reuters, 23 April 2012]. Kornbluth reports that preventing illegal actions by employees should be a top concern of executives because enforcement is on the rise. He writes:

    "According to data tracked by the law firm Gibson Dunn & Crutcher, the number of Foreign Corrupt Practices Act (FCPA) enforcement actions increased 85 percent from 2009 to 2010, with 48 new DOJ cases and 26 new SEC actions filed. In total, companies paid a record $1.8bn in financial penalties to the DOJ and SEC in 2010, according to data from both agencies."

    Kornbluth believes that technology can help companies gather and analyze data to help them monitor employee behavior so that they can either prevent or mitigate illegal behavior. He continues:

    "Multinationals have responded with an aggressive ramp-up of compliance efforts complete with data and analytics on everything from vendor background checks to regional country risk monitoring. The result? As The Wall Street Journal reported in September, 'Companies are being inundated with data. … But many managers struggle to make sense of the numbers.'"

As I frequently point out, data that is not analyzed or results that are not properly presented to decision makers are simply not useful. In fact, as Kornbluth points out, they can be actively unhelpful. He claims that in his company's work, he and his associates "have found that the data overload problem is most frequently the result of decentralized data management processes that have not been standardized across an organization." So having begun with people, Kornbluth next turns to processes and technology to show how they can help provide a solution. He continues:

    "Too often, the process of screening and monitoring international fraud risk is done manually with e-mails back and forth to third-party vendors, documents stored in a variety of locations and individuals in different business units following different processes. Thus, despite a surfeit of available data, companies are missing key red flags through simple mismanagement of resources."

    Kornbluth recommends "a four-step data management process to help multinationals spot unknowns more effectively." Those steps are:

    "1. Define a Third-Party Screening Policy: Amazingly, many multinationals are collecting terabytes of data from their global operations with no unified corporate policy on how to use that data across the organization. The first step in any risk management project of this scale is to clearly define what key criteria a company is screening for, notification rules in the case that red flags are found and specific report types that will be produced worldwide. Without this core set of guiding principals, companies are bound to quickly become slaves to their data.

    "2. Build a Centralized Online Database: A typical multinational operating in high-risk emerging markets will have thousands of vendors and agents working on its behalf. Basic background checks on each of these entities would create an overwhelming data deluge without standardized processes. By hosting all third-party screening data in a secure, encrypted, centralized database, it is possible to set up rules for easy review.

    "3. Standardized Reporting: The only way to accurately analyze the myriad of different red flags that crop up around the world is to use a consistent reporting structure. To be useful, third-party screening reports must report the same data in the same order globally. This includes global compliance database checks, adverse media in the local language on the company and its management, address history, corporate registry information, civil court checks, criminal court checks, bankruptcies and several others in a uniform format. An organization must agree on what they are screening across the organization and stay consistent in their approach.

    "4. Annual Review: Risk profiles change over time, making it important to regularly screen third-parties for any changes in their structure, management team, lines of business or regions of operation. Ideally, the review process should be initiated annually. When it comes to systematically identifying potential fraud risks before they result in enforcement actions, there is no shortage of data. The key to a successful compliance program is accessing the correct data to analyze threats and draw clear conclusions."

I agree with Kornbluth that, in any area a company analyzes, "accessing the correct data" is critical. Unfortunately, the area that Kornbluth writes about is only one of hundreds or thousands of areas in which data must be collected and analyzed by multinational enterprises. That makes the matter of collection and integration even more important. In the second article, Tim Rey, director of advanced analytics with Dow, offered "a glimpse into that company's use of analytics and complex mathematics to examine multiple areas of its global supply chain" to the editorial staff at SupplyChainBrain. ["The Benefits of Advanced Analytics at Dow Chemical Company," 19 October 2011] The staff wrote:

    "The application of advanced analytics requires substantial resources drawn from multiple parts of the organization. 'It's a balance of people, process, methods and technology,' says Rey."

    You see the theme here: people, processes, and technology are all critical when it comes to Big Data analytics. In Rey's case, people are not the source of trouble but the source of value. The article explains:

    "Individuals must be highly trained in math, machine learning, forecasting, simulations and operations research, to name a few key areas. At Dow, many of the people who participate in advanced analytics have a background in research and development, where they were already doing mathematics and modeling for manufacturing processes. Most possess advanced degrees, says Rey. Experts in Six Sigma and master black belts can be of great help in the effort."

Although some people claim that Big Data analytics, and the technologies that support it, are mature, most pundits believe the field is still in its infancy -- though maturing fast. The article explains:

    "While the concept of advanced analytics isn't new, it hasn't been applied to the business side of the organization until relatively recently. A handful of universities are beginning to certify students in that area. Dow is speeding the development of the discipline by bringing in graduate students and putting them to work on the analysis of business processes. The company has already done a wide range of work in areas such as fraud detection in auditing, strategy, portfolio optimization, forecasting and model, purchasing cost forecasting, and the use of purchasing decisions to minimize cost. It's essential to get access to data as quickly as possible, Rey says. 'Waiting six to nine months for a report to be available doesn't work.'"

One of the benefits of using cloud computing to conduct Big Data analytics is that it provides fast computing without requiring companies to invest valuable corporate resources in infrastructure and maintenance. Rey notes that getting external support is often a good idea. He explains:

    "Commercially available software can help; companies don't need to build optimization algorithms from scratch. In any case, says Rey, one needn't wait for all corporate data sources to be structured before undertaking an analytical process. There's always the risk of getting too complex in one's calculations. 'All models are wrong,' notes Rey. 'Some models are useful.' The trick lies in striking the right balance between theory and reality. And companies must always be aware of the 'garbage in, garbage out' nature of data. 'You have to be careful,' says Rey. 'Sometimes the quality of the data doesn't merit the complexity of the model.'"

Those are all great points. When a company implements a Sense, Think/Learn, Act™ system like the one my company, Enterra Solutions, offers, poor-quality data is eventually winnowed out as the system learns what is important. That's the real power of a good analytical system. It can also help discover some of the unknown unknowns discussed by Kornbluth.

    The final article reports some of the highlights of an interview that Greg Gorbach, vice president-collaborative manufacturing at ARC Advisory Group, conducted with Paul Boris, vice president of collaborative manufacturing at SAP. ["The Power of Real-Time Analytics," SupplyChainBrain, 23 April 2012] The article states:

    "Boris says the right technology can free information and processes to enable real-time analytics, which in turn can drive business processes and deliver performance-enhancing information directly to individuals, wherever they are working. 'For example, you might deliver engineering information directly to the hands of the operator working on an asset, in real time, and perhaps wrap that information in 3D to provide a richer view,' he says."

    Once again you see experts stressing the importance of processes, technology, and people as they relate to Big Data analytics. As with most things in life, describing how things should work is always easier than actually getting them to work. The article concludes:

    "Creating and using real-time analytics in this way requires capabilities in several areas, Boris says. These include applications, analytics, and modeling around on demand, on premise, on device. The latter 'speaks to having a hybrid cloud on premise as well as the necessary database,' he says, 'but the technology already exists.' ... Social media also is having an impact. Boris sees it as a way to share knowledge and best practices, which is becoming increasingly important as the workforce ages and a company's knowledge base is lost through attrition. ... Boris says his vision for the next generation of manufacturing and supply chain is one of different processes that can be plugged or unplugged, giving incredible agility and more sustainability with less labor required."

    One of the main take-aways from the interview was, "Having information and processes trapped in operational silos is a continuing problem that keeps many businesses from performing as well as they should." I couldn't agree more. Siloed information hurts all of the big three -- processes, technology, and people. And they all matter.

April 25, 2012

Building Tomorrow's Supply Chain, Part 2

Yesterday, I discussed the views of three supply chain professionals who believe that many of the supply chains currently operated by companies need to transform. Those professionals were: Mark Pearson, managing director at Accenture; Kelly Thomas, Senior Vice President of Manufacturing at JDA Software; and Lora Cecere, Founder of Supply Chain Insights. Generally, the three analysts were in agreement about the factors they believe make supply chain transformation necessary and, for the most part, about the direction that supply chains must take. I continue that discussion in this post. In my mind, data integration is the key capability necessary to achieve their goals. Thomas believes that planning processes in six core areas must be aligned to achieve a dynamic (or market-driven) supply chain. They are: sales and operations planning (S&OP); demand planning; inventory planning; master planning; factory planning and scheduling; and collaborative supply planning. ["Building the Supply Chain of the Future," SupplyChainBrain, 29 March 2012] He discusses each in turn, beginning with S&OP.

    "Sales and operations planning (S&OP). The S&OP process should be a continuous process in which short-term demand predictions are reconciled with long-term organizational goals. S&OP must occur at both the operational and the executive levels, bringing both views together in a closed-loop planning process focused on consensus. Across every part of the organization, the S&OP process provides a disciplined cadence for monitoring and synchronizing demand, production, supply, inventory and financial plans via a rigorous Plan-Do-Check-Act process. The entire supply chain can share a common perspective on any issue and agree on an appropriate path for resolution."

The only way for the entire supply chain to share a common perspective is to achieve some level of data sharing and integration. That is why cloud computing is becoming so important in the business world. Cecere believes that S&OP is just one of the processes that must be built horizontally to achieve the desired results. She writes:

    "We are just beginning to build horizontal supply chain processes that allow us to span the end to end supply chain. While we have talked for a decade about the building of the end to end supply chain, we now know that it is about MUCH more than the connection of your ERP to my ERP like a Lego set. The processes of demand sensing and shaping, demand orchestration, revenue management, supplier development and Sales & Operations Planning (S&OP) are the start of the building of horizontal processes. The winning supply chains of the future will turn the supply chain on its ear." ["My Take: Supply Chain Future," Supply Chain Shaman, 4 October 2011]

    Thomas next discusses demand planning. He writes:

    "Demand planning. Advanced statistical modeling, supported by multiple algorithms tailored to unique item characteristics, must be applied to ensure that sourcing, production, inventory, transportation and distribution functions are optimized based on a shared forecast. Advanced demand planning tools should account for the impact of promotional and external events that will have repercussions across the global supply chain."

Since my company is involved in providing Big Data analytic services, I obviously agree with Thomas on this point! Cecere also believes that Big Data analytics will characterize future supply chains. She writes:

    "Big Data Supply Chains. Traditional Supply Chain Management (SCM) focuses on transactional efficiency: order to cash and procure to pay. eCommerce spawned catalog and shopping cart technologies. This is not sufficient. These early versions of technologies drive an efficient, but most often a STUPID response. Lean practices can make stupid happen with less waste. These processes are inside-out (enterprise to trading partner or market) not outside-in (market to enterprise systems). They focus on vertical silo efficiency not on supply chain trade-offs and alignment. We have a new world of opportunity. As we enter the world of big data supply chains, we have the opportunity to combine the internet of things – where physical objects like inventory, machines and documents have presence and transmit a signal– with the omnipresence of mobility (anytime and anywhere) with awareness of location and the sensing of true customer wants and needs through the open and interest graph. New capabilities in analytics – in process memory and new techniques for predictive analytics – enable convergence of these technologies. It is up to supply chain professionals to reskill and think about more than speeds, feeds and cases in the warehouse to redefine supply chain processes to take advantage of new technologies."

    Thomas next discusses inventory planning.

    "Inventory planning. 'One-size-fits-all' inventory plans fail to recognize the differences among products. Instead, leading manufacturers are using advanced tools to create highly customized 'designer' inventory strategies based on key product attributes. Products are segmented based on their critical characteristics and managed accordingly. Wherever possible, inventory decisions are postponed to minimize financial risk and inventory levels are managed by exception to maximize time and cost efficiency."

    I find it interesting that Thomas talks about segmenting products based on their critical characteristics. Most supply chain analysts talk about segmentation as it applies to multi-channel order fulfillment. Both discussions of segmentation are appropriate and I believe this is what Pearson means when he talks about the importance of maintaining a portfolio of supply chains (see Part 1 of this discussion). Thomas next discusses master planning.

    "Master planning. Leading manufacturers are reviewing, analyzing and updating supply plans daily, instead of monthly or quarterly, to maximize customer satisfaction, while also protecting their profits. Through a problem-oriented design, their planners are intuitively guided to monitor performance issues and exceptions in the global supply network. In addition, a layered planning approach allows planners to rank their business objectives and make informed trade-offs. With clear visibility into the root causes or constraints to problems, planners can interactively adjust constraints and business rules to make continuous performance improvements."

Thomas touches on two important points: speed and visibility. Things move fast in today's business climate, and decision-makers must have tools that permit them to make decisions within the industry's decision cycle if those decisions are going to have a positive impact. The late military strategist and USAF Colonel John Boyd called this decision cycle the OODA loop -- OODA standing for observe, orient, decide, and act. You can't act effectively if you don't have good data sharing that provides the required visibility (i.e., the proper orientation). Thomas next discusses factory planning and scheduling.

    "Factory planning and scheduling. Optimized production plans should be defined for plants by scheduling backward from the requirement date, while material and capacity constraints are simultaneously considered to create feasible plans. Advanced solutions should streamline and align the activities of production control, manufacturing, and procurement planning teams by automating mundane tasks and shifting the focus to more important functions. These tools should also provide time-phased reporting on key factory performance metrics, enabling planners to take corrective measures for both short- and long-term planning. By applying a management-by-exception approach, planners can eliminate unnecessary work, minimize planning fatigue and assess a variety of scenarios when the unexpected occurs."

What Thomas just described can't be achieved without some level of integration, synchronization, and/or alignment. I agree with Thomas that a management-by-exception approach is critical for ensuring that decision-makers aren't wasting time monitoring routine operations. Their skills are best used managing the exceptions that can't (or shouldn't) be handled by automated processes. Thomas' final discussion is about collaborative supply planning.

    "Collaborative supply planning. Most manufacturers purchase parts from a range of diverse and global suppliers, and these parts have different lead times, demand profiles and inventory strategies. Advanced tools enable manufacturers to manage this diversity through customized business rules that track performance exceptions. Dashboards, exception-based reporting and early warning systems allow supply issues to be identified and resolved before they impact the global network. Planners can also track the entire life cycle of procured parts, for example, to adjust forecasts, production schedules and transportation plans when late deliveries impact the manufacturing flow."

    I'm pleased that Thomas stressed the importance of getting the right information to the right decision maker in the right format at the right time. Analyzed data is of no value if it's not used and it won't be used if the systems providing it are not user-friendly. On this point Thomas, Pearson, and Cecere appear to be in complete agreement. Pearson called for systems that "Sense, Shape and Respond." Cecere calls for learning systems that "Listen, Test and Act." She continues:

    "Supply chains do not listen and they also do not play by fixed rules. Yet, technologies like Business Process Monitoring (BPM) and linear optimization try to force this outcome. Advanced pattern recognition and the use of rule-based ontologies are fueling the first generation of supply chain systems that can learn. It was great to see Progress Software showcasing a supply chain event monitoring application built on pattern recognition of streaming data for a supply chain control tower application. (Progress has traditionally focused on financial data. It is good to see them with a new focus.) Enterra Solutions, IBM, and Transbase are also focusing on learning systems. We are very early in the building of learning systems, but I find it VERY EXCITING."

    I'm pleased that Cecere recognized Enterra's efforts in this field. Thomas concludes:

    "Manufacturing businesses of all types can realize substantial improvements across their global supply chains by transforming from a 'push' mindset to a more nimble 'pull' stance. They can realize diverse improvements, including revenue increases, inventory reductions, better asset utilization, lower materials costs and service enhancements such as fewer stock-outs. Overall, the average manufacturer is expected to realize a 15- to 25-percent operating margin improvement by synchronizing all supply chain planning activities with the pull of actual market demand. Looking toward the future, there is only one real certainty: that uncertainty will continue to prevail. Consumers will continue to shift in their confidence and spending habits. Materials and transportation costs will keep fluctuating. Channel preferences will evolve. The only way to manage your worldwide supply chain profitably in this uncertain environment is to understand true market demand at the earliest possible stage — then synchronize all your planning processes and make the right decisions based on that insight. The ability to create a synchronized, agile, pull-based supply chain will separate the leaders from the followers as economic uncertainty continues."

    I believe that all of the analysts whose views have been discussed in this two-part series arrive at the same conclusion. Pearson puts it this way:

    "By converting to a dynamic supply chain model, companies can create and sustain multiple supply chains with a level of agility that allows them to respond to both opportunities and threats. As a result, businesses that make this transition will be positioned to readily embrace the unpredictable." ["The Dynamic Supply Chain," Industry Week, 8 March 2012

    Thomas, Pearson, and Cecere insist that "agility" or "flexibility" is going to be an important characteristic of future supply chains. Cecere concludes, "New technology capabilities allow us to raise the bar to reduce fat AND improve flexibility. But, this can only happen if we adapt practices and adopt new technologies." In a previous post [entitled Supply Chain Agility: Does it Matter?], I highlighted a discussion about agility that you might find interesting. The bottom line is that the future always begins with today and today is a good day to start building the supply chain of the future.

April 24, 2012

Building Tomorrow's Supply Chain, Part 1

    Mark Pearson, managing director at Accenture, writes, "Fickle consumers. Shrinking product life cycles. Product and service characteristics continually influenced by social channels. The needs and buying methods of the modern consumer are both changing and accelerating, and manufacturers are struggling to catch up, let alone get ahead. Mix in geo-political chaos, tumultuous currency markets and natural disasters, and companies are trying to somehow reconcile their supply chains with volatility that is nearly twice as much as anything they have experienced in the past 30 years." ["The Dynamic Supply Chain," Industry Week, 8 March 2012] Kelly Thomas, Senior Vice President of Manufacturing at JDA Software, agrees completely. He writes:

    "The past few years have brought radical changes to the world of supply chain management. The business climate today is not only more complex, due to shorter product life cycles, increasing service demands from channels, price erosion and global customers with specialized needs but much more uncertain due to supply risks. Manufacturers around the world are grappling with the challenge of meeting fickle market demand in a new economy, without making risky investments in high inventory levels and costly production assets." ["Building the Supply Chain of the Future," SupplyChainBrain, 29 march 2012]

    Another analyst who agrees with this view of the world is Lora Cecere. She writes:

    "Demand cycles are shortening. Supply cycles are lengthening. Commodities are scarce. Volatility abounds. The design of the supply chain needs to change and adapt frequently. ... I think that this is the future. The enlightened thinkers know this. The laggards are just beginning this journey. ... In my opinion, the opportunity for inventory policy, network design, and adaptive supply chain processes are still ONLY understood by supply chain leaders." ["My Take: Supply Chain Future," Supply Chain Shaman, 4 October 2011]

    All three of these analysts believe that supply chains must transform in order to meet the challenges they point out. Thomas writes, "The good news is that by synchronizing inventory, production and distribution processes as closely as possible with actual demand levels, manufacturers can make their supply chains both agile and profitable." He continues:

    "The obstacles to a truly agile, synchronized supply chain lie in traditional planning processes, which have largely been siloed, lacking integrated decision making across multiple functional areas, as well as any involvement from supplier and channel partners. Supply chain leaders are overcoming these obstacles by implementing fundamental changes to their planning processes, allowing a single physical supply chain to support multiple channels and different operating models. They are shifting from the push-based approach of the past to a supply chain management model in which actual market 'pull' forms the basis for every supply chain decision."

    Readers of this blog know that I'm a big proponent of breaking down corporate silos through the intelligent sharing of information and the integration of data sources so that the entire organization is operating from a single version of the truth. Pearson calls the type of supply chain described by Thomas (i.e., one that breaks down internal silos while integrating supplier and customer data) a "dynamic supply chain." He asserts that this kind of supply chain "enables businesses to balance opportunities to drive new economic value and growth against the downside risks of any disruptive events that might occur." Cecere talks about "Market-driven Supply Chains that Sense." She continues:

    "Today’s supply chains respond. They do not sense. Despite the exponential investment in sensing technologies like RFID, 2-D bar codes, temperature sensors, GPS, and QR codes, today’s supply chain focuses on orders and shipments. Traditional applications cannot scale to use the exploding volume of unstructured data and combine it with the output from the number of sensors being installed. Additionally, supply chain latency is accepted and not questioned. We have not conquered the bullwhip effect, and the translation of demand from retail shelf to a manufacturer remains unchanged. ... We have built long supply chains that translate, not sense demand. The use of sensor data, market data, temporal data (weather, traffic, etc.) to sense and reduce latency remains an opportunity. Social/ mobile/ digital/ ecommerce convergence is changing the 'heart' of the supply chain. Leaders will combine transactional data with unstructured data to sense market to market outside-in with near real-time latency while laggards get squeezed from both ends."

The three analysts all mention the volatility that has characterized the business environment in recent years. Pearson states that we are now in an era "of permanent volatility. And, this constant change that is part and parcel of today's macro business environment is the 'new normal.'" Cecere writes, "Volatility abounds." Thomas asserts that the "volatility of the past several years has only complicated this challenge" of improving supply chains. Thomas continues:

    "Adding complexity is the geographically scattered nature of the supply chain in a global business environment. Product cost issues are much more complex now. Manufacturing executives must understand and manage the total landed cost of all goods and services, with its intricate web of offshore suppliers, multiple transportation and distribution nodes, and flexible manufacturing options. While individual facilities used to be managed vertically, the global supply chain extends beyond the four walls of a facility, encompassing a network of trading partners who collaborate closely with one another in serving the end consumer's needs while also protecting the overall profitability of the supply network."

    Pearson reports that "70% of executives who responded to a recent Accenture survey expressed concern about their inability to predict future performance, and more than 80% worried about the overall resilience of their supply chains in the face of unrelenting market challenges." The characteristic that Pearson believes is most damaging to today's supply chains is rigidity. He writes:

    "'Supply chain integration' -- the mantra of the recent past -- helped globalizing organizations secure key relationships across extended supply lines by tying the operations of critical partners together. The downside of this structure is that the slowest supplier defines a company's ability to respond to market changes. Today, companies with such rigid supply chains are unable to keep pace with the current pace of change, which also impacts the dexterity with which they can deliver new products to market."

    I'm not sure that "integration" is the concept that leads to the rigidity that concerns him. Integration is normally a good thing because it makes alignment easier to achieve. Rigidity is often the result of adopting lean principles and just-in-time processes. If that is what Pearson means by "integration," then I agree with him. I believe that Thomas would agree with me that "integration" is probably not the right descriptor for the challenge that Pearson is trying to describe. Thomas, it appears, believes that more, not less, integration is necessary. He explains:

    "Supply chain leaders are implementing powerful closed-loop planning processes which synchronize all core activities, including demand planning, inventory planning, master planning, factory scheduling and supplier collaboration. Via this closed-loop process, actual performance is continuously monitored against planned results, and adjustments are quickly made to reflect the new reality. When a deviation occurs in one area – for example, when a supplier fails to deliver, a customer cancels an order, or labor costs rise in an offshore facility – there is a synchronized, consistent impact felt across the entire network. Plans across the supply chain are immediately adjusted to reflect this event. This closed-loop process ensures that, even when the unexpected occurs, the end-to-end value chain can be re-set with speed and agility to continue its support of top-level operational and profitability goals. Processes are orchestrated across the end-to-end global supply chain, so that the end result is a synchronized, highly effective response to changing business conditions."

    Clearly, Thomas is describing the same kind of dynamic supply chain as Pearson, but it seems to me that Thomas believes integration is the key to achieving it rather than the obstacle that prevents it. Pearson writes:

    "Moving from the 'integrated' to 'dynamic' supply chain model enables companies to view their supply chains as adaptable ecosystems of processes, people, capital assets, technology and data. They strive for flexibility where it matters and focus their efforts on operational agility that drives profits, and not just short-term efficiencies."

    If you insert "rigid" in place of "integrated" in that paragraph, you can see that Pearson and Thomas are really talking about the same thing. Their approaches for achieving a dynamic supply chain are also complementary. Thomas stresses the use of "powerful planning processes and linked technology solutions ... to sense demand shifts and automatically balance a number of priorities, including costs, customer service levels, supply risks, production constraints and environmental targets, to achieve the best possible outcome." Pearson discusses "three initial steps that any company can take to jumpstart the process" of implementing a dynamic supply chain. They are:

    "Think 'portfolio of supply chains.' A company should first define the supply chains within its organization. This is done by aligning with overall business strategy, then segmenting supply chains based on product, customer and geography. Each chain is then evaluated by functional area to define which characteristics are considered unique and which are standard.

    "Define the dynamic operating model. There are four key capabilities within a dynamic supply chain, which when executed simultaneously, form a model that enables the right level of flexibility, adaptability and responsiveness.

    "Sense, Shape and Respond: Is the company equipped to translate data into insights that can be shared instantly with decision-makers across your company?"

    Thomas and Pearson do have a slight disagreement on one point. Pearson, like many other supply chain analysts, recommends segmenting supply chains (i.e., think 'portfolio of supply chains'), while Thomas earlier recommended "a single physical supply chain to support multiple channels and different operating models." I believe their differences lie more in language than in philosophy. Tomorrow I will discuss the six core areas that Thomas believes must be aligned in order to achieve a dynamic supply chain and some of the things that Cecere believes "we need to do to put flexibility back into the supply chain." She also addresses the topic of sensing, shaping, and responding.

    April 23, 2012

    Business Analytics: Good for Companies of All Sizes

    In past posts, I have noted that the World Economic Forum has declared data to be a new asset class. All businesses, regardless of their size, generate data; but not all businesses use that asset to its full extent. Malory Davies, editor of Supply Chain Standard, reminds us, "What gets measured gets managed." He goes on to state, "The corollary to that, of course, is that if you are to manage the right things then you have to have effective ways of measuring them." ["Measure for measure," Supply Chain Standard, 25 July 2011] To come full circle, we need to remember that measuring involves data. Obtaining the right data, Davies insists, is "easier said than done." He reminds us that "there are always Donald Rumsfeld's famous 'unknown unknowns' to deal with – those things that we don't know we don't know." That's where Big Data analytics can help. Programs that think and learn can help discover some of the unknowns so that they can be measured. Davies, however, asserts that companies should resist the temptation "to measure everything and then try to work out what it all means afterwards." Such efforts, he writes, are "wasteful and inevitably will throw up large amounts of confusing data." If you begin by analyzing data that you know you want and need, and let an intelligent system discover other things that may be of importance, you avoid being wasteful while simultaneously being wise. Davies continues:

    "Certainly, there is plenty of evidence that the complexity and scope of global supply chains means measuring performance still remains a challenge for many companies. It makes sense to focus on the quality of metrics rather than quantity for effective performance measurement and improvement. ... Ultimately, good metrics require people, tools and processes right across the enterprise taking into account company strategy to show meaningful performance."

    David F. Carr reports, "Advances in analytic technologies and business intelligence are allowing CIOs to go big, go fast, go deep, go cheap and go mobile with business data." ["5 Business Analytics Tech Trends and How to Exploit Them," CIO, 23 March 2012] He further notes, "In interviews, CIOs consistently identified five IT trends that are having an impact on how they deliver analytics: the rise of Big Data, technologies for faster processing, declining costs for IT commodities, proliferating mobile devices and social media." He discusses each of those trends in turn beginning with Big Data. He writes:

    "Big Data refers to very large data sets, particularly those not neatly organized to fit into a traditional data warehouse. Web crawler data, social media feeds and server logs, as well as data from supply chain, industrial, environmental and surveillance sensors all make corporate data more complex than it used to be. Although not every company needs techniques and technologies for handling large, unstructured data sets, Verisk Analytics CIO Perry Rotella thinks all CIOs should be looking at Big Data analytics tools." ... Technology leaders should adopt the attitude that more data is better and embrace overwhelming quantities of it, says Rotella. ... One of the most talked about Big Data technologies is Hadoop, an open-source distributed data processing platform originally created for tasks such as compiling web search indexes. It's one of several so-called 'NoSQL' technologies (others include CouchDB and MongoDB) that have emerged to organize web-scale data in novel ways. Hadoop is capable of processing petabytes of data by assigning subsets of that data to hundreds or thousands of servers, each of which reports back its results to be collated by a master job scheduler. Hadoop can either be used to prepare data for analysis or as an analytic tool in its own right. Organizations that don't have thousands of spare servers to play with can also purchase on-demand access to Hadoop instances from cloud vendors such as Amazon."

    The value of emerging Big Data analytical technologies is not just that they crunch mountains of data; it's that they do it quickly. Fast analysis is the next subject discussed by Carr. He writes:

    "Big Data technologies are one element of a larger trend toward faster analytics, says University of Kentucky CIO Vince Kellen. 'What we really want is advanced analytics on a hell of a lot of data,' Kellen says. How much data one has is less critical than how efficiently it can be analyzed, 'because you want it fast.' The capacity of today's computers to process much more data in memory allows for faster results than when searching through data on disk-even if you're crunching only gigabytes of it. Although databases have, for decades, improved performance with caching of frequently accessed data, now it's become more practical to load entire large datasets into the memory of a server or cluster of servers, with disks used only as a backup. Because retrieving data from spinning magnetic disks is partly a mechanical process, it is orders of magnitude slower than processing in memory. Rotella says he can now 'run analytics in seconds that would take us overnight five years ago.' His firm does predictive analytics on large data sets, which often involves running a query, looking for patterns, and making adjustments before running the next query. Query execution time makes a big difference in how quickly an analysis progresses. ... To improve analytics performance, hardware matters, too. Allan Hackney, CIO at the insurance and financial services giant John Hancock, is adding GPU chips-the same graphical processors found in gaming systems-to his arsenal. 'The math that goes into visualizations is very similar to the math that goes into statistical analysis,' he says, and graphics processors can perform calculations hundreds of times faster than conventional PC and server processors."

    In a business environment that is moving at an increasingly fast pace, results from slow analytic processes can be like reading yesterday's news. Companies no longer have the luxury of poring over data for long periods of time before making decisions. Speed matters. Fortunately, costly supercomputers are no longer necessary to achieve acceptable results for most businesses. Falling technology costs are the next trend discussed by Carr. He writes:

    "Along with increases in computing capacity, analytics are benefitting from falling prices for memory and storage, along with open source software that provides an alternative to commercial products and puts competitive pressure on pricing. [John Ternent, CIO at Island One Resorts,] is an open-source evangelist. ... 'To me, open source levels the playing field,' he says, because a mid-sized company such as Island One can use ... an open-source application ... for statistical analysis. ... The changing economics of computing [is] altering some basic architectural choices. For example, one of the traditional reasons for building data warehouses was to bring the data together on servers with the computing horsepower to process it. When computing power was scarcer than it is today, it was important to offload analytic workloads from operational systems to avoid degrading the performance of everyday workloads. Now, that's not always the right choice. ... By factoring out all the steps of moving, reformatting and loading data into the warehouse, analytics built directly on an operational application can often provide more immediate answers."

    Carr reports that, even though the cost of computing is going down, "potential savings are often erased by increased demands for capacity." That's why so many companies are moving analytical processes to the cloud. Lots of potential headaches and expenses can be lifted off of in-house IT departments and placed on the shoulders of cloud service providers. Before doing that, however, William J. Holstein recommends that companies that are "considering adopting advanced business analytics should: Determine precisely what analytical tools the company needs; weigh the advantages of buying versus building; and, assess your ability to commit the necessary time and resources." ["Analyze This!" Chief Executive, 7 March 2012] Returning to the trends being discussed by Carr, he next writes about mobile applications.

    "Like nearly every other application, BI is going mobile. ... For CIOs, addressing this trend has more to do with creating user interfaces for smartphones, tablets and touch screens than it is about sophisticated analytic capabilities. ... The requirement to create native applications for each mobile platform may be fading now that the browsers in phones and tablets are more capable, says Island One's Ternent. 'I'm not sure I'd invest in a customized mobile device application if I can just skin a web-based application for a mobile device.'"

    The final trend discussed by Carr is social media. He writes:

    "With the explosion of Facebook, Twitter and other social media, more companies want to analyze the data these sites generate. New analytics applications have emerged to support statistical techniques such as natural language processing, sentiment analysis, and network analysis that aren't part of the typical BI toolkit. Because they're new, many social media analytics tools are available as services."

    Although Carr's discussion may lead one to believe that Big Data analytics are only useful and affordable for large companies, Holstein reports that many small- and medium-sized businesses "must [also] handle an astounding amount of data." That means that those companies "can use analytical tools just as the largest corporations can—or the hottest Web-based social media startups or the biggest intelligence agencies with three-letter names." He continues:

    "They can use those tools to eke out real competitive advantages against rivals that haven’t embraced the new capabilities. Even the most advanced tools, such as those IBM developed to such powerful effect with its Watson competitor on the Jeopardy game show, are within reach of companies with $10 million, $50 million or $100 million in annual sales."

    Holstein believes that there are "at least four stages in adopting an analytical system" each of which focuses on a different question. In Stage 1, the question is: "What Will We Analyze?" He writes:

    "It's important to go through a considered thought process before any decisions are made about what type of systems to purchase or develop, says Paul Magnone, co-author of Drinking From the Fire Hose and a 21-year veteran of IBM. There are specialist companies and there are integrators who bring various specializations together under one roof, 'but the step before that is to get a grasp of your business and ask the right questions,' says Magnone. ... Those questions include: What is most important to the business? What matters most to your customers?"

    I have consistently pointed out in my posts that good solutions always begin with good questions. The better the question, the better the solution. Holstein notes that the variety of data sources will make a difference in the vendor or services a company eventually employs. Data integration is always a serious consideration. Stage 2 in adopting an analytical system focuses on the question: "Do We Buy or Build?" Holstein writes:

    "One of the debates in the field is whether small- and mid-size enterprise CEOs should try to develop their own business analytics in cooperation with vendors or simply rely on the outsiders to install systems that essentially 'plug in' to what they already have. The big vendors argue that they have already built hundreds of industry-specific models and can tweak those systems for a particular SME. They can even deliver the services via the cloud, meaning the customer pays for use as he or she downloads or utilizes software and other services. That raises a corollary issue: do you want to take a big plunge on a major expense or do you want to proceed with a step-by-step implementation with a long-term partner? The reality on the ground seems to be that most small company CEOs want to have a hand in developing their analytical capabilities gradually, not in a single big-bang moment."

    One of the reasons that my company, Enterra Solutions, builds modules is that we realize that one-size-fits-all solutions don't normally work (especially for small- and medium-sized companies). Some tailoring is almost always required. Stage 3 in adopting an analytical system focuses on the question: "Are We Ready to Invest?" Holstein writes, "If a CEO decides to co-develop a business analytics system, odds are that he or she will need internal talent to help." That help comes in the form of employees who know the business (i.e., employees who will use the system) and IT people who will be needed to help administer it.

    Stage 4 in adopting an analytical system focuses on the question: "Do We Understand the Impact?" Holstein writes: "The reality is that reaching a certain point of sophistication with business analytics changes the way the company is run and challenges the traditional culture. ... All of which explains why going down the path of business analytics can be so profound." Holstein concludes, "Whatever complexities may exist, the payoffs from the successful implementation of an analytical system are clear." He provides examples of the kinds of returns on investment that companies should expect if they embrace Big Data analytics. Those returns include more business and increased profits.

    April 20, 2012

    Big Data and Language

    Since Enterra Solutions uses an ontology in most of its solutions, the topic of language is of interest to me both personally and professionally. That's why two recent articles caught my attention. The first article discusses how Big Data is being used to discover how the use of words has changed over time. The second article talks about how some executives are taking courses aimed at making them more literate in the language of IT.

    In the first article, Christopher Shea asks, "Can physicists produce insights about language that have eluded linguists and English professors?" ["The New Science of the Birth and Death of Words," Wall Street Journal, 16 March 2012] To answer that question, a team of physicists used Big Data analytics to search for insights from "Google's massive collection of scanned books." The result: "They claim to have identified universal laws governing the birth, life course and death of words." The team reported its findings in an article published in the journal Science. Shea continues:

    "The paper marks an advance in a new field dubbed 'Culturomics': the application of data-crunching to subjects typically considered part of the humanities. Last year a group of social scientists and evolutionary theorists, plus the Google Books team, showed off the kinds of things that could be done with Google's data, which include the contents of five-million-plus books, dating back to 1800."

    Whether or not you are interested in linguistics, this effort demonstrates how powerful Big Data techniques can be for producing new insights. According to Shea, the team's research "gave the best-yet estimate of the true number of words in English—a million, far more than any dictionary has recorded (the 2002 Webster's Third New International Dictionary has 348,000)." Shea continues:

    "More than half of the language, the authors wrote, is 'dark matter' that has evaded standard dictionaries. The paper also tracked word usage through time (each year, for instance, 1% of the world's English-speaking population switches from 'sneaked' to 'snuck'). It also showed that we seem to be putting history behind us more quickly, judging by the speed with which terms fall out of use. References to the year '1880' dropped by half in the 32 years after that date, while the half-life of '1973' was a mere decade."

    This demonstrates the increasing velocity of new knowledge as well as the importance of storing old knowledge. I'm a fan of history and Big Data techniques may eventually help us paint a truer, less biased, history of the world. I'm also a fan of the future and I know that Big Data techniques will help us make that future better. Shea continues:

    "In the new paper, Alexander Petersen, Joel Tenenbaum and their co-authors looked at the ebb and flow of word usage across various fields. 'All these different words are battling it out against synonyms, variant spellings and related words,' says Mr. Tenenbaum. 'It's an inherently competitive, evolutionary environment.'"

    I'm reminded of President Andrew Jackson's quote, "It's a damn poor mind that can think of only one way to spell a word!" He was joined in that sentiment by Mark Twain, who wrote, "I don't give a damn for a man that can only spell a word one way." I suspect those sentiments are also shared by former U.S. Vice President Dan Quayle, who once famously "corrected" elementary student William Figueroa's spelling of "potato" to the incorrect "potatoe" at a spelling bee. Shea continues:

    "When the scientists analyzed the data, they found striking patterns not just in English but also in Spanish and Hebrew. There has been, the authors say, a 'dramatic shift in the birth rate and death rates of words': Deaths have increased and births have slowed. English continues to grow—the 2011 Culturonomics paper suggested a rate of 8,500 new words a year. The new paper, however, says that the growth rate is slowing. Partly because the language is already so rich, the 'marginal utility' of new words is declining: Existing things are already well described. This led them to a related finding: The words that manage to be born now become more popular than new words used to get, possibly because they describe something genuinely new (think "iPod," "Internet," "Twitter")."

    Although the scientists claim that "higher death rates for words ... are largely a matter of homogenization," I wonder if it isn't also a matter of there being more specialized and less generalized education. Shea continues:

    "The explorer William Clark (of Lewis & Clark) spelled 'Sioux' 27 different ways in his journals ('Sieoux,' 'Seaux,' 'Souixx,' etc.), and several of those variants would have made it into 19th-century books. Today spell-checking programs and vigilant copy editors choke off such chaotic variety much more quickly, in effect speeding up the natural selection of words."

    Of course, spell checkers aren't perfect. An anonymous poet penned the following poem to make that point:

    I have a spelling checker
    It came with my PC
    It plainly marks for my revue
    Mistakes I cannot sea
    I've run this poem threw it
    I'm sure your pleased to no,
    It's letter perfect in it's weigh
    My checker tolled me sew.
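
    The poem sails through because a word-by-word checker only asks whether each string appears in the dictionary, and every one of the poem's mistakes is itself a perfectly good word. A minimal sketch:

    # Every 'error' in the line below is a real word, so a word-by-word
    # dictionary check finds nothing to flag.
    DICTIONARY = {"i've", "run", "this", "poem", "threw", "it", "i'm", "sure",
                  "your", "pleased", "to", "no", "it's", "letter", "perfect",
                  "in", "weigh", "my", "checker", "tolled", "me", "sew"}

    line = "I've run this poem threw it"
    errors = [w for w in line.lower().split() if w not in DICTIONARY]
    print(errors if errors else "No errors found")  # -> No errors found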

    Shea reports that the database analyzed by the scientists "does not include the world of text- and Twitter-speak, so some of the verbal chaos may just have shifted online." He continues:

    "Synonyms also fight Darwinian battles. In one chart, the authors document that 'Roentgenogram' was by far the most popular term for 'X-ray' (or 'radiogram,' another contender) for much of the 20th century, but it began a steep decline in 1960 and is now dead. ('Death,' in language, is not as final as with humans: It refers to extreme rarity.) 'Loanmoneys' died circa 1950, killed off by 'loans.' 'Persistency' today is breathing its last, defeated in the race for survival by 'persistence.' The authors even identified a universal 'tipping point' in the life cycle of new words: Roughly 30 to 50 years after their birth, they either enter the long-term lexicon or tumble off a cliff into disuse. The authors suggest that this may be because that stretch of decades marks the point when dictionary makers approve or disapprove new candidates for inclusion. Or perhaps it's generational turnover: Children accept or reject their parents' coinages."

    What I found interesting was that the scientists discovered a "similar trajectory of word birth and death across time in three languages." Even so, they concluded that the field "is still too new to evaluate fully." As is normally the case, not everyone agrees with the conclusions reached by the team. Academics love arguing amongst themselves. Shea reports:

    "Among the questions raised by critics: Since older books are harder to scan, how much of the word 'death' is simply the disappearance of words garbled by the Google process itself? In the end, words and sentences aren't atoms and molecules, even if they can be fodder for the same formulas."

    In our work at Enterra, we understand that every discipline develops its own special lexicon. That's why we work hard to ensure that our ontology understands words in various settings. The IT world is no different when it comes to creating a specialized language that can sound foreign to the technology-challenged. Jonathan Moules reports that some executives are taking courses to help them understand this specialized lexicon. ["Coding as a second language," Financial Times, 28 March 2012] He reports:

    "Alliott Cole sees a large number of tech start-ups in his work as principal in the early-stage investment team of private equity firm Octopus. The trouble is that he often struggles to comprehend what those writing the software that underpins those companies are talking about. 'For several years I have worked hard to understand how new infrastructure, products and applications work together to disrupt markets,' he says, explaining why he recently decided to take a course that claims to be able to teach even the most IT-illiterate person how to create a software application, or app, in just a day. 'While [I am] conversant in many of the trends and the – often confusing – array of terminology, it troubled me that I remained an observant passenger rather than an active driver, particularly in the realms of computer programming.' Mr Cole is not alone."

    The course taken by Cole was created by three former advertising executives – Steve Henry, Kathryn Parsons and Richard Peters – and Alasdair Blackwell, an award-winning web designer and developer, because they felt there was "a widely felt, but rarely discussed, problem. Tech talk is increasingly commonplace in business and life ... but most people, including senior executives, find the language used by software engineers, social media professionals and the 'digital natives' ... baffling." Moules reports that modern technology is changing many industries, so even well-educated people need an occasional refresher to "revisit the basics of how technology functions." After spending a day taking the course with a handful of executives, Moules reports that they all were "happy to leave [programming] to the experts – but now, at least, they feel more confident of being able to talk the same language."

    As the business world enters the age of Big Data, more specialized words are likely to be invented to describe technologies that cannot adequately be described using the current lexicon. There will also be words made up by marketing departments that will catch on. Only Big Data techniques, especially rule-based ontological analysis, are capable of making the connections and providing the insights that will help us make better decisions in the future -- perhaps even decisions about the words we use.

    April 19, 2012

    Supply Chain Disruption: The Reason is the Resin

    Probably few people outside the automobile industry have ever heard of cyclododecatriene (CDT) or one of the materials made from it -- PA-12 (nylon-12). Shortages of CDT and nylon-12, however, are creating concern in the automobile industry. As Jeff Bennett and Jan Hromadko report, "Production shortfalls at a single German auto-parts supplier are beginning to ricochet through the global auto business." ["Nylon-12 Haunts Car Makers," Wall Street Journal, 17 April 2012] They explain:

    "Inventories of the resin are being depleted after an explosion last month at an Evonik Industries AG plant in Marl, Germany, that killed two employees. Evonik describes itself as the only integrated maker of the resin, which is used to make fuel and brake lines."

    According to Bennett and Hromadko, the CDT shortage is severe enough that "more than 200 auto executives met in a Detroit suburb ... to evaluate a looming shortage of a relatively obscure resin essential to modern auto production." Several Evonik executives were reportedly in attendance at the meeting. Since so many automobile manufacturers rely on CDT, it's surprising that they all seem to have relied on a single supplier whose output affects nearly half of global production. That seems like a recipe for disaster. An Evonik spokeswoman stated that it will take at least three months to fully repair the damaged plant. John Reed and Chris Bryant report, "Automakers are scrambling to avert a shock to their global supply chain caused by a shortage of [CDT]." ["Supply chain blow to carmakers," Financial Times, 17 April 2012] Bennett and Hromadko report:

    "Evonik makes 25% of the global supply of the specialty resin known as nylon-12 and supplies a chemical building block to another company that makes a similar amount. The resin is a precise blend of chemicals that can resist reacting with gasoline and brake fluids."

    Reed and Bryant add, "Evonik is one of the industry’s leading producers of PA-12. France’s Arkema, Ems-Chemie of Switzerland and Japan’s Ube Industries are its major competitors." Reed and Bryant report that auto executives at the Detroit meeting "discussed the state of inventories and production capacity of the material, and sought to identify alternative materials or designs to offset expected shortfalls in supply." One of the meeting's participants stated, "It is now clear that a significant portion of the global production capacity of PA-12 (nylon 12) has been compromised." Reed and Bryant write, "The bottleneck highlights the negligible margin for error right now in the global automotive supply chain, which is running on lean inventories three years after the industry's worst crisis in many decades." As I have noted in several past posts, there is a constant tension between those who desire to run companies using lean principles and those who are tasked to manage risk who would prefer operating using resilient principles.

    There is nothing inherently wrong with that tension. Creative tension in business is normally a good thing. When one side dominates the business, however, challenges almost always arise. If lean principles dominate, then supply chain disruptions are almost inevitable in today's business environment that is characterized by long and complex supply chains. On the other hand, if resilient principles dominate, a company can have a difficult time competing since its costs are likely to be higher. To read more about the balance that is required between lean and resilient principles, see my posts entitled Supply Chain Resiliency Still an Issue and Supply Chain Risk Management: Tension between Lean and Resilient Principles.

    Both articles cited above discuss other supply chain disruptions that have recently affected the automobile industry. Bennett and Hromadko report:

    "Last year, production in Japan of Merck KGaA's Xirallic, the shiny pigment in some automotive paints, was disrupted by the March 2011 tsunami and subsequent nuclear power plant problems in Onahama, Japan. Auto makers had to limit or stop taking orders for some cars that used the pigment for certain colors because the plant was the industry's primary supplier of the pigment. The Merck plant was repaired, but disruption rippled through the industry for more than six months."

    Reed and Bryant report that the resin shortage marks "carmakers’ third supply crisis in the space of a year. Last year’s earthquake in Japan and floods in Thailand wreaked havoc on some carmakers’ production by causing shortages of semiconductors, paint pigment and other parts."

    Jay Phillion, an executive with parts maker TI Automotive Ltd., told Bennett and Hromadko that the auto industry was searching for quick alternatives to CDT. Phillion's company "has already warned customers that production disruptions are highly possible should there be no quick solutions." According to Bennett and Hromadko, executives at the Detroit meeting "were divided into separate teams. Each was assigned a task, such as finding a replacement material or identifying new firms to produce it." That sounds a lot like closing the barn door after the horse has already escaped. IHS Chemicals analyst Paul Blanchard told Bennett and Hromadko that "there are alternative materials available but they must be tested and produced on a greater scale." Obviously that takes time, and time is not on the auto industry's side. As Blanchard put it, "I would be surprised if there is more than a month's worth of inventory out there. We are 19 days into this and the scope still has yet to be defined."

    One of the things that a good risk management plan looks at is the perturbative effect of a potential disaster (i.e., how one event can trigger a domino effect throughout the supply chain). The explosion in the Evonik plant has done just that. Bennett and Hromadko explain:

    "On April 10, Arkema SA, another manufacturer of the resin, said shortages of Evonik's building block meant it also wouldn't be able to supply its customers with the resin, Mr. Blanchard said. 'The ability of Evonik and Arkema to find alternate sources of CDT [a resin building block] is very limited and it is doubtful that the CDT shortage can be made up. In the short term auto and truck production will be affected,' he said. General Motors Co. said ... it has a global team from its purchasing, engineering and supply departments working to allocate resin and prioritize its needs. Ford Motor Co., Chrysler Group LLC and Toyota Motor Corp. each said they are monitoring the situation, but have not had any reports of production disruptions."

    Reed and Bryant also reported on the team that General Motors has put in place to look for solutions to the resin disruption. They note that Evonik employees are part of that team. Reed and Bryant further report, "Volkswagen, Daimler, Ford Motor, and Chrysler said that they had seen no impact at their plants yet." They continue:

    "The Japanese crisis was serious enough to impact overall car sales in the US in 2011 and hurt the earnings of manufacturers ranging from Toyota to PSA Peugeot Citroën. After [the Detroit] summit, carmakers and suppliers said they had scheduled a number of follow-up meetings ... to mitigate the impact of the capacity shortfall on their operations. Analysts said that automakers might struggle to find alternative materials or engineer new parts quickly. Because of safety concerns, the design of fuel-injection and braking systems is especially sensitive. 'Fuel, braking and engine components of this type are not easily reengineered,' said Michael Robinet, managing director of IHS Automotive Consulting in Northville, Michigan."

    Evonik is struggling to mitigate damage to future sales caused by this disruption. In addition to participating in working groups and rushing to complete plant repairs, it is trying to downplay the effects of the blast. A spokesman for the company told Reed and Bryant, "While we do expect there to be substantial constraints with respect to our ability to provide supplies of CDT-based product, we are nonetheless confident that we will be able to provide alternative solutions in the form of substitutes." Reed and Bryant report that Evonik "is planning to construct a new PA-12 plant in Asia, but said this would not be ready for three years."

    This appears to be a situation where a little "what if" scenario planning could have had a big effect on how automakers responded to this situation. Clearly, a company that affects nearly 50 percent of the production of an essential product isn't hard to spot and the consequences of a disruption in its operations shouldn't be hard to determine. Yet no such "what if" planning seems to have been done. Adding to the mystery of why "what if" scenarios weren't considered is the fact that "even before the accident at Evonik, some automakers had been seeking alternatives to the material because of rising prices."
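
    Even a crude dependency model would have flagged the exposure. The sketch below walks a simplified supplier graph -- the tiers are condensed from the articles cited above, not a real bill of materials -- to list everything downstream of a single failure:

    from collections import deque

    # Simplified supplier graph: who feeds whom. A real model would be built
    # from actual bill-of-materials and supplier data.
    supplies = {
        "Evonik (CDT)": ["Evonik (nylon-12)", "Arkema (nylon-12)"],
        "Evonik (nylon-12)": ["TI Automotive (fuel lines)"],
        "Arkema (nylon-12)": ["TI Automotive (fuel lines)"],
        "TI Automotive (fuel lines)": ["GM", "Ford", "Chrysler", "Toyota"],
    }

    def downstream_impact(failed_node):
        """Breadth-first walk: every node reachable from the failure point."""
        affected, queue = set(), deque([failed_node])
        while queue:
            for customer in supplies.get(queue.popleft(), []):
                if customer not in affected:
                    affected.add(customer)
                    queue.append(customer)
        return affected

    print(sorted(downstream_impact("Evonik (CDT)")))
    # One upstream failure reaches every automaker in this toy network.

    Running that kind of "what if" analysis over an honest map of the supply base is exactly the homework that appears to have been skipped.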

    The consequences of the explosion at Evonik don't stop with the auto industry. Bennett and Hromadko report, "Plastic parts made of the resin are key components ... in the photovoltaic industry, and in offshore pipelines. Other uses include sporting-goods and household-goods industries." The Evonik disaster is simply the latest chapter in the supply chain risk management book that is recording an increasing number of disruptions to global supply chains. Big Data technologies are now available that can help industries, like the auto industry, do "what if" modeling and planning as well as help mitigate disasters once they arise. We all know that prevention is better than cure; but when the worst happens, the faster the system can be made whole, the better.

    April 18, 2012

    Are We Becoming Less Innovative?

    Ravi Mattu, editor of Business Life, writes, "No one doubts that innovation is essential or that the companies that do it better will thrive." ["Innovation clarion call has a familiar ring," Financial Times, 14 March 2012] Entrepreneur Luke Johnson agrees that "the most desirable quality for any business is to be thought of as innovative" and "that to create jobs and wealth we must generate valuable new intellectual property." ["Time to fire up the cauldrons of creativity," Financial Times, 27 March 2012] Johnson worries, however, that instead of firing up the cauldrons of creativity too many companies are letting them simmer. He writes, "Once, innovation was delivered by leading research and development facilities. But the era of such central laboratories appears to be fading. ... Perhaps the greatest R&D hub was Bell Labs, started in 1925." To read more about Bell Labs, read my post entitled Innovation: The Legacy of Bell Labs.

    Johnson believes that one reason central labs are becoming rare is because "the pace of change in many consumer markets has accelerated. Long-term research horizons are harder to justify for commercial enterprises." He continues:

    "Corporations tend to suffer from inertia. Their structures and cultures are always hard to reform. Meanwhile, greater scrutiny and accountability of public companies gives them less permission to experiment – and possibly lets them be seen to 'waste' resources. Executives all claim they embrace innovation but, in the 21st century, the price of failure within a hierarchy can be too personally expensive. Moreover, why fund research that might make your current products obsolete? Hence few managers are willing to take big risks, so they play it safe instead – and the business gradually ossifies."

    Brenna Sniderman agrees that not all executives are innovative. More than that, she reports that even among those who are innovative there are major differences. ["The Five Personalities of Innovators: Which One Are You?" Forbes, 21 March 2012] She reports:

    "Forbes Insights' recent study, 'Nurturing Europe's Spirit of Enterprise: How Entrepreneurial Executives Mobilize Organizations to Innovate,' isolates and identifies five major personalities crucial to fostering a healthy atmosphere of innovation within an organization. Some are more entrepreneurial, and some more process-oriented – but all play a critical role in the process. To wit: thinkers need doers to get things done, and idealists need number crunchers to tether them to reality. Though it may seem stymieing at times, in any healthy working environment, a tension between the risk-takers and the risk-averse must exist; otherwise, an organization tilts too far to one extreme or the other and either careens all over the place or moves nowhere at all. An effective and productive culture of innovation is like a good minestrone soup: it needs to have the right mix and balance of all the ingredients, otherwise it’s completely unsuccessful, unbalanced — and downright mushy."

    Before briefly discussing the five personality types, Sniderman reminds readers, "None of these are bad. All play crucial roles in developing an idea, pushing it up the corporate channels, developing a strategy and overseeing execution and implementation. These are all pieces of a puzzle, arteries leading to the beating heart of corporate innovation." The five personality types include: Movers and Shakers; Experimenters; Star Pupils; Controllers; and Hangers-On. Despite the fact that Sniderman insists that none of these are bad, there is certainly a pejorative association linked to Controllers and Hangers-On. In fact, the order in which she lists them seems to reflect a value judgment about the relative importance of each kind of personality with regards to innovation. She begins with Movers and Shakers:

    "Movers and Shakers. With a strong personal drive, these are leaders. Targets and rewards motivate them strongly, but a major incentive for this group is the idea of creating a legacy and wielding influence over others. These are the ones who like being in the front, driving projects forward (and maybe promoting themselves in the process), but at the end of the day, they provide the push to get things done. On the flip side, they can be a bit arrogant, and impatient with teamwork. Movers and Shakers tend to cluster in risk and corporate strategy, in the private equity and media industries, at mid-size companies. ... Movers and Shakers can encompass up to one-third of the executive suite."

    The one thing I didn't read in that description is that Movers and Shakers are creative. They are driven, but are they idea people? My gut tells me that many of them are. The next group, Experimenters, appears to be better suited to that role.

    "Experimenters. Persistent and open to all new things, experimenters are perhaps the perfect combination for bringing a new idea through the various phases of development and execution. 'Where there is a will, there is a way,' is perhaps the best way to describe them. They’re perfectionists and tend to be workaholics, most likely because it takes an incredible amount of dedication, time and hard work to push through an idea or initiative that hasn't yet caught on. They take deep pride in their achievements, but they also enjoy sharing their expertise with others; they're that intense colleague who feels passionately about what they do and makes everyone else feel guilty for daydreaming during the meeting about what they plan on making for dinner that night. Because they’re so persistent, even in the face of sometimes considerable pushback, they’re crucial to the innovation cycle. They tend to be risk-takers, and comprise about 16% of executives – and are most likely to be found in mid-size firms. ... Surprisingly, they're least likely to be CEOs or COOs – just 14% and 15%, respectively, are Experimenters."

    I don't find it too surprising that Experimenters prefer to keep busy chasing ideas rather than being overwhelmed with the administrative burdens of the front office. I suspect their absence is more often than not self-selected. The next personality type discussed by Sniderman is the Star Pupil.

    "Star Pupils. Do you remember those kids in grade school who sat up in the front, whose hands were the first in the air anytime the teacher asked a question? Maybe they even shouted out 'Ooh! Ooh!' too just to get the teacher to notice them first? This is the segment of the executive population those kids grew into. They're good at … well, they're good at everything, really: developing their personal brand, seeking out and cultivating the right mentors, identifying colleagues' best talents and putting them to their best use. Somehow, they seem to be able to rise through the ranks and make things happen, even when corporate culture seems stacked against them. Unsurprisingly, CEOs tend to be Star Pupils. What’s most interesting about this group, though, is the fact that, at 24% of corporate executives, they don’t seem to cluster in any one particular job function, industry or company size; rather, they can grow and thrive anywhere: IT, finance, start-ups, established MNCs. They're the stem cells of the business world."

    Recalling that we are talking about personalities associated with innovation, those first three appear to represent the heart and soul of innovation. The next two personality types -- not so much. The first of the two discussed by Sniderman is the Controller.

    "Controllers. Uncomfortable with risk, Controllers thrive on structure and shy away from more nebulous projects. Above all, they prefer to be in control of their domain and like to have everything in its place. As colleagues, they're not exactly the team players and networkers; Controllers are more insular and like to focus on concrete, clear-cut objectives where they know exactly where they stand and can better control everything around them. They comprise 15% of executives — the smallest group overall — and tend to cluster on both extremes of the spectrum: either in the largest enterprises (with 1,000 or more employees) or the smallest (with fewer than 10). This makes sense when you think about it: controllers thrive on overseeing bureaucracy (at larger firms) or having complete control over all aspects of their sphere – at the smallest firms, they may be the business owner who has built an entire company around their personality. Controllers pop up most frequently in sales and marketing and finance, and populate the more practical, less visionary, end of the corporate hierarchy: these are the department heads and managers who receive their marching orders and get to mobilizing their troops to marching."

    Controllers may be important for making a business successful, but they can hardly be considered innovative by most accepted definitions. Hangers-On also fit into that category -- necessary but not innovative. Sniderman writes:

    "Hangers-On. Forget the less-than-flattering name; these executives exist to bring everyone back down to earth and tether them to reality. On a dinner plate, Hangers-On would be the spinach: few people's favorite, but extremely important in rounding out the completeness of the meal. Like Controllers, they don’t embrace unstructured environments, and they tend to take things one step further, hewing to conventional wisdom and tried-and-true processes over the new and untested. When asked to pick a side, Hangers-On will most likely pick the middle. This is not necessarily a bad set of characteristics to have; someone has to be the one to remind everyone of limitations and institutional processes. While they comprise 23% of all executives – the same no matter the company size – they cluster most strongly in the CFO/ Treasurer/ Comptroller role, where 38% are Hangers-On. This makes sense; someone has to remind everyone of budget and resource constraints."

    My guess is that Luke Johnson believes that too many Controllers and Hangers-On are now in charge in the business world. Ravi Mattu, with whose quote I began this post, was writing a book review of Need, Speed and Greed: How the new rules of innovation can transform businesses, propel nations to greatness, and tame the world’s most wicked problems, by Vijay V. Vaitheeswaran. Mattu asserts that from the title of Vaitheeswaran's book, he is obviously a "glass half full" type of thinker (i.e., problems can be solved if we just put our minds to them). Mattu writes that Vaitheeswaran defines innovation as "fresh thinking that creates something valuable," and that he believes innovation "is the key to surviving in the age of 'globalization and Googlization'."

    I'm not sure to what "new rules of innovation" Vaitheeswaran refers and apparently Mattu isn't either. He writes, "The cases, themes, companies and thinkers discussed are well known." Nevertheless, Mattu reports that the book "is divided into three sections" that correspond with its title (i.e., need, speed, and greed). He continues:

    "In 'Need: Why Innovation Matters', he says the combined effect of urbanisation, an ageing population, the rise of emerging markets and a global middle class is placing an unprecedented burden on existing systems. This raises questions for countries – how can the US counter the rise of China? – and big, established companies that 'are facing disruptive threats that could put them out of business altogether'.

    "In 'Speed: Where Innovation is Going', he posits that these challenges can be met only if those who feel under threat rethink how they innovate. 'You must be agile, open, and willing to embrace risk,' he writes. This means throwing out the top-down approach and tapping into new models such as open innovation. He cites approvingly Procter & Gamble's Connect + Develop project, whereby it collaborated with innovators from outside the organisation.

    "In 'Greed: How to win in the age of disruptive innovation', he argues that the 'post-Enron and post-Lehman' view that capitalism is irrevocably doomed is wrong. 'Greed is not only good but also can do great good,' provided 'there are clear incentives to tackle the wicked problems of society'. He points to social entrepreneurs who are driven by both profit and ethical motives in creating new business models."

    I'm not sure whether Johnson's "glass half empty" view or Vaitheeswaran's "glass half full" outlook is the correct one. One thing both agree on, however, is that companies that don't innovate won't thrive and might not even survive in the decades ahead.

    April 17, 2012

    Big Data/Cloud Computing Trends

    There are a number of trends associated with Big Data and cloud computing that are clearly beginning to emerge. In this post, I'd like to discuss a few of them beginning with job growth.

    Job Growth

    Joe McKendrick writes, "A new study commissioned by SAP and conducted by Sand Hill Group speculates that cloud computing — fueled by mobile computing, social networking and big data — may generate as many or more opportunities in the coming years than the Internet itself did in its early years." ["Mobile, social and big data drive cloud computing boom: studies," Service Oriented, 22 March 2012] Although that may sound like a bold statement, companies looking for employees to work with Big Data and cloud computing services know how difficult those employees are to find. McKendrick continues:

    "The study's authors said cloud computing is already generating a sizable number of jobs in the US today, and based on numerous trends and indicators, has the future potential to create very large business opportunities and hundreds of thousands of new jobs. Of course, as anyone who was around during the dot-com craze of the 1990s knows, we've been down this road before with over-the-top industry projections. There's no question that cloud is the hype of the day. Still, the cloud represents a shift in business technology resources that presents both risk and great opportunity for vendors and end-users alike."

    In past posts, I have detailed some of those risks and opportunities. Like most analysts, I conclude that the benefits of Big Data analytics and cloud computing (including software-as-a-service (SaaS) applications) are much greater than the risks involved. One of the reasons that the dot-com era was characterized by irrational exuberance was that people were excited about the potential of increased connectivity. The problem was that too many startups began with no business plan and no real understanding of where all the connectivity was going to lead. I don't see that happening this time around because most Big Data/Cloud Computing activities are business oriented. Businesses were stung during the dot-com era by vendors that over-promised and under-delivered. Businesses are being much more cautious this time around. McKendrick continues:

    "Consider potential job growth, both within IT and the business. For example, the SAP study relates, 11 cloud computing companies added 80,000 jobs in the United States in 2010, and the employment growth rate at these organizations was almost five times than that of the high-tech sector overall. The report cites a previous study out of Bank of America Merrill Lynch Global, which calculated the total number of employees at 11 cloud companies (Amazon, Google, Netflix, OpenTable, Salesforce, Taleo, SuccessFactors, RightNow, Intuit, NetSuite, and Concur) in January 2010 and January 2011. The total number of employees grew 27% during that one-year period, which was an additional 80,000 new jobs. The employee growth at these 11 cloud companies was almost five times the employee growth rate for the high-tech services sector overall, which grew 5.9%, to add about 17,500 jobs during a similar period."

    Those are pretty impressive numbers, especially considering that this job growth took place during economically challenging times. McKendrick reports, "Even more bullish numbers come from new research conducted by IDC and sponsored by Microsoft Corp., which also looked at the economic benefits of cloud computing in the years ahead. Cloud computing will potentially generate at least 14 million new jobs across the globe within the next three years. Moreover, these new jobs may likely be in many areas outside of IT." The areas may be "outside of IT," but they are areas that affect any good Sales and Operations Planning (S&OP) process, namely: "areas such as marketing, sales, finance and administration, production, and service." McKendrick states, "This does not even consider all the new types of jobs that may be created as a result of cloud, perhaps with titles such as 'virtual resources administrator' or 'customer network facilitator.'"

    Increasing Revenue and Savings

    Another trend highlighted by McKendrick is the increasing revenue that is going to be created in the cloud computing sector. Revenue is going to go up because cloud computing is going to help keep costs down. He writes:

    "IDC's research also predicts revenues from cloud innovation could reach $1.1 trillion per year within the next 36 months. The analyst firm estimates that last year alone, IT cloud services helped organizations of all sizes and all vertical sectors around the world generate more than $400 billion in revenue. ... SAP-Sand Hill's report also examined the economic impact on consumers — companies buying cloud services. Cloud computing could save US businesses as much as $625 billion over five years, the study’s authors predict."

    McKendrick reports that the IDC study asserts that "three industry megatrends are propelling the growth of cloud services and employment. They are:

    • "The boom in mobile computing devices such as smartphones and tablets: 'Mobile apps will drive massive demand for cloud services on the back end, such as app stores, databases, and storage. The recent success of tablet devices will further expand the demand for cloud services as these mobile devices give users greater access to information.'
    • "Social networking: 'Such massive scalability and elasticity would not be possible without cloud computing technologies to drive these sites.'
    • "Big Data: 'Cloud infrastructure and platforms will play a huge role in accessing, processing, and analyzing such massive amounts of data. This is where cloud-based systems shine.'"

    Pattern Recognition

    As McKendrick notes, cloud computing shines in the area of Big Data analytics. Big Data can be mined for patterns and insights that provide real value. An interesting emerging trend is the search for even larger patterns than those found within the Big Data itself (i.e., patterns that can be applied to sets of data other than the set from which the pattern was detected). As Quentin Hardy writes, "It's not just about Big Data. For the big players in enterprise technology algorithms, it's about finding big patterns beyond the data itself." ["I.B.M.: Big Data, Bigger Patterns," New York Times, 15 February 2012] Hardy explains:

    "The explosion of online life and cheap computer hardware have made it possible to store immense amounts of unstructured information, like e-mails or Internet clickstreams, then search the stored information to find some trend that can be exploited. The real trick is to do this cost-effectively. Companies doing this at a large scale look for similarities between one field and another, hoping for a common means of analysis. When it comes to algorithms, 'if I can do a power grid, I can do water supply,' said Steve Mills, I.B.M.'s senior vice president for software and systems. Even traffic, which like water and electricity has value when it flows effectively, can reuse some of the same algorithms."

    Mills calls this reutilization of algorithms "leveraging the cost structure of new mathematics." What I like about this trend is that it encourages cross-disciplinary collaboration. Hardy explains:

    "That kind of cross-pollination is reminiscent of the way Wall Street, starting in the 1990s, hired astrophysicists and theoretical mathematicians to design arcane financial products. Now the cost of computing has come down so much that it is useful to bring such talent to other industries. I.B.M., Mr. Mills said, is now the largest employer of Ph.D. mathematicians in the world, bringing their talents to things like oil exploration and medicine. 'On the side we’re doing astrophysics, genomics, proteomics,' he said. In the last five years, I.B.M. has spent some $14 billion purchasing analytics companies, in the service of its Big Data initiative. 'We look for adjacencies' between one business and another, said Mr. Mills. 'If we can't get an adjacency, we'll never get a return.' The trend of looking for commonalities and overlapping interests is emerging in many parts of both academia and business."

    An exciting side benefit of discovering "adjacencies" is that many of the best innovations occur at the intersections (or along the borders) of disciplines. To put it another way, discovering adjacencies could generate a spike in innovation as well as an increase in profits.
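    To make the "adjacency" idea concrete, consider a minimal sketch in Python: one anomaly-detection algorithm reused, unchanged, across two of the "flow" domains Mills mentions. The readings, units, and threshold below are invented for illustration; this is a toy stand-in, not IBM's actual analytics.

```python
from statistics import mean, stdev

def flag_anomalies(readings, z_threshold=2.0):
    """Return the indices of readings lying more than z_threshold
    standard deviations from the mean of the series."""
    mu, sigma = mean(readings), stdev(readings)
    return [i for i, x in enumerate(readings)
            if sigma and abs(x - mu) / sigma > z_threshold]

# The same algorithm applies to any "flow" domain -- power, water, traffic.
power_load = [410, 405, 412, 408, 900, 411, 407, 409]  # megawatts (hypothetical)
water_flow = [3.1, 3.0, 3.2, 9.8, 3.1, 3.0, 3.2, 3.1]  # cubic meters/sec (hypothetical)

print(flag_anomalies(power_load))   # [4] -- the 900 MW spike
print(flag_anomalies(water_flow))   # [3] -- the 9.8 m3/s surge
```

    The algorithm knows nothing about electricity or water; only the data changes. That is the economic point of an adjacency: the development cost of the mathematics is paid once and amortized across industries.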

    Visualization

    Mining data for insights is only the front half of the challenge and, by itself, insufficient to add value to a company. Those insights (or other analytical products) need to be presented to decision makers in a way that is both useful and informative. Insights that aren't used are worthless. Jeff Kelly puts it this way, "As the infrastructure layer continues to mature, vendors and increasingly enterprises are turning their attention to the real value proposition of Big Data – namely, deriving actionable insight via Big Data Analytics and Visualization." ["Hadoop, Big Data Focus Shifting To Analytics and Visualization," Wikibon Blog, 26 October 2011] He continues:

    "That’s not to say Big Data infrastructure isn't important or doesn't need improving – clearly tasks like writing and managing complex Map Reduce jobs and networking racks of Hadoop nodes still need simplifying – but that it has reached a maturity level where it is now practical for may enterprises to shift at least some of their focus to analyzing and making use of the data in addition to processing and storing it. ... To reiterate, there's still plenty of work to do on the infrastructure layer of Hadoop and other Big Data approaches. ... But the focus of the Big Data industry is – and should be – moving to include analytics and visualization. This is especially important for enterprises. Hadoop and other Big Data approaches, while still somewhat novel, should not be treated as some off-to-the-side science project. Enterprises should apply Big Data approaches like Hadoop only when they’ve identified areas where Big Data will help soothe a significant pain-point and/or bring real business value. And this requires analytic/visualization platforms and applications, tools that provide insights from Big Data that facilitate innovation such as identifying new market opportunities or helping create new products."

    Conclusions

    As I noted above, the difference between the dot.com era and the Big Data/Cloud Computing era is that companies providing hosting, application, and analytic services are going to succeed because they support traditional business objectives rather than ignore them, as some companies did during the dot.com era. It should become clearer each day that the Big Data/Cloud Computing era is not a passing fancy; rather, it's the next big thing that is going to change how businesses operate in the decades ahead.