In 1927, Werner Heisenberg published his famous uncertainty principle, which states "that it is impossible to determine simultaneously both the position and velocity of an electron or any other particle with any great degree of accuracy or certainty. ... The more precisely one property is measured, the less precisely the other can be measured. In other words, the more you know the position of a particle, the less you can know about its velocity, and the more you know about the velocity of a particle, the less you can know about its instantaneous position." ["Uncertainty Principle," Wikipedia]. In more popular terms, Heisenberg's principle has come to stand for the claim that the simple act of observation affects what is being observed -- strictly speaking, that is the observer effect rather than the uncertainty principle itself, but the two ideas are often conflated. As a general principle, most people believe that when someone is observed their behavior changes. That is why stores install surveillance cameras and some employers use closed-circuit television to watch their employees.
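For readers who want the formal statement behind the verbal description above, the principle is conventionally written as a bound on the product of the uncertainties (standard deviations) in position and momentum -- momentum, strictly, rather than velocity:

```latex
\sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}
```

Here σx and σp are the standard deviations of repeated position and momentum measurements, and ħ is the reduced Planck constant; the smaller one uncertainty is made, the larger the other must become.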
A famous series of experiments begun in 1924 at the Hawthorne Plant outside Chicago was designed to discern whether behavior (specifically, productivity) would change if working conditions were subtly altered. The study concluded that it did, and the change in the behavior of employees at the plant became known as the "Hawthorne Effect." Basically, the Hawthorne Effect is a noted increase in production following changes in employee working conditions -- in the initial experiment, a change in lighting. Over the years, popular interpretations of Heisenberg's uncertainty principle have been questioned, and last year so was the Hawthorne Effect ["Light work," The Economist, 4 June 2009]. The article reports:
"When America’s National Research Council sent two engineers to supervise a series of industrial experiments at a large telephone-parts factory called the Hawthorne Plant near Chicago in 1924, it hoped they would learn how shop-floor lighting affected workers’ productivity. Instead, the studies ended up giving their name to the 'Hawthorne effect', the extremely influential idea that the very act of being experimented upon changes subjects’ behaviour. The idea arose because of the perplexing behaviour of the women who assembled relays and wound coils of wire in the Hawthorne plant. According to accounts of the experiments, their hourly output rose when lighting was increased, but also when it was dimmed. It did not matter what was done; so long as something was changed, productivity rose. An awareness that they were being experimented upon seemed to be enough to alter workers’ behaviour by itself."
Frankly, that sounds like a reasonable conclusion. The article reports, however, that when data from the original experiment were rediscovered and rigorously analyzed, the so-called Hawthorne Effect proved difficult to justify. The article explains why:
"It turns out that idiosyncrasies in the way the experiments were conducted may have led to misleading interpretations of what happened. For example, lighting was always changed on a Sunday, when the plant was closed. When it reopened on Monday, output duly rose compared with Saturday, the last working day before the change, and continued to rise for the next couple of days. But a comparison with data for weeks when there was no experimentation showed that output always went up on Mondays. Workers tended to beaver away for the first few days of the working week in any case, before hitting a plateau and then slackening off. Another of the original observations was that output fell when the trials ceased, suggesting that the act of experimentation caused increased productivity. But experimentation stopped in the summer, and it turns out from the records of production after the experiments that output tended to fall in the summer anyway. Perhaps workers were just hot."
The article concludes, "For something so influential and intuitively appealing, it turns out that the Hawthorne effect is remarkably hard to pin down." Change, and how it occurs, is of great interest to innovators, whether it happens at the quantum level of physics or on the production floor of a factory. "Change is at the heart of innovation—no innovation happens without change, and innovation changes the status quo," writes Renee Hopkins. "Understanding how to make change happen, then, is of critical interest to innovators." ["How Change Happens," Bloomberg BusinessWeek, 19 April 2010]. How change occurs and who is responsible for making it happen are questions that continue to fascinate researchers. John Kay writes:
"We despise geeks – but we are also intimidated by them, and they retain a powerful influence on our thinking. When we talk about innovation, we visualise men and women in white coats with test tubes and microscopes. Outside many university cities around the world there are biotechnology estates established by governments that believe high technology is the key to a competitive future. The funds that governments provide to support innovation are all too often appropriated by large companies that are better at forming committees to pontificate about what the global village will want in the future than they are at assessing what their customers want today. ... Pioneers of innovation are routinely pushed aside by competitors whose skills are in the marketplace rather than the laboratory. The invention of the body scanner won a deserved Nobel Prize for EMI’s Godfrey Hounsfield, but almost destroyed the company. The market for scanners is now shared by Siemens and GE." ["Innovation is not about wearing white coats," Financial Times, 16 December 2009].
Kay makes a good point, and it's one that I've stressed in the past. To be considered an innovation, an idea must be new, it must have value to someone (i.e., there must be customers for it), and it must be realized (i.e., brought to market). That is why Kay points to Steve Jobs and Apple as examples of innovation. He writes:
"Apple is the most innovative consumer products company of the last decade. It has redefined how people listen to music, blindsiding both music publishers and established electronics manufacturers. And it has reinvented the telephone. ... Innovation is about finding new ways of meeting consumers’ needs, including ones they did not know they had. Sometimes it comes from a laboratory scientist but, more often, the innovation that changes the business landscape comes from the imagination of a Henry Ford or Walt Disney, Steve Jobs or Sir Stelios Haji-Ioannou [founder of EasyJet PLC]. But understanding the needs of customers is what distinguishes innovation from novelty."
Kay also points out that there can be a difference between supporting research and development (R&D) and supporting innovation. He concludes:
"[Britain's] National Endowment for Science, Technology and the Arts picked up this point. For years research and development scorecards have dutifully recorded how much pharmaceuticals companies spend on the search for new drugs and the expenditure of governments on defence electronics. But a Nesta report, presenting plans for a new innovation index, has now recognised that most of the spending that promotes innovation does not take place in science departments. The financial services industry may have been Britain’s most innovative industry in the past two decades – perhaps too innovative – but practically none of the expenditure behind that innovation comes under 'R&D'. And the same is true in retailing, media and a host of other innovative industries. Support for innovation is not the same as support for R&D. Important contributions to commercial innovation come from new businesses such as Easyjet, which see opportunities that others have missed. Most of these opportunities do not actually exist and the innovations fail. But only a few such entrepreneurs have to be right to change the face of business. Other innovations come from successful companies, such as Apple, which may not be at the frontiers of science but are in close touch with consumers. Like all business success, innovative success is based on matching capabilities to market."
In a post entitled "Not All Small Businesses are Created Equal," I noted that only 2 to 3 percent of entrepreneurial companies can be considered superstars -- companies that change industries, have high growth rates, and hire significant numbers of employees. Kay is arguing that superstar companies are those that manage to match innovative success with market success -- a task more easily identified than achieved. Ade McCormack argues that the companies best positioned to become superstars will be those whose employees have the strongest information technology (IT) skills ["e-Skills can make ideas into innovation," Financial Times, 16 February 2010]. He writes:
"In five years, only 10 per cent of the jobs in the EU will not require IT skills, says a recent study by IDC, a market research and analysis firm owned by Pearson, which also owns the Financial Times. So unless your organisation is either technology-free or people-free, it is likely that e-skills will be relevant to between 90 and 100 per cent of your staff (and your boardroom)."
McCormack tells us that he was "asked to produce a book entitled The e-Skills Manifesto – A Call to Arms." Researching the book was, he writes, a real "eye-opener." He continues:
"The important point is that these skills are not only needed in the IT department. Increasingly, essential tools for other highly skilled users are IT-based. For example: air traffic control management, predicting the price of corn, or designing cars that cost less than $1,000. With the simplification of technology, and increasing ease of access (via the cloud/internet), the roles of users and technologists will become blurred – thus increasing the likelihood that we will witness the industrial equivalent of the fall of the Berlin Wall. ... Those that choose to remain on the technology side of the divide will need to have sustainable e-skills – the competence needed to develop and maintain systems in an environmentally friendly manner."
McCormack worries that unless society can convince more women to get involved in computer science, the workforce of the future will lack enough skilled employees to sustain innovation. He writes:
"The IT industry has done little to attract women. Given that there is a global shortage of skilled workers, it seems ridiculous that we are excluding half the resource pool. This is an issue that needs attention at the primary education level. The truth is that IT skills are increasingly required for work and social inclusion in general. They are fundamentally enabling. This is not enough: e-skills become genuinely useful when they form the basis of innovation – whether this innovation takes the form of greater operational efficiency, heightened customer experiences or simply market domination. In a global market, if you do not have innovation processes in place, you are flying blind. Creative ideas are 10 a penny, genuine innovation is gold dust."
In a couple of previous posts [Nerds with Curves and Web Socializing], I noted that women are increasingly involved in Web 2.0 activities (as users rather than programmers), while men are drawn more often to programming and gaming. McCormack argues that women need to get into programming as well -- and I agree wholeheartedly. He continues:
"Now we have made the link between people and business sustainability – via e-skills and innovation–let us look at what action businesses need to take:
• Ensure the management team understands the role e-skills will play in success;
• Synchronise the HR and IT departments to ensure all staff have the necessary e-skills;
• Put schemes in place to harness the capabilities of minority and excluded groups. Great if it brings good PR as well, but that is not the point;
• Ensure managers are capable of managing a high-tech workforce. Give special attention to those who manage technical staff, as this is traditionally an area of weakness;
• Put processes in place to industrialise innovation in the organisation. This is already quite common in manufacturing, but more often than not is decoupled from the IT function rather than entwined with it;
• Outsource aggressively. Why reinvent the wheel if others can do it better and cheaper? It’s not about headcount;
• Increasingly, students studying abroad are likely to find more interesting opportunities for employment in their homeland, particularly those from India and China. Do what is required to encourage these people to build their career with you rather than your global competitors."
McCormack's comments should cause human resource directors to think about what kind of training will be required in the years ahead as well as what kinds of recruits they go after. If Richard Florida, a professor of business and creativity at the University of Toronto’s Rotman School of Management, is correct, the global economy is ripe for innovation. He "argues that economic bust is usually followed by innovation boom, resulting in better living standards. It is a view that owes much to Joseph Schumpeter’s contention that downturns represent 'creative destruction'." ["How recession can lead to a boom in innovation," by Richard Reeves, Financial Times, 29 April 2010] Reeves' article is a review of Florida's book entitled The Great Reset: How New Ways of Living and Working Drive Post-Crash Prosperity. With the world "tipped into the worst recession for a generation, [and] with rising unemployment and sovereign debt crises among its woes," Florida argues that we should look forward to a "great reset." This reset "could result in ways of living and working characterised by 'greater flexibility and lower levels of debt, more time with family and friends, greater promise of personal development, and access to more and better experiences'." Sounds great, doesn't it? But innovation doesn't occur spontaneously -- it takes innovative people. And who are these people? Reeves gives us Florida's answer:
"Echoing the themes of his book The Rise of the Creative Class (2002), Florida suggests that cities dominated by educated, mobile workers are the future: 'The places that thrive today are those with the highest velocity of ideas, the highest density of talented and creative people, and the highest rate of metabolism.' Part of the new 'spatial fix' will be greater mobility, underpinned by two necessary reforms. First, high-speed rail to link urban areas to form megacities. Fast rail would cut the commute from Philadelphia to New York to half an hour, and Portland, Seattle and Vancouver would become a single labour market. ... The second need is an end to the obsession with home ownership that blights the US and British economies. Florida points out that mortgage lending made the capital markets poisonous, and that the impact of falling house prices has been to trap people in their debt-funded homes. 'The most staggering damage caused by the housing crisis may not be the impact on the financial markets, it may be the long-run competitive disadvantage caused by the inability to relocate the labour force to where the jobs of the future lie,' he says. Florida wants the US government to redirect some of the $230bn of tax breaks for homeowners towards support for the private rented sector. Cities are also the best hope for greener living. Manhattan dwellers have a 30 per cent smaller carbon footprint than their fellow citizens. Four out of five of them travel to work by public transport. Florida also hopes more of us will use one of the inventions of the 'first reset' – bicycles – to get around. Big changes surely do lie ahead after the recent economic crunch. But his conclusions – a world of trains, cities and bicycles – have a nostalgic feel. A rewind, rather than a reset."
I don't claim to know what the future holds. As an optimist, I believe it will bring us more good things than bad. Florida is correct about one thing: educated, mobile workers are the future. That mobility may not come via bicycles or high-speed trains (it may come via virtual commuting), but educated workers, skilled in IT, will be critical for every economy that hopes to flourish in the decades ahead.