
February 29, 2012

Coming to Grips with the Shortage of Trucking Capacity

Yesterday's post, entitled Trucker Shortage Continues to Garner Headlines, focused on what most analysts consider an acute problem -- a growing shortage of truck drivers. But the driver shortage is only half of what is causing the capacity crunch. There is also a shortage of equipment (i.e., tractors and trailers), and it must be addressed as well. To address the driver shortage, trucking company executives believe that driver pay must be increased some 30 percent in order to attract new drivers into the sector. Such an increase would likely mean an 11 percent increase in freight rates. To address the capacity shortage, companies need to buy more trucks. But it doesn't make sense to buy trucks if there are no drivers for them. Nevertheless, let's look at the capacity challenges.

An article in American Shipper notes, "Almost all new truck purchases are to swap out old equipment, according to trucking executives. A recent survey of 125 motor carrier officers conducted by Transport Capital Partners, a strategic advisory firm specializing in trucking, found that 73 percent expected to add between zero and less than 5 percent capacity in the next 12 months, compared to only 60 percent who responded that way in May. And there was a 90 percent drop since May, from roughly 28 percent to 3 percent, in the number of companies that planned to augment their fleets by 16 percent or more." ["Where did all the trucks go?," 24 October 2011] Obviously, capacity can't grow if the only equipment being bought is to replace older equipment.

The current economic situation is probably the main reason that trucking firms are hesitating to buy new equipment. The article reports, "Modern Class 8 tractors are 30 percent more expensive than ones being replaced ($125,000 compared to $85,000 10 years ago)." As a result, "Investment in new vehicles remains difficult for many companies because of the high initial cost for trucks that carry clean emissions technology, higher maintenance costs and tight credit from banks." The challenges don't stop there. Even if companies were ready to buy new vehicles to expand their fleets, the vehicles really aren't available. The article reports:

"About 12,000 trucks need to be produced each month just to maintain the industry's fleet at its current size and there [were] only ... four months [in 2011], as of August, in which truck production ... exceeded that level. 'And the reality of it is that it's not a shortage that you can get over tomorrow. The equipment manufacturers are running flat out just to do replacements but they're really reluctant to open any new facilities because of the cyclical nature of the industry,' [Lana Batts, President of Transport Capital Partners], said. 'So it’s going to take a long time to work our way through five years of not producing 12,000 trucks a month,' which means rates, accessorial charges and driver wages will go up in the near future, she added."

One indication that capacity is unlikely to improve over the short term is the report that "Sweden-based global truck and bus maker Scania is cutting its production rates back a further 15% and plans to lay off some 1,000 workers as it said demand for heavy vehicles continues to decline in Europe, Latin America, and elsewhere." ["Scania to cut truck production levels," by Sean Kilcarr, Fleet Owner, 21 December 2011] The importance of having new equipment should not be underestimated. The article notes, "Having a modern fleet helps with recruiting because drivers don't want trucks that break down and prevent them from hitting the mileage targets upon which their pay is based, he said." The article goes on to note that shippers are starting to take notice and action. It reports:

"Astute shippers have been strengthening their relationships with core carriers the past couple of years in anticipation of capacity problems. The idea is to give consistent business to certain carriers, accept fair rates and reduce dock-wait times so that a shipper is rewarded when decisions have to be made about which customer to serve first during busy times. Creating more regular routes reduces ambiguity for carriers and lets them plan how to use their assets more effectively."

Lora Cecere also believes that something has to give, and she thinks companies should ask, "So what do I do about it?" Doing something now is important because she believes that "power in the supply chain is shifting to the logistics providers." ["Keep on Truckin?" Supply Chain Shaman, 28 April 2011] She recommends that companies pursue five tactics that are in line with those recommended above by American Shipper. The first tactic involves being a company with which providers find it easy to do business. She writes:

"#1. It is going to get worse not better. Act now. Be easy to do business with. The old saying, 'when the going gets tough, the tough get going', has never been truer. Companies that understand the true nature of the transportation issues are investing in building long-term relationships with carriers. This includes the use of transportation planning systems to speed tendering, building continuous loops and managing empty miles, driving improvements in supply chain execution at the dock, in drop yards and in continuous shipping operations. They are making it easier for the carriers to get the loads, move trucks more easily in and out of the warehouse facilities and making their operations easier to do business with. Seven-day shipping operations are becoming the norm."

Being easy to do business with doesn't happen with the wave of a wand; it takes planning and effort. Improving planning is the second tactic recommended by Cecere. She writes:

"#2. Improve planning. The early bird gets the worm. Companies are speeding time to tender by 30% through the use of advanced planning tools and business-to-business connectivity. This includes the use of transportation forecasting applications, direct integration of transportation planning systems to inbound and outbound manufacturing planning systems, and quicker processing of orders. In January 2011, the Supply Chain Leaders Association polled 45 supply chain leaders, 46% responded that improving supply chain optimization was the most important factor to improve to drive growth. ... The use of transportation forecasting improves carrier visibility of an order by a day. The need to forecast transportation is obvious."

The three things that all companies need to concentrate on are processes, technology, and people. Cecere has already addressed the first two. She turns next to people.

"#3. Educate, forecast and build a guiding coalition. Educate your management team quickly on the new reality because the paradigm has changed. To do this, look outside in. I recommend the use outside benchmarking to help your team quickly see that freight costs, availability, and increasing variability are the new reality."

The next tactic recommended by Cecere addresses the speed at which things happen in today's business environment.

"#4. Improve decision velocity. Freight is no longer the tail that wags the dog. Times are tough for the carriers. Remove the barriers. Improve time to decision for freight payments, fuel charge adjustments, and accessorial charge. Invest in analytics to get at the data in Enterprise Resource Planning and Supply Chain Planning Systems that is largely untapped. Make decisions quicker. Make decisions better. Improve the working capital in the transportation supply chain."

Cecere's final recommended tactic involves taking a broader, holistic view of the supply chain network and taking responsibility for it.

"#5. Take responsibility for your ENTIRE network. Today, 33% of companies have central teams that plan and execute freight contracts. This consolidation of freight requirements enables a company to take control of THEIR entire network and gain economies of scale. Even though you have outsourced manufacturing, leaders still take responsibility for their network and the movement of the freight."

American Shipper indicates that some shippers are turning to private fleets and dedicated contract carriage to ensure that disruptions don't occur. It reports:

"Retailers and manufacturers, in an effort to create more transportation stability, are also expanding their use of private fleets and dedicated contract carriage in which a carrier sets aside a certain number of vehicles and drivers to transport cargo for a customer on a regular route. Most dedicated contracts have surge protection for the shipper to accommodate spikes in volume, Larry Menaker, who heads his own eponymous consulting firm, said. ... Some private fleets have created extra capacity in recent years by selling empty space in their trailers on return trips to others needing transportation to the same general destination. ... Shippers can further redesign their supply chains to reduce their transportation costs and attract carriers by adjusting production schedules to produce loads when motor carriers have room and shipping in off-peak months when trucks are more plentiful, [freight industry economist Noël Perry] said. There is about a 15 percent swing in available capacity between the highest and lowest volume months of the year, he noted."

Clearly, there are no silver bullet solutions to the looming trucking capacity shortfall. The solution, however, appears to begin with attracting more drivers into the business. Inevitably, that means that shipping rates must rise (even if diesel prices stabilize or alternative fuels finally come into vogue). Shipping companies are pushing for permission to carry heavier loads as one way of increasing capacity, but that effort is facing some opposition. For more on that topic, read the Trucking Sector portion of my post entitled The Supersized Supply Chain. Shippers need to begin taking action now to avoid the potentially devastating consequences of not being able to ship goods in a timely and affordable way.

February 28, 2012

Trucker Shortage Continues to Garner Headlines

Mark Yonge, vice chairman of the Marine Highways Cooperative, told the editorial staff at SupplyChainBrain that "the continuing driver shortage in trucking and ever more congested roadways argue in favor of using marine highways." ["The Importance of Marine Highways to Cargo Transportation," 13 October 2011] That may (or may not) be true (for a more thorough discussion, see my post entitled Upgrading America's Maritime Infrastructure). The point is that a number of analysts continue to insist that a shortage of truck drivers could have a significant negative impact on supply chains in the months and years ahead. For example, Lora Cecere writes, "We have always assumed that supply chains can keep on trucking, but has all this changed? Supply chain applications matured based on the assumption that manufacturing was a constraint and transportation was abundant. Transportation is now anything BUT abundant." ["Keep on Truckin?" Supply Chain Shaman, 28 April 2011] She continues:

"What has changed? Why is it so important now? The answer is three fold: equipment, labor and regulation. Equipment shortages abound, and driver shortages are now acute. With the rise of gasoline, these shortages are coupled with escalating pricing. It is unprecedented. Historically, transportation availability was a given; and as a result, the focus was on getting the best price. Traditional supply chain processes focused on manufacturing constraints and minimization of these impacts. As a result, this new wave of transportation issues are catching some supply chain leaders by surprise, and our historic supply chain policies are not aligning the functional teams to solve the problem. Instead, they are often putting supply chain functions at odds."

When anyone, especially someone as knowledgeable as Cecere, calls a problem "acute," you know it's time to pay attention. She insists, "Now is the time to act." She goes on to point out that a number of factors are creating a perfect storm for supply chains. "Available trucks for over-the-road shipments in the United States have never been tighter," she writes, "however, companies still want to negotiate price using strong-arm tactics. The price of oil is at a record high, road congestion has never been higher, but customer dock requirements have never been stricter." However, the one factor that is on everybody's mind is the shortage of truck drivers. Cecere explains:

"Based on the Council of Supply Chain Management Professionals (CSCMP), there is an expected shortage of drivers of 400,000 by the end of 2011. This will be acerbated by retirement. In the United States, 1 in 6 drivers are nearing retirement age of 55 in 2011. Yet, there are few young drivers. The number of drivers under the age of 35 is less than 25%. There is also a shortage of equipment. Over 2000 trucking companies went out of business in the Great Recession of 2009, and available over the road truck capacity is expected to fall another 40% between 2009 and the end of 2011. Increasing regulation and government intervention will make this a stewpot that will boil over."

Cecere has some recommendations for how companies can deal with the trucking challenge, but I'll discuss those in a future post. In this post, I want to concentrate on what commentators are saying about the driver and truck shortage. An article in American Shipper asserts, "The commercial truck scarcity predicted for the past two years is starting to become reality, forcing shippers to rethink how to keep their supply networks functioning smoothly." ["Where did all the trucks go?," 24 October 2011] The article continues:

"U.S. truck capacity is so tight that analysts and trucking executives are warning of serious shortages at the slightest sign of improvement in the economy. 'Sometime in 2012 there is a reasonable probability of sporadic supply chain failures based on capacity,' freight industry economist Noël Perry ... said at the Council of Supply Chain Management Professionals’ annual conference in Philadelphia. Transportation constraints, once isolated to certain long-distance truckload routes, are now widespread as demand increases for trucks to move goods to and from ports, distribution centers, farms and factories, according to industry experts."

Like the analysts cited above, the folks at American Shipper insist "the capacity crunch is a function of two variables — not enough equipment and drivers." Concerning equipment shortages, they write:

"During the economic downturn, which started in 2007 for the trucking industry, motor carriers significantly reduced their fleets by selling used tractors at home or overseas and deferring purchases of replacement equipment. Thousands of trucking companies, most of them small outfits, went out of business. And investment in new vehicles remains difficult for many companies because of the high initial cost for trucks that carry clean emissions technology, higher maintenance costs and tight credit from banks. The truckload industry lost about 20 percent of its capacity since late 2006, but shippers didn’t feel the impact because freight tonnage plunged about 30 percent. Supply and demand are almost in equilibrium now as the economy has rebounded from its depths in 2008. Tonnage is almost at the same level as in late 2006. That means motor carriers are busy keeping up with orders. The shortage is most acute in the flatbed sector because it was the first to feel the impact of the freight recession that began five years ago and pulled back on equipment buys, experts say."

Turning to the shortage of drivers, the article covers much of the same information provided by Cecere above.

"More than 13 percent of the driver workforce left the industry during the past four years and there are fewer young people entering the market to replace an aging driver pool. The trucking industry was short about 190,000 drivers during the middle of last decade, when the economy was strong. That figure is about 125,000 today because the recession and slow growth have temporarily suppressed the need for trucks, according to Perry. Logistics professionals warn that pending federal safety regulations for trucking will exacerbate the capacity shortfall and raise motor carrier costs. The Federal Motor Carrier Safety Administration’s new Compliance, Safety and Accountability initiative has increased the level of motor carrier scrutiny by scoring and ranking companies based on more timely and accurate data from roadside inspections, state-reported crashes and periodic safety audits. The new system for the first time creates a mechanism to track violations of individual drivers, which will now be factored into a carrier’s overall safety score. Carriers are becoming more particular about the drivers they choose because driving records will stay on their ledger even if an employee leaves the company and because shippers are selecting safe transportation providers to minimize any potential liability from an accident."

For more information about the Comprehensive Safety Analysis (CSA) 2010 program, read my post entitled Truck Transportation Safety and the Supply Chain. Another regulation change that could exacerbate the driver shortage involves hours-of-service. The article explains:

"There is also trepidation about new hours-of-service rules that would reduce the driver’s workday by an hour to 13 hours and set limits on the use of the 34-hour rest period that restarts the weekly duty clock. The agency is also considering reducing the daily duty period behind the wheel from 11 hours to 10 hours."

The article was written before the new hours-of-service rules were actually announced. To learn more about what regulation changes were made, read my post entitled Trucking Outlook for 2012. According to the article, providing drivers with better working conditions and more pay may be the only way to attract new drivers into the industry. It states:

"Industry observers and executives say that raising driver pay and getting truckers home more often are the only solutions to the driver shortage, but Perry said carriers won’t do that until freight rates go up at least 10 to 15 percent. Scheduling long-haul drivers to be home every week instead of every other week adds $30,000 to $40,000 in annual cost per driver, he said."

How much will drivers' pay have to increase? "According to [the latest quarterly report from Transport Capital Partners, based on a survey of CEOs and other executives at truckload carriers], 65% of the trucking executives surveyed said that wages must rise to more than $60,000 annually to attract and retain drivers. That's versus average wages of about $48,000 in 2011, according to US Bureau of Labor statistics." ["Carrier CEOs Says Driver Pay Must Rise to over $60,000 - but will Shippers Come Along?" Supply Chain Digest, 25 January 2012] The article states:

"There is a growing sense of urgency about a current shortage of over-the-road truck drivers, with deep concern that the problem is likely to get much worse over the next few years, causing a capacity crisis in the industry not for shortage of tractors or trailers but the drivers needed to move them. ... Analysts at FTR Associates, a transportation related research firm, have estimated the driver shortage in 2012 will be about 180,000, with several pundits, such as transportation economist Noel Perry, saying that could rise to as high as 350,000 over the next few years. That would be similar to levels seen in the 2005 era, when the driver shortage played a strong role in the extreme capacity crunch that drove rates much higher and caused much shipper angst for nearly two years."

The article concludes by estimating what kind of impact rising driver wages will have on freight rates. It reports:

"If we ... said a 30% increase in wage rates are needed in the industry, how would that filter down into increases in the rates shippers pay? A recent study by the American Transportation Research Institute found that driver wages and benefits comprise 36.5% of a carrier's total cost per mile. If those costs were to rise by 30%, it would mean the total cost per mile would increase by about 11% (.365 x .30). (Note: we are also allowing the benefits costs to rise by 30% as well given health care cost escalation, etc.) Shippers could then expect freight rates to also rise by about 11% based on the increased driver wages alone, before any other cost factors."

With the economy still relatively flat, an 11 percent increase in shipping rates is a very inflationary number. It is doubtful that shippers can absorb that kind of rate increase without passing some or most of it along to consumers. No wonder the shortage of drivers continues to garner headlines.

February 27, 2012

Advances in RFID Tagging

Bob Trebilcock, Executive Editor of Modern Materials Handling, notes that there is a "continued need for greater visibility into the supply chain." ["Bringing sensors and RFID together," 28 June 2011] Greater visibility, however, requires better information and better information sharing. Technology plays a central role in improving visibility, and Trebilcock asserts that the technologies that do the best job "are data-hungry beasts." He writes, "To get those greater levels of visibility, you have to constantly feed data to the beast, and collect data from more and more points across the supply chain." He singles out a company called TempTrip, which "has come up with a way to tie barcode, RFID and sensor technology together to collect data that might impact the shelf life of temperature sensitive products such as fresh produce or pharmaceuticals." He continues:

"It's a great example of the kind of innovation we're seeing in this space and a reason that RFID is posting impressive numbers, especially in manufacturing and logistics outside the four walls. It's also an example of how big business is seeing the potential in supply chain solutions."

Last fall, JC Penney revealed that it was implementing an "item-level RFID tagging program [in] 1100 stores across three product categories." That and other similar announcements by large retailers had the staff at Supply Chain Digest "wondering [if] recent moves [may] at last be the catalyst for RFID in the consumer goods to retail industry to finally take off." ["Do JCPenney, Macy's Announcements Mean RFID to Finally Really Takeoff in Retail?" 2 November 2011] The article continues:

"In the last couple of years, ... buoyed by a series of pilot programs and research studies, item-level tagging of apparel goods had seemed poised to breathe new life into RFID in consumer goods to retail. Walmart jumped back into the fray, with a 2010 announcement that it would rollout item-level RFID for jeans and underwear to 3300 US stores. ... Even earlier, specialty retailer American Apparel had announced it was rolling out an item-level RFID program across its store network, a program currently in progress and perhaps facilitated by the fact that all its merchandise is private label. ... Macy's ... announced a program it that would have RFID capabilities in place in all 850 of its Macy's and Bloomingdale's stores by the end of the third quarter of 2012 - an aggressive agenda indeed."

Another Supply Chain Digest article reported on the findings from a study that concluded RFID programs are about to gain momentum. ["Is the Tipping Point Really, Truly Here for Item Level RFID Tracking in Apparel Retail?" 2 February 2012] The article stated:

"A new report from Accenture for the VICS Item-Level RFID Initiative says we are on the cusp of another tipping point, this one for adoption of item-level tagging in the apparel to retail chain in significant numbers. The report, titled Item-level RFID: A Competitive Differentiator, says that like many other technologies before it that took a long time to incubate, 'RFID now appears set to catch fire.' That conclusion comes in part from a survey of 58 suppliers and 56 retailers in North America, conducted by Accenture on behalf VICS (the Voluntary Interindustry Commerce Solutions Association). That research found that for some processes, such as taking store inventory, RFID technology can now drive improvements several orders of magnitude better than current standard methods. For example, taking store inventory, once a project of days or weeks, 'can now be tallied with lightning-fast, near-perfect accuracy.'"

The big drawback to RFID has always been cost. According to the report, RFID costs will continue to decline, which is one reason it concludes that the future of RFID tagging is bright. The report concludes "that most major apparel and footwear retailers will adopt RFID technology in some part of their business within the next 3-5 years - that is, 'if recent momentum continues.'" The article reports that there are currently "aggressive item-level RFID efforts [at] American Apparel, JC Penney, Macy's, and - interestingly - Walmart itself, with rumors of several others, such as Gap stores, said to be ready to jump in, according to SCDigest reporting." The "interesting" remark associated with Wal-Mart refers to the company's earlier stillborn effort to implement an RFID system. The article continues:

"A number of studies have indicated the payback for soft goods retailers from item-level RFID can be very strong. New research from the University of Arkansas has shown the ROI for apparel vendors can also be strong. ... The report also references an unknown company that is said to have shrank its cash-to-cash cycle time by 35%, lifted revenues 10% and generated 5-7% better profit margins as a result of instituting item-level RFID, according to research from Gartner."

An example of a company that has benefited from its RFID system is Warmkraft, a Mississippi company that applies finishes to military uniforms. ["Warmkraft Boosts Order-Fulfillment Efficiency With RFID," by Claire Swedberg, RFID Journal, 10 November 2011] According to Swedberg, Warmkraft "has reduced its manual shipping labor costs by 50 percent, while lowering its shipping-error rate to 0.2 percent from 5 percent, by RFID-tagging its products at the item level after those goods are treated, and by then reading the tags prior to shipment." She continues:

"Warmkraft began tagging its products destined for the U.S. Air Force (USAF) in November 2009, as required by the military agency, and has now expanded the system to its two Mississippi facilities at which uniforms are treated prior to being shipped to the Air Force—as well as, now, the U.S. Marine Corps. The USAF's item-level RFID deployment was intended to help the U.S. Defense Logistics Agency track the receipt and issuance of uniforms at Lackland Air Force Base's recruit training center. ... The U.S. military receives uniforms from Warmkraft after the finishing company provides chemical treatments that include insect repellant, waterproofing and wrinkle proofing. Warmkraft receives uniforms from garment manufacturers, applies the appropriate chemical treatments, and ships the finished clothing to recruiting centers, where they can then be issued to soldiers. Prior to installing RFID, the firm depended on manual inspections to verify which uniforms were packed in which boxes, and to ensure that no mistakes were made before those cartons were shipped to fill military orders. In addition, regular audits were conducted on boxes just prior to their loading onto trucks, and if any errors were discovered in any of the audited cartons, the entire order would need to be manually inspected for additional errors before it could be transported. Ensuring that the proper items could be accounted for is especially important, [Ron Lack, Warmkraft's general manager] says, because the garments belonged to the supplier, and not to Warmkraft. 'We process the goods,' he states. 'We don't own them. If they send us 1,000 garments, we have to account for 1,000 garments.'"

It is exactly that kind of requirement for inventory control that has placed the apparel industry the furthest down the RFID road. Lack told Swedberg that "the results of the technology's deployment have been dramatic." Nevertheless, he indicated that "if the military were not mandating the use of RFID ... the company would probably not continue its item-level tagging, given the current high tag cost." That's surprising considering the company claims to have reduced both manual shipping labor costs and its shipping error rate. But Lack indicates that current tag prices couldn't be covered by normal profit margins if the company was using them just for its own benefit. The company currently orders about 1.5 million tags a year at a cost of approximately 15 cents per tag. Lack told Swedberg, "At the volumes we're doing, it's tens of thousands of dollars a year." Because the company is a service provider rather than a manufacturer, its profit margin is low. I suspect that if the cost of tags could be significantly reduced, Lack would not hesitate to use them regardless of whether they were required or not.

The February 2012 Supply Chain Digest article cited above claims "that tag prices are much lower than many companies perceive, having dropped rapidly in recent years." But as Lack pointed out, how pricey tags are perceived to be has a lot to do with profit margins. The article reports, "In many cases, depending on type and quantity, tags [can] be procured for as little as 10 cents each, and 20-cent tags are commonplace. These falling costs continue to open up tagging to move products." Warmkraft's tags are priced right in the middle of that range, but they are still too expensive for its normal business. The following figure demonstrates some of the cost factors.

[Figure omitted: RFID tag cost factors. Source: Accenture/VICS]

To encourage companies to implement RFID processes, the "nonprofit standards organization GS1 US is offering a program ... intended to guide retailers and suppliers through the process of adopting Electronic Product Code (EPC)-based radio frequency identification technology." ["GS1 US Offers 'EPC Item-Level Readiness Program' Aimed at Retailers, Suppliers," by Claire Swedberg, RFID Journal, 31 January 2012] Swedberg reports:

"The GS1 US EPC Item-Level Readiness Program will consist of webinars, community discussion groups and Web-based user tools. This program, the organization reports, is intended to help drive the adoption of item-level RFID EPC tags by suppliers and retailers, as the market for the tagging of apparel and other consumer products grows—prompted, in large part, by retailer adoption and subsequent requests to suppliers to apply tags to their products. The Web-based educational programs and teleconference sessions are designed to help businesses develop their EPC RFID deployment plans, learn what their costs may be and follow a timeline for adoption. The program also provides guidance regarding usage of the EPC symbol. ... A product supplier can purchase a one-year membership into the program for $2,500. The annual cost for a retailer is $15,000, and includes site-specific support—including, in some cases, on-site assistance with a pilot or deployment. Suppliers can also purchase specific support for an additional charge."

Ben Coxworth writes, "Radio frequency identification (RFID) tags are definitely a handy way of tracking shipments. Instead of simply crossing their fingers and hoping for the best, importers and exporters can check the location and condition of shipped items in real time, by remotely accessing the data being transmitted by RFID tags attached to those items." That much, we all know. He points out, however, that "many such tags don't work on metal objects such as shipping containers or oil drums, as the metal interferes with the functioning of the tags' antennas." Fortunately, he reports, "A new tag developed at North Dakota State University gets around that limitation, ... it uses the metal object as its antenna." ["Antenna-less RFID tags designed to work where others don't – on metal objects," Gizmag, 6 February 2012] Very clever. Coxworth continues:

"Typically, when an RFID tag is to be attached to metal cargo, the antenna is placed on a spacer to keep its electromagnetic field from being affected by the metal. This results in the tags having a total thickness between 0.5 and 3 centimeters (0.2 to 1.18 inches), depending on the type of tag being used. In a rough-and-tumble shipping environment, such protruding tags can be damaged or ripped off. The North Dakota tags, however, are less than 3 millimeters thick, and are applied directly to the metal - they could even be recessed into it. This thinness is due partly to the fact that they have no antenna of their own, but also because of a unique material used in their construction. This material is highly electrically permeable, allowing the tags' integrated circuits to receive current from the metal upon which they're mounted. The university is currently looking for corporate partners interested in licensing the technology."

Advances, like those being developed at North Dakota State and elsewhere, hold great promise for making RFID tags more useful as well as less expensive. The combination of those two factors does make the future of RFID technologies look brighter.

February 24, 2012

Technology Trends in 2012

Eval-Source, a "consulting firm that provides enterprise software selection and strategic technology consulting services for organizations," compiled a list of 12 technology trends and predictions that should play out this year. ["Enterprise software technology predictions and trends – 2012," Eval-Source Blog, 6 February 2012] Such lists are always interesting points of departure for discussion. I decided the Eval-Source list was worth adding to other lists I've previously discussed [More Supply Chain Predictions for 2012, Part 1, More Supply Chain Predictions for 2012, Part 2, and Technologies that Could Change Your Life]. Eval-Source introduces its list by noting: "These predictions and trends are based on what we see with our customers, what are their concerns and what organizations will need to address as these predictions have started to permeate organizations already." The first trend is good news for some vendors:

  • "ERP SaaS spending will increase – pointed solutions were bought because of the recession. Companies are starting to realize the implications of doing business globally. They also now understand that their existing system(s) must accommodate global factors, ease collaboration issues, reduce administration costs, simplify usage and force greater adoption within the company, manage social aspects of their business, manage content for dissemination and overall save money in doing so. Since companies have purchased point solutions they are now looking to unify all the systems. Organizations will try to unify all their applications in one place using the cloud. Also there are many new SaaS ERP applications with specific functionalities and industry specific solutions. Software evaluation will become more difficult as hybrid models and SaaS differentiation pricing models will complicate decisions."

There are a number of reasons that companies will continue to move some (if not all) IT services to the cloud. I think Eval-Source hit most of them. I believe that simplifying usage and saving money are the biggest drivers. The company's next trend follows from the first. If companies are going to spend more money on cloud computing, it makes sense that more providers are likely to emerge.

  • "Rise of cloud providers – cloud vendors will become cloud brokers by providing application, hosting, infrastructure and platform all from one source. Many complementary vendors will partner with other providers and application vendors to provide an end-to end customer solution. Large software vendors such as SAP/LAWSON/ORACLE etc. will team up with IBM, TATA, CGI, ACCENTURE, SATYAM etc. The service based providers of business process outsourcing (BPO) not only resell but have started to realize if they team up with vendor to split the implementation management IT failures are becoming reduced. Many other tier two software resellers are becoming cloud brokers as they have started to build their practices by looking for Cloud Architects."

Although Eval-Source talks about teaming between large software vendors, the biggest trend seems to be large vendors acquiring cloud-based companies to increase their bona fides. That's a trend discussed below, but not the next trend listed. Eval-Source lists another fallout resulting from the expanding number of SaaS vendors -- solution sprawl.

  • "Solution sprawl will complicate software evaluation. Once specialized vendors have started to include additional and complimentary functionality within their applications. As these apps become larger customers will have more choice of vendors which will drive monthly subscription prices down. Vendors will start to diversify their application portfolio to include much more functionality (i.e., SFDC picking up HR vendor leads to more competitive pricing for consumers). A possible problem that consumers will face with this approach is that organizations will have to pay closer attention to software evaluation as the additional features and functions will complicate an already difficult process and may lead to increased IT failure if the wrong solution is selected due to the extras that may not cover the original business objectives."

Although solution sprawl could obviously complicate things, it also allows companies to tailor their cloud-based services so that they can increase their return on investment. Eval-Source next turns to the merger and acquisition trend I mentioned earlier.

  • "Start-up buying frenzy. Larger vendors will purchase start-ups with actual sales. They will be purchased by larger companies to grow their portfolio. It is often easier for companies to buy an existing vendor with marketshare or customers to increase application portfolio sizes. This will cause applications to become more diverse and larger than before. This has become a trend in software development as in-house development has taken a back seat to just buying the vendor straight out. The solution sprawl this causes will also complicate software evaluation for organizations. Large industry players will start to acquire smaller and complementary solutions to grow their portfolio – similar to the Google/Hubspot acquisition."

Although Eval-Source claims that these mergers and acquisitions will add to solution sprawl, I'm not so sure. If the past is prologue, I suspect that the acquisition trend will eventually cool and only the best solutions will survive. In the end, a smaller, but more focused group of solutions may emerge. Because large vendors are paying a premium for the companies they are acquiring, they are going to do everything possible to ensure that their investments pay off. The next identified trend to watch is IT integration.

  • "Integration between systems [is] becoming easier and less time consuming. Once integration was a big part of the software selection and implementation process, this is no longer the case. Open API's, SDK's, EXCEL, CSV files, import tools provided by vendors have made implementation simpler. This also includes additional solutions that can be integrated with other point solutions are now significantly easier than before."

Eval-Source points out that a number of vendors now offer integration services. My company, for example, is agnostic about the systems with which our solutions must integrate. New technologies allow us and other vendors to integrate both structured and unstructured data. The next trend is an interesting one. Eval-Source calls it the consumerization of IT and discusses how it will affect employee hiring and retention.

  • "Consumerization of IT in enterprises will differentiate quality employees from one company to another. The younger workers in the workforce are demanding enterprises to adopt the technologies they grew up with. Whether it is gamification, bring your own devices (BYOD) to work and other technology perks will be determining factors for employees to select the companies they want to work for. Options such as work at home and other collaboration options will influence employees as to which companies they will work for. Organizations will have to become more creative as to how to recruit new employees and be able to keep them."

The next trend identified by Eval-Source has direct ties to the consumerization of IT. If personal devices are going to be used in the workplace, how does a company deal with security? Most companies won't be able to act like the CIA and tell folks to "check them at the door."

  • "Mobile device management and security have to be addressed. Organizations that support multiple device and OS' will have to become more diligent as to how security is handled from an enterprise point of view. Mobile device management will become disruptive to large organizations with large workforces to administer, control, inventory, upgrade etc. Also how is additional security managed for the additional devices and rogue devices that are added to the network without IT permission?"

Another fallout of the consumerization of IT is the dawn of the Big Data era.

  • "Data explosion and content management will disrupt IT organizations and architectural organizational strategies. The explosion of social media, the infrastructure that is required to support the social aspect and data created by newly created collaboration, both internally and externally will become an issue to manage. Organizations will have to look at new content management systems to unify the many disparate silos throughout the organization to unify data for use. This will include new applications for content management, whether it is a SaaS or on-premise model and how does this impact your internal application strategy? Mobile devices out in the field will also cause further content management issues for organizations."

I agree that Big Data is going to be a big deal. When Eval-Source talks about "content management," it is not just referring to the gathering and storing of data but also to analyzing that data to support business operations. If the data can't be turned into actionable knowledge, why bother to collect and store it? One growing source of Big Data is social media, but the next trend that Eval-Source looks at is not the collection of data from social media; it is controlling how employees use it.

  • "Does your organization have social media policies in place as this is becoming a very hot-button legal issue. Organizations that have a social media strategy should have a code of conduct for usage. Many lawsuits have emerged on the issue of who owns the followers of a social media account the company or the individual acting on the company's behalf. Codes of conduct, disciplines, dismissal behaviour should all be defined and users of these company accounts should be made aware of the implications of using social media on a company’s behalf."

The next trend discussed by Eval-Source is a topic that gets a lot of attention in the supply chain sector -- collaboration.

  • "The rise of internal collaboration tools. – Not using email. A few large companies in Europe have said they are not going to use email. This is a great way of getting your article read which is just not the case. What they said is that email usage will decline due to more internal collaboration. Email still has many business uses especially for privacy within companies and external privacy issues. It has been my experience even with larger companies internal collaboration is still very difficult among team members and even worse across various departments throughout the organization. Collaboration vendors will have to focus on the message of how companies will be using internal collaboration tools to expedite requests and reduce traffic and duplicate content. Many tools of this nature already exist such as Chatter, IM, Lotus Notes collaboration. Organizations will deploy these types of systems to leverage data, answers and form a central repository."

To learn more about the challenges of collaboration, read my post entitled Dynamic Collaboration: Inside and Out. The next trend that Eval-Source asserts is worth watching is the rise of mobile applications.

  • "The rise mobile applications. Tablets and smartphones are playing an important part of the enterprise. Applications will become more distributed and organizations will have to pay attention to mobile device management, further application security measures and enterprise applications in general. These applications will become harder to monitor, controlling content and storage of such data will lead to an explosion of metadata within the enterprise."

People are falling in love with mobile applications and the devices on which they are found. I agree with the folks at Eval-Source that workers will prefer working with these devices and applications. Companies will adopt them because they will require less training, offer good service, and confront little opposition from the workforce. The final trend identified by Eval-Source is going beyond software-as-a-service (SaaS) into platform and infrastructure as well.

  • "Organizations will increase adoption of infrastructure and platform (IaaS, PaaS) for their foray into cloud. These services will open new doors to companies to get into cloud and easily build SOA infrastructures without large outlays of cash. The new platforms will allow for greater business agility for companies to adapt quickly to changing market conditions. These will also provide the basis for organizations to foray into cloud further by adding additional SaaS and other services to the expandable cloud platform."

I think Eval-Source is correct. Many companies are going to be eager to shift system administration and infrastructure upgrades onto someone else's shoulders and budget. Overall, I think that the Eval-Source list is pretty good. One missing trend was an increased use of artificial intelligence in both cloud-based applications and mobile devices. Apple's Siri continues to make headlines and most analysts seem to think that AI applications are only going to grow.

February 23, 2012

Assessing the Time Element of Supply Chain Risk Management

Last year, the Editorial Staff at Supply Chain Digest wrote, "Risk management was ... little talked [about] a decade or so ago, but now it is near the top of every supply chain executive's priority list." ["Adding a Velocity Dimension to Risk Management," 26 May 2011] According to the staff at Logistics Manager, things haven't changed. It reports, "Retailers and manufacturers are increasingly focusing on risk in their supply chains, according to KPMG's latest global CFO Consumer Markets survey." ["Supply chain risk moves up the agenda," 13 February 2012] The article continues:

"The current economic uncertainty is cited by nearly half of respondents (44 per cent) as the biggest risk that these global consumer companies face. This is followed by political instability (27 per cent) where recent clashes have been seen in China and the Far East over large scale job cuts. Consumer companies also cite supply chain issues as presenting one of their greatest operating risks in emerging markets. Recent examples include the explosion of the Fukushima nuclear plant in Japan, the political upheaval in Asia, floods in Thailand and earthquake in New Zealand. The report 'Turning global risk into an opportunity' surveyed 350 senior finance executives in the retail, food, drink and consumer goods manufacturing industry. It found that companies are showing greater appetite for risk reviews, as finance leaders and their boards meet more frequently to discuss the key risks."

I found it interesting that the KPMG survey interviewed Chief Financial Officers about risk management, but I'm not surprised. Henry Ristuccia, a partner with Deloitte & Touche, told Russ Banham, "While more companies are now appointing chief risk officers, many don't have that position, and therefore responsibility for risk management ends up with the board and the CFO." ["Disaster Averted?" CFO Magazine, 1 April 2011] To learn why some analysts believe that defaulting to the CFO is a bad idea, read my post entitled Supply Chain Risks: Who's in Charge and What are They Looking At? Regardless of who is paying more attention to risk management, Gerry Penfold, risk consulting partner at KPMG, concludes, "More regular communication with senior colleagues on risk issues will lead to greater resilience at a time of increased uncertainty and more tangible assurance for the board."

It is little wonder that risk management is climbing the priority list. As I've pointed out in several past posts, events that could potentially disrupt supply chains are on the rise. The following graph, taken from an article by Stephan Wagner and Nikrouz Neshat, provides one such analysis. ["Assessing the Vulnerability of Supply Chains Using Graph Theory," International Journal of Production Economics, Vol. 126, No. 1, July 2010, pp. 121-129]

[Figure omitted: historical trend in the number of natural and man-made disasters. Source: Wagner and Neshat]

Daniel Dumke, commenting on the Wagner and Neshat paper, notes that "several factors help increase the vulnerabilities of today’s supply chains. When supply chain complexity increases (e.g., supply chain length, higher division of labor, …), the vulnerabilities also rise. Furthermore there is evidence that natural and man-made disasters are on the rise as well." ["Assessing Vulnerability of a Supply Chain," Supply Chain Risk Management, 10 October 2011] Rather than throw one's hands up in despair over rising risks, Dumke reports that Wagner and Neshat recommend that companies hire mathematicians to help them "calculate a Supply Chain Vulnerability Index (SCVI)." The SCVI involves "a four step algorithm based on graph theory." Dumke concludes:

"I really like the graph approach to assessing supply chain vulnerabilities. And I think it is a great method to support the understanding of a complex system like the supply chain. The article combines two very interesting aspects of it: the practical implementation and the assessment of supply chain vulnerability and a survey to compare different vulnerability levels across industries. ... From a business and research point of view this article should direct the supply chain risk management efforts especially in the industries with the highest risk levels, Automotive and ICT."

One aspect of risk management that didn't seem to be addressed in the SCVI process is the element of time. The Supply Chain Digest article notes, "Many companies have or still use a simple 2 x 2 framework for assessing and managing risk, where on one dimension is the likelihood of occurrence (high or low) and the other the level of impact (high or low)." A more sophisticated framework, the article states, is one that uses "a higher level of granularity, recognizing that there are more intervals of supply chain impact or likelihood than just High or Low." The article provided the following framework as an example.

[Figure omitted: multilevel risk assessment framework. Source: Supply Chain Digest]

The article went on to report that at an "ISM conference, supply management veteran Robert Kemp made an interesting point during one session: that another dimension beyond likelihood and impact needs to be added, and that is the likely velocity of the event or risk." That's a great point. Even an imminent event might unfold slowly. For example, a hurricane making its way across the Atlantic unfolds much more slowly than a tornado spawned suddenly by a super-cell. The article muses, "Though you could argue that there is some overlap between velocity and impact, we think there are also differences, and that it makes a lot of sense to add velocity to these frameworks. We are struggling, however, to figure out how to represent that in a now three-dimensional image." If you have any good ideas, they'd like to hear from you.
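For what it's worth, one way around the three-dimensional-image problem is to stop treating the framework as a picture and treat it as a record with three scored dimensions, collapsing to a single priority number only when a ranking is needed. The scales and the weighting below are arbitrary choices of mine for illustration, not anything proposed by Kemp or Supply Chain Digest.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    likelihood: int  # 1 (rare) to 5 (near certain)
    impact: int      # 1 (minor) to 5 (catastrophic)
    velocity: int    # 1 (slow-onset) to 5 (near-instant)

    def priority(self) -> float:
        # Illustrative weighting: velocity amplifies impact, because a
        # fast-unfolding event leaves less time to design a response.
        return self.likelihood * self.impact * (1 + 0.25 * (self.velocity - 1))

risks = [
    Risk("hurricane closes port", likelihood=3, impact=4, velocity=2),
    Risk("tornado hits sole distribution center", likelihood=1, impact=5, velocity=5),
]
for risk in sorted(risks, key=Risk.priority, reverse=True):
    print(f"{risk.name}: priority {risk.priority():.1f}")
```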

Daniel Dumke discusses a chapter from the book Managing Supply Chain Risk and Vulnerability: Tools and Methods for Supply Chain Decision Makers, edited by Teresa Wu and Jennifer Vincent Blackhurst, that treats the time dimension of risk in a slightly different way. ["Let me help you with... Time-Based Risk Management," Supply Chain Risk Management, 31 August 2011] The chapter was written by Professors ManMohan S. Sodhi and Christopher S. Tang. Dumke writes:

"The time-based risk management approach aims to travel on this thin line [between the cost of reducing risks and maximizing profits] and delivers a strategy to mitigate disruption risk without compromising profits. It consists of three time frames which should be [the] focus of the risk manager:

  • time to detect a disruption (D1),
  • time to design a solution (D2),
  • time to deploy (D3),
  • time to response is set to the sum of D1 to D3 (R1), and
  • time to recover (R2)

"The authors argue, that prior work has pretty much focused on the generation and selection of recovery plans, but only after the event has occurred. ... Time-based risk management now tries to reduce the time needed for the other elements, since 'just as 80% of the total cost of a product is determined during the product design phase, the activities that take place for designing response can have significant effect on the overall impact of a disruption.' ... Furthermore, based on three case studies ... the authors argue that a longer response time can also lead to hugely increased recovery times. 'This is mainly because the magnitude of the problem triggered by the event escalated exponential[ly] over time.'"

The thing I like about this approach is that it forces risk managers to focus on situations in which the time between detection and response is near zero (i.e., events with very high velocity). Dealing with events that unfold more gradually is a much easier task if a company can master responses to high velocity events. Dumke continues:

"The authors name five time-based disruption management strategies to reduce the response time (R1) and therefore also the recovery time (R2).

  1. Work with suppliers and customers to map risks
  2. Define roles and responsibilities
  3. Develop monitoring/advance warning systems for detection
  4. Design recovery plans
  5. Develop scenario plans and conduct stress tests"

Dumke concludes:

"Sodhi and Tang use very illustrative cases to make their points for a time-based risk management approach. This approach represents a corporate strategy which of course has to include a very broad definition of supply chain management. So not only the logistics and manufacturing parts are included, but also product design, finance, [and so forth].

I suspect that the time element is often overlooked in corporate risk management discussions. Carol McIntosh writes, "Recent natural disasters are forcing supply chain executives to take a hard look at their supplier relationships. There has always been a risk of natural disasters but the frequency is increasing with more substantial consequences. The global nature of the supply chain creates more risk as the consumer is much more likely to buy elsewhere. Companies can't rely on customer loyalty." ["What's the cost of addressing supply chain risk?" The 21st Century Supply Chain, 26 July 2011] She continues, "Supply chain risk analysis involves a number of criteria. Cost is one, but also the characteristics of the material. Is it custom? Are [there] capacity limitations with your supplier? Are there currency risks? What is the lead-time?" I think the folks at Supply Chain Digest are right to add: How fast can the risk unfold? McIntosh cites an article that concludes, "Companies that manage supply chain risks effectively will outperform those that ignore or are blindsided by them." One of those blindsides could be time.

February 22, 2012

Big Data: Hope and Hype, Part 2

Part 1 of this two-part series involved a discussion about a McKinsey study that concluded that Big Data represents the next frontier. A portion of that discussion included concerns about Big Data analysis raised by Daniel W. Rasmus, who isn't quite as sanguine about the future of Big Data as the analysts at McKinsey & Company. ["Why Big Data Won’t Make You Smart, Rich, Or Pretty," Fast Company, 27 January 2012] The discussion ended with two of Rasmus' nine "existential threats to the success of Big Data and its applications." In this post, I'll discuss the remaining threats on his list. Rasmus' next threat involves complexity. He writes:

"Combining models full of nuance and obscurity increases complexity. Organizations that plan complex uses of Big Data and the algorithms that analyze the data need to think about continuity and succession planning in order to maintain the accuracy and relevance of their models over time, and they need to be very cautious about the time it will take to integrate, and the value of results achieved, from data and models that border on the cryptic."

Combining models is not the only complexity involved in Big Data. Most observers agree that there are three "Vs" associated with Big Data: volume (terabytes to petabytes and beyond); velocity (including real-time, sub-second delivery); and variety (encompassing structured, unstructured and semi-structured formats). To those three, some observers add a fourth "V": volatility (which involves the ever-changing sources of data, e.g., new apps, web services, social networks, etc.). Rasmus' next concern involves how disparate data sets get linked. He writes:

"Big Data isn’t just about the size of well-understood data sets, it is about linking disparate data sets and then creating connective tissue, either through design or inference, between these data sets."

I couldn't agree more. At the heart of Enterra's approach is an artificial intelligence (AI) knowledge base that includes an ontology and extended business rules capable of advanced inference. An ontology interrelates concepts and facts through many-to-many relationships, a structure far better suited to artificial intelligence applications than standard relational databases. It creates the "connective tissue" discussed by Rasmus. His next concern is about the algorithms that drive Big Data applications. He writes:

"It is not only algorithms that can go wrong when a theory proves incorrect or the assumptions underlying the algorithm change. There are places where no theory exists at any level of consensus to be meaningful. The impact of education (and the effectiveness of various approaches), how innovation works, or what triggers a fad are examples of behaviors for which little valid theory exists--it's not that plenty of opinion about various approaches or models is lacking, but that a theory, in the scientific sense, is nonexistent. For Big Data that means a number of things, first and foremost, that if you don't have a working theory, you probably don't know what data you need to test any hypotheses you may posit. It also means that data scientists can't create a model because no reliable underlying logic exists that can be encoded into a model."

I agree with Rasmus that a business shouldn't consider a Big Data solution for any process it doesn't fundamentally understand. No one should know a business better than those who own and operate it. A solutions provider needs to work closely with a company to ensure that the model and algorithms it provides are right and that the data being gathered and analyzed are correct. Rasmus' next concern involves confirmation bias. He writes:

"Every model is based on historical assumptions and perceptual biases. Regardless of the sophistication of the science, we often create models that help us see what we want to see, using data selected as a good indicator of such a perception. ... Even when a model exists that is designed to aid in decision making about the future, that model may involve contentious disagreements about its validity and alternative approaches that yield very different results. These are important debates in the world of Big Data. One group of modelers advocates for one approach, and another group, an alternative approach, both using sophisticated data and black boxes (as far as the uninitiated business person is concerned) to support their cases. The fact is that in cases like this, no one knows the answer definitively as the application may be contextual or it may be incomplete (e.g., a new approach may solve the issue that none of the current approaches solves completely). What can be said, and what must be remembered is, the adage that 'a futurist is never wrong today.'"

Clearly Big Data has some value when it comes to forecasting; but, Rasmus' concerns are nonetheless valid. Eliminating (or, at least, reducing) confirmation bias in such systems is an important consideration to keep in mind. Rasmus' next concern involves the fact that the world changes (i.e., that it is not a good idea to steer a ship by looking astern). He writes:

"We must remember that all data is historical. There is no data from or about the future. Future context changes cannot be built into a model because they cannot be anticipated. Consider this: 2012 is the 50th anniversary of the 1962 Seattle World’s Fair. In 1962, the retail world was dominated by Sears, Montgomery Ward, Woolworth, A&P, and Kresge. Some of those companies no longer exist, and others have merged to the point that they are unrecognizable from their 1962 incarnations. ... Would models of retail supply chains built in 1962 be able to anticipate the overwhelming disruption that [Wal-Mart's] humble storefront would cause for retail? Did Sam Walton understand the impact of Amazon.com when it went live in 1995? The answer to all of the above is 'no.' These innovations are rare and hugely disruptive."

Rasmus is arguing that organizations must be flexible and that models they use must have feedback loops if they are to maintain "relevance through incremental improvement." He then reminds us that occasionally "the world changes so much that current assumptions become irrelevant and the clock must be started again. Not only must we remember that all data is historical, but we must also remember that at some point historical data becomes irrelevant when the context changes." Rasmus' next concern involves motives. He writes:

"Given the complexity of the data and associated models, along with various intended of unintended biases, organizations have to go out of their way to discern the motives of those developing analytics models, lest they allow programs to manipulate data in a way that may precipitate negative social, legal, or fiduciary outcomes."

We all know that there are numerous privacy concerns associated with the collection and analysis of Big Data. I suspect that privacy concerns are more likely to spur outrage in the general populace than anything else on Rasmus' list. Along with data breaches, they are also the issues most likely to land a company in trouble. Rasmus' final concern involves the actions taken as a result of Big Data analysis. He writes:

"Consider crime analysis. George Mohler of Santa Clara University in California has applied equations that predict earthquake aftershocks to crime. By using location and data and times of recent crimes, the system predicts 'aftercrimes.' This kind of anticipatory data may result in bastions of police flooding a neighborhood following one burglary. With no police presence, the anticipated crimes may well take place. If the burglars, however, see an increase in surveillance and police activity, they may abandon planned targets and seek new ones, thus invalidating the models' predictions, potentially in terms of time and location. The proponents of Big Data need to ensure that the users of their models understand the intricacies of trend analysis, what a trend really is, and the implications of acting on a model’s recommendations."

All of these concerns might lead you to believe that Rasmus is anti-Big Data. He's not. He admits that "some of the emerging Big Data stories don't test the existential limits of technology, nor do they threaten global catastrophe." In other words, there are applications for Big Data that are useful. He writes:

"Big Data will no doubt be used to target advertising, reduce fraud, fight crime, find tax evaders, collect child support payments, create better health outcomes, and myriad other activities from the mundane to the ridiculous. And along the way, the software companies and those who invested in Big Data will share their stories."

Rasmus is interested in how Big Data can improve the quality of life, not just a company's bottom line. He provides a few examples:

"Companies like monumental constructor Arup use Big Data as a way to better model the use of the buildings they build. The Arup software arm, Oasys, recently acquired MassMotion to help them understand the flow of people through buildings. ... The result is a model, sometimes with thousands of avatars, pushing and shoving, congregating and separating--all based on MassMotion’s Erin Morrow and how he perceives the world. Another movement oriented application of Big Data, Jyotish (Sanskrit for astrology), comes from Boeing’s research center at the University of Illinois in Urbana-Champaign. This application predicts the movement of work crews within Boeing’s factories. It will ultimately help them figure out how to save costs and increase satisfaction by ensuring that services, like Wi-Fi, are available where and when they are needed. Palantir, the Palo Alto-based startup focused on solving the intelligence problem of 9/11, discovers correlations in the data that informs military and intelligence agencies who, what, and when a potential threat turns into an imminent threat. ... For some fields, like biology, placing large data sets into open source areas may bring a kind of convergence as collaboration ensues. But as Michael Nielsen points out in Reinventing Discovery, scientists have very little motivation to collaborate given the nature of publication, reputation, and tenure."

Rasmus concludes, "I seriously doubt that we have the intellectual infrastructure to support the collaborative capabilities of the Internet. We may well be able to connect all sorts of data and run all kinds of analyses, but in the end, we may not be equipped to apply the technology in a meaningful and safe way at scales that outstrip our ability to represent, understand, and validate the models and their data." At this point in time, Rasmus is probably correct. Who knows what the world of computing will look like a half-century or century from now. If organizations simply used data to improve business processes, increase marketing opportunities, or better position inventory, Rasmus might have a cheerier view of Big Data. He seems to believe, however, that much more sinister things are afoot. He ends his article this way:

"The future of Big Data lies not in the stories of anecdotal triumph that report sophisticated, but limited accomplishments--no, the future of Big Data rather lies in the darkness of context change, complexity, and overconfidence. I will end, as [Chicago professor Richard H. Thaler] did in his New York Times article ("The Overconfidence Problem in Forecasting"), by quoting Mark Twain: 'It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.'"

Just because Big Data can be (and likely is) abused doesn't mean that there are no benefits to be gained through its collection and analysis. Like any other area of business, ethics is important when dealing with Big Data. The story about Big Data is just beginning to be written. There are likely to be plot twists and turns; but, in the end, my biases tell me that the world will benefit from all this data in ways that are not yet apparent. But it's good to have gadflies like Rasmus reminding us of potential misuses.

February 21, 2012

Big Data: Hope and Hype, Part 1

According to McKinsey analysts, Big Data is "the next frontier for innovation, competition and productivity." ["Big data is 'the next frontier'," by Jessica Twentyman, Financial Times, 14 November 2011] Daniel W. Rasmus isn't quite so sanguine about the future of Big Data. "If 2012 is the year of Big Data," he writes, "it will likely be the year vendors and consultants start to over-promise, under-deliver, and put processes in motion that will generate insights and potential risks for years to come." ["Why Big Data Won’t Make You Smart, Rich, Or Pretty," Fast Company, 27 January 2012] As President and CEO of a company that analyzes Big Data, I believe that both points of view have merit. I know that sounds like a waffle; but, historically, the "next big thing" has always been over-hyped before proving itself to have lasting value.

First, let's examine why McKinsey analysts believe that Big Data is the next frontier. Twentyman reports:

"A recent report by the management consultancy argued that the successful companies of tomorrow, whether they are market leaders or feisty start-ups, will be those that are able to capture, analyse and draw meaningful insight from large stores of corporate and customer information. The implication is that businesses that cannot do so will struggle. For that reason, McKinsey argues, 'all companies need to take big data seriously'."

The key words in that paragraph are "meaningful insight." Mountains of data are useless unless actionable insights can be drawn from them. The challenge, of course, is that so much data is being generated that it is impossible to glean anything from it manually. That is why Twentyman reports that IT companies enthusiastically agree with the conclusions of the McKinsey report. The message (i.e., data analysis is important for businesses), she writes, "helps sell information management systems and software." She continues:

"[Big] data stands out in four ways, according to James Kobielus, analyst with Forrester Research: for its volume (from hundreds of terabytes to petabytes and beyond); its velocity (up to and including real-time, sub-second delivery); its variety (encompassing structured, unstructured and semi-structured formats); and its volatility (where scores to hundreds of new data sources come and go from new apps, web services, social networks and so on)."

For his part, Rasmus is uncomfortable with all these data sources. "The vast hordes of data [collected] during e-commerce transactions, from loyalty programs, employment records, supply chain and ERP systems are, or are about to get, cozy," he writes. "Uncomfortably cozy." He continues:

"Let me start by saying there is nothing inherently wrong with Big Data. Big Data is a thing, and like anything, it can be used for good or for evil. It can be used appropriately given known limitations, or stretched wantonly until its principles fray. ... The meaningful use of Big Data lies somewhere between these two extremes. For Big Data to move from anything more than an instantiation of databases running in logical or physical proximity, to data that can be meaningfully mined for insight, requires new skills, new perspectives, and new cautions."

He's afraid that new cautions are being ignored. As an example, he points to Dirk Helbing of the Swiss Federal Institute of Technology in Zurich, who is spending more than €1-billion on a project whose aim is "nothing less than [foretelling] the future." Rasmus writes that Helbing's project hopes to "anticipate the future by linking social, scientific, and economic data." If it succeeds, Rasmus writes, "This system could be used to help advise world governments on the most salient choices to make." He continues:

"Given the woes of Europe, spending €1-billion on such a project will likely prove to be wasted money. We, of course, don't have a mechanical futurist to evaluate that position, but we do have history. Whenever there is an existential problem facing the world, charlatans appear to dazzle the masses with feats of magic and wonder. I don't see this proposal being anything more than the latest version of apocalyptic sorcery."

In a post entitled Artificial Intelligence: The Quest for Machines that Think Like Humans, Part 1, I cited an article that discussed a DARPA-supported IBM project involving cognitive computing. The head of the project hopes to develop a cognitive computing system that can do things like monitor the world's oceans and "constantly record and report metrics such as temperature, pressure, wave height, acoustics and ocean tide, and issue tsunami warnings based on its decision making." That's approaching the grandiose level that concerns Rasmus. The head of the IBM project also admits, however, that the system could be used for much more modest activities, like monitoring the freshness of produce on a grocer's shelves. While I agree in principle with Rasmus that Big Data can be used for both good and ill, I believe the good far outweighs the bad.

In the blog cited above, I identified several technologists who believe we are a long way off from developing computers that think like humans. Rasmus counts himself among that number. Since Enterra Solutions uses a Cyc ontology, I found it interesting that Rasmus mentions Cyc. He writes:

"Cyc [is] a system conceived at the beginning of the computer era, [whose aim was] to combat Japan's Fifth Generation Project as it supposedly threatened to out-innovate America's nascent lead in computer technology. Although Cyc has yielded some use, it has not yet become the artificial human mind it was intended to be, able to converse naturally with anyone about the events, concepts, and objects in the world. And artificial intelligence, as imagined in the 1980s, has yet to transform the human condition."

I agree that Cyc has not resulted in computer systems that think like humans. I also agree that it has been used to create some very useful artificial intelligence systems that are more nuanced than some other applications. Cyc ontologies help add common sense into AI systems that are notorious for lacking it. Rasmus' bottom line: "As Big Data becomes the next great savior of business and humanity, we need to remain skeptical of its promises as well as its applications and aspirations."

As president of a company that analyzes Big Data, I agree with Rasmus that we shouldn't let the hype get ahead of the reality. Big Data allows us to dream big; but, those dreams must be anchored in a cold business reality that can provide a solid return on investment. One reason that analysts and IT companies are so enthusiastic is that the tools necessary to gain insights from Big Data are still relatively new; we are only beginning to understand what can be done with it. Rasmus, however, sees "a number of existential threats to the success of Big Data and its applications." The first threat is the flip side of hype -- overconfidence. Rasmus writes:

"Many managers creating a project plan, drawing up a budget, or managing a hedge fund trust their forecasts based on personal abilities and confidence in their knowledge and experience. As University of Chicago professor Richard H. Thaler recently pointed out in the New York Times ("The Overconfidence Problem in Forecasting"), most managers are overconfident and miscalibrated. In other words, they don't recognize their own inability to forecast the future, nor do they recognize the inherent volatility of markets. Both of these traits portend big problems for Big Data as humans code their assumptions about the world into algorithms: people don't understand their personal limitations, nor do they recognize if a model is good or not."

Rasmus' concern is valid to a point. One of the reasons that researchers are trying to develop artificial intelligence systems is to eliminate bias. By equipping systems with a few simple rules, researchers are letting systems "learn" on their own. Enterra's supply chain optimization solutions, for example, use a Sense, Think/Learn, Act™ system. We believe that machine learning is an extremely valuable tool. However, when decision makers are presented with information upon which they must act, AI systems can't completely eliminate decision maker bias or overconfidence. Rasmus' next concern is a graver one. Are Big Data solutions going to get so large that no one is going to be able to understand and challenge all of the assumptions used to generate their algorithms? It's a good question. Rasmus writes:

"Even in a field as seemingly physical and visceral as fossil hunting, Big Data is playing a role. Geologic data has been fed into a model that helps pinpoint good fossil-hunting fields. On the surface that appears a useful discovery, but if you dig a bit deeper, you find a lesson for would-be Big Data modelers. As technology and data sophistication increases, the underlying assumptions in the model must change. Current data, derived from the analysis of Landsat photos, can direct field workers toward a fairly large, but promising area with multiple types of rock exposures. Eventually the team hopes to increase their 15-meter resolution to 15-centimeter resolution by acquiring higher-resolution data. As they examine the new data, they will need to change their analysis approach to recognize features not previously available (for more see "Artificial intelligence joins the fossil hunt" in New Scientist). Learning will mean reinterpreting the model."

Anytime you change the parameters of a query you are likely to get different results. Rasmus' concern is that if you change enough parameters in a large system you might not really know some of the underlying assumptions the model is now making. Rasmus' next example underscores that point. He continues:

"On a more abstract level, recent work conducted by ETH Zurich looked at 43,000 transnational companies seeking to understand the relationships between those companies and their potential for influence. This analysis found that 1,318 companies were tightly connected, with an average of 20 connections, representing about 60 percent of global revenues. Deeper analysis revealed a 'super-entity' of 147 firms that accounts for about 40 percent of the wealth in the network. This type of analysis has been conducted before, but the Zurich team included indirect ownership, which changed the outcome significantly (for more see "The network of global control" by Bitali, Glattfelder, and Battiston). If organizations rely on Big Data to connect far-ranging databases--well beyond corporate ownership or maps of certain geologies--who, it must be asked, will understand enough of the model to challenge its underlying assumptions, and re-craft those assumptions when the world, and the data that reflects it, changes?"

That's a good question on which to end the first post of this two-part series. Companies can avoid the dilemma Rasmus has identified by setting more modest goals than forecasting the future with precision. At Enterra Solutions, we believe that Big Data applications are used best in management-by-exception situations (i.e., where decision makers have the final say, but are only involved when the system identifies a situation that is abnormal). Monitoring Big Data for abnormalities can be just as important as mining it for deep insights. Tomorrow I'll look at the remainder of Rasmus' existential threats to the success of Big Data and its applications.
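To make the management-by-exception idea concrete, here is a minimal sketch (a generic z-score check invented for illustration, not a description of Enterra's system): the system watches everything, but only observations that deviate sharply from history reach a decision maker.

```python
from statistics import mean, stdev

def exceptions(history, new_values, threshold=3.0):
    """Return only the new observations a decision maker needs to see."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in new_values if abs(x - mu) > threshold * sigma]

daily_demand = [102, 98, 101, 97, 103, 99, 100, 101]  # toy history
print(exceptions(daily_demand, [99.0, 104.0, 175.0]))  # -> [175.0]
```

Everything within three standard deviations of history is handled automatically; the one abnormal reading is escalated to a human who has the final say.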

February 20, 2012

Moving S&OP Forward: Better Tools or Better Questions?

"Aligning supply and demand is a core purpose of the supply chain function in an enterprise," asserts Bill McBeath, "but [it] is rarely done well." ["Demanding Times: Part Three - Aligning Supply and Demand," ChainLink Research, 4 October 2011] McBeath continues:

"Far too many companies still lack coordination between those in charge of generating and managing demand and those in charge of ensuring supply. When demand and supply are not coordinated, it leads to ludicrous situations such as the manufacturing plant building more inventory while at the same time marketing is reducing price to get rid of inventory."

McBeath's example of a "ludicrous situation" highlights the downside of failing to align corporate resources using an effective Sales and Operations Planning (S&OP) process. He writes:

"Sales and operations planning is meant to break down these walls between operations, sales, marketing and finance in order to align supply and demand, but typically suffers from inaccurate input forecasts and long (monthly) cycles resulting in stale plans that don't incorporate the latest information. Sales is not held accountable for forecast accuracy and is not incented to help align supply and demand."

I am all for anything that breaks down walls (or silos) within a company, but supply chain analyst Lora Cecere asserts that too many supply chain professionals want to blame the sales force for placing obstacles in the path of corporate alignment. She claims that supply chain professionals seem to like playing the role of S&OP victim. ["And, the Question is…?" Supply Chain Shaman, 3 February 2012] She explains:

"I feel that many supply chain professionals act like a victim when they talk about sales involvement in S&OP. They are a martyr for the operations cause. Their conversations are centered on these questions:

  • How do you get sales to give you a better forecast?
  • How do you get sales to come to the S&OP meetings?
  • How do you drive ownership of inventory with sales?

"Let me let you in on a little known secret. You don’t. These are the wrong questions. You will never be successful trying to leverage change from the supply chain team; especially if you have a victim mentality trying to make sales “responsible for inventory and forecasts.” You will retire <or get fired> before you MAKE sales do anything."

I don't believe that McBeath and Cecere are at fundamental odds on this point; they just offer different solutions for addressing the challenge. I like Cecere's approach as a starting point. She's of the basic belief, as am I, that good solutions start with good questions. Since she believes that supply chain professionals have been asking the wrong questions, not just bad ones, she is not surprised that good solutions that improve S&OP processes have not been forthcoming. She indicates that the first thing you need to do is get the sales department intimately involved in the S&OP process in order to get its attention. She writes:

"How? Start with the fact that S&OP on average drives a 2% increase in sales. <Now you have their attention. This is something that they care about.> Then make it worth their while …

  • How do you eliminate sales bias? Apply lean forecasting approaches to the forecasting process to eradicate sales bias and error. Make all people accountable by understanding the value of their contribution to the forecast.

  • What is the role of sales in the forecasting process? Don’t waste their time. Do not ask sales to forecast. Ask them for input on general trends and apply it to the forecast. Sales should never be asked to forecast at an item level.

  • How do you get sales to the S&OP meetings? Make it worth their while by making it important to their boss.

  • How do you make sales responsible for inventory? You don’t. You make the entire team accountable for inventory as part of the S&OP process.

"There are natural tensions between sales and operations. Use the tensions to improve the process. Never wear the cloak of the victim. <No one looks good in that color…>"

McBeath certainly agrees with Cecere when she writes that making it worth the sales team's time is the key to getting them involved. He believes that consensus planning is one of the best ways to ensure that the sales team cares about the process. He writes:

"Consensus planning tries to reconcile [different forecasting methods] intelligently to produce a more accurate forecast used across the firm to drive planning and execution. Elements of smart reconciliation include:

  • Integrate awareness of key demand and supply events, such as a large customer order or a promotion. Measure and incent participants on forecast accuracy. This is critical to change behavior and realize improvements. When there is no penalty for poor forecasts or reward for good ones, salespeople will always over-forecast to make sure they have more than enough to meet [demand], regardless of what combination of demand occurs. As a result, manufacturing never believes the forecasts and creates their own.

  • Leverage technologies that factor in the past forecast accuracy of each participant, adjusted each time period. Some of these can do sophisticated weighting in order to produce a more accurate forecast than any of the individual contributors.

  • Leverage technology that enables scenario building and analysis. Use tools that let people work in the way they are used to, usually with Excel, while integrated to a shared database for analysis and reconciliation.

"Build in checks and balances. The person scrubbing the forecast should be on the alert for things that don't make sense, such as a part being forecast before the actual first availability date, perhaps caused by production delays. These conflicts can be resolved; for example the account manager can then go back to the customer to discuss the delay, rather than living with a false expectation, wrong forecast, and ultimately a disappointed customer."

McBeath and Cecere also agree that good forecasting approaches that help eradicate sales bias and error are important. McBeath insists, "Execution based on stale data and out-of-date plans is pervasive. The best companies have short planning cycles and rapid decision making cultures." Good processes and consensus planning are combined into what McBeath calls "true one-number planning." The one thing that many S&OP experts would disagree with McBeath on is his recommendation to rely mainly on Excel to get that "one true number." Although Excel is an excellent spreadsheet program, it can't handle all the disparate data that needs to be integrated in order to come up with one version of the truth. For more on this topic, keep reading.

Karin Bursa, vice president of marketing with Logility, is also a strong proponent of good S&OP processes. She insists such processes help "companies to achieve greater visibility and make more intelligent decisions about how to respond to real market demand." ["The Evolving World of Sales & Operations Planning," SupplyChainBrain, 22 June 2011] The article continues:

"Why the sudden surge of interest in sales and operations planning (S&OP)? Bursa says it's because companies are viewing it as a potent tool for breaking down organizational silos, an age-old problem. They see the possibility at last of getting everyone to focus on a common plan, while harmonizing high-level corporate strategy with operational and tactical measures. As a concept, S&OP is far from new. But Bursa says it has evolved 'significantly' in recent years. Technology is a major reason. For the first time, companies can get access to clean data, giving them the ability to model various scenarios. Even more important, though, is the changing attitude of executive leadership. Top managers know that S&OP can help teams throughout the organization make 'the best decisions for the business.'"

Although Bursa claims that S&OP has come a long way, McBeath and Cecere point out that there remains a long way to go. Bursa notes that S&OP "initially came out of the manufacturing sector, but it isn’t just about optimizing a manufacturing plan." The article explains:

"It can guide companies toward achieving other key goals such as entering new markets, and opening or consolidating distribution centers. The evolution of S&OP tracks the desire of companies to fully understand demand and get closer to end customers. Sometimes, Bursa says, that might mean not producing at 100-percent capacity."

Bursa believes that it is critical that an empowered executive "be assigned to oversee the S&OP effort." She calls this executive a "process champion." The article continues:

"'It really is 60 percent process, 30 percent data and only 10 percent technology,' Bursa says. 'There has to be a process champion, with the goal to drive the best visibility for the business. The discipline from which that individual hails depends on the culture of the company,' she says. Bursa counters objections that S&OP is a laborious process. 'If you're hearing that,' she says, 'it's because it's disconnected.' A successful effort entails a streamlining of data gathering 'in one time-phased plan, to model multiple scenarios.' Such a goal, she says, 'cannot be done efficiently and effectively with spreadsheets.'"

As I noted above, Bursa's last comment is one that I see repeated often. Most analysts agree, when it comes to S&OP, the spreadsheet era is over (or, at least, it should be). John Westerveld, a demo architect for Kinaxis, agrees that a spreadsheet (like Excel) "is not a viable S&OP tool." ["What's the "right" S&OP tool?" The 21st Century Supply Chain, 2 February 2012] He has nothing against Excel. In fact, he writes, "[Excel] is a very powerful tool that works best with smaller data sets. Companies managing a complex supply chain soon find that they are stretching Excel's boundaries to the breaking point." That's the real point. So if a spreadsheet isn't the right tool, what is? Westerveld continues:

"Rather than call out a specific tool, let's look at some criteria that you need to consider when looking for an S&OP application:

  • I mentioned that Excel is not a good tool for S&OP because it doesn’t support 'what-if' analysis. This is obviously something you need to look for in any tool for long range planning. But simply allowing you to hive off a version of the plan and make a change is not enough. The 'what-if' analysis must provide detailed analysis as to what the impact of a given change will be throughout the supply chain. A key supplier of low level components was just hit by a flood and won't be shipping parts for six months – what impact will this have on my production? What if demand increases on one of my product lines? What impact does this have on a constrained resource?… on my contract manufacturers?… on my suppliers? Only a system with full supply chain analytics can address these questions."
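Westerveld's impact questions boil down to walking a dependency graph from the disrupted item to everything downstream of it. The sketch below is a toy bill-of-materials traversal with invented part and product names; real S&OP tools do constrained, time-phased versions of this analysis.

```python
# Toy what-if: which finished products are affected if a supplier stops
# shipping one low-level component? Names and structure are invented.
feeds_into = {
    "component_X": ["subassembly_A", "subassembly_B"],
    "subassembly_A": ["product_1"],
    "subassembly_B": ["product_2", "product_3"],
}

def downstream_impact(part):
    """Walk the bill-of-materials graph to every affected item."""
    hit, frontier = set(), [part]
    while frontier:
        item = frontier.pop()
        for child in feeds_into.get(item, []):
            if child not in hit:
                hit.add(child)
                frontier.append(child)
    return hit

print(sorted(downstream_impact("component_X")))
# -> ['product_1', 'product_2', 'product_3', 'subassembly_A', 'subassembly_B']
```

Attach lead times, capacities, and inventory positions to each node and you have the beginnings of the full supply chain analytics Westerveld describes.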

At Enterra Solutions, we call these 'what if' impacts the perturbative effects of a potential disruption. It's important to know what kind of an effect an upstream event can have on downstream sales. Westerveld continues:

  • "A related requirement is integration. If you are going to provide answers to the 'what-if' questions above, you will likely need to have detailed information from multiple manufacturing centers around the world. These centers in all likelihood are running a variety of disparate ERP systems, so your tool will need to integrate across all these systems and bring the data in to a single environment."

There is obviously a lot that could be written on this topic. Integration is at the very heart of Big Data analysis and the three "Vs" commonly associated with it: Volume, Variety, and Velocity. Westerveld's final important characteristic is monitoring. He writes:

  • "As mentioned earlier, the ability to monitor performance against the plan and report on exceptions is key. A S&OP plan is good. A S&OP plan that you monitor and react to when underperforming is truly powerful."

Westerveld concludes, "There are several other factors that can play into your selection of a system for S&OP, but I think these are key ones." McBeath ends his article the way he began it. He writes, "Aligning supply and demand is a core purpose of the supply chain function in an enterprise. It can make all the difference between profitably building and selling the very items your customers are actually desiring and buying vs. building the wrong items that aren't really needed, only to be sold at steep discounts." To achieve that purpose, a company must ask the right questions, field the right team, and deploy the right technologies.

February 17, 2012

Innovation from the Outside In

"The necessity for big corporations to seek innovative ideas from outside their own organisation is not new," writes Anthony Goodman. "IBM and Procter & Gamble were early pioneers in the last decade. Apple has made its externally driven App Store a key to its business success this decade." ["The benefits of giving prizes," Financial Times, 4 May 2011] Goodman continues:

"A.G. Lafley, the inspirational former CEO of P&G, was a noted advocate for what he called 'connect and develop' alongside traditional R&D. In a Harvard Business Review article in March 2006, two P&G executives described how they 'estimated that for every P&G researcher there were 200 scientists or engineers elsewhere in the world who were just as good – a total of perhaps 1.5m people whose talents we could potentially use.' Such thinking has influenced product development at pharmaceutical and technology companies and illustrates the benefit of tapping into extended networks beyond the walls of a single company."

The goal of "outside-in innovation" is to tap into the talents of others at a reasonable cost with good results. As Goodman puts it, "Simply being open to ideas from outside isn’t enough on its own." The specific strategy on which Goodman focuses is "the use of prizes [to] help lure the best thinking out into the open." He continues:

"As the Financial Times noted in December [2010]: 'inducement prizes have the potential to shine. Supplementing direct grants, patents and simple competitive pressures, they can refresh parts of the innovation system that other incentives cannot reach.' A multimillion dollar industry has grown up around the idea of prize-giving from the non-profit X Prize Foundation to private companies such as InnoCentive in the US and OmniCompete in the UK."

This is not the first time I've discussed the benefits of prizes to achieve specific goals (for example, see my posts entitled A New Approach to Innovation, More Prizes for Innovation, and Contests and Innovation). These posts date back several years. The fact that prizes for innovation continue to make headlines leads me to believe that they may be around for years to come. Vineet Nayar, CEO of HCL Technologies, which has revenues of $3.3bn, and author of Employees First, Customers Second, told Goodman: “Small experiments with ideas could give us rich returns. Crowd-sourcing ideas from the best young brains in the world who have not been ‘brainwashed’ with our current ways is one such experiment.” David Roth, a former Bain consultant, told Goodman that there were two primary benefits to contests:

"First, they promote the development of new ideas and suggestions. Second, they build communities of innovators and signal the broader importance of innovation in general – both in society and within the sponsoring organization. That’s really cool."

Peter Diamandis, the head of the X Prize Foundation, is convinced that "focused and talented teams in pursuit of a prize and acclaim can change the world." ["And the winner is…" The Economist, 5 August 2010] The article notes that prizes can be traced back to "the Longitude Prize [which] was set up by the British government in 1714 as a reward for reliable ways for mariners to determine longitude." The article continues:

"Incentive prizes do spur innovation. A study led by Liam Brunt of the Norwegian School of Economics scrutinised agricultural inventions in 19th-century Britain and found a link between prizes and subsequent patents. The Royal Agricultural Society awarded nearly 2,000 prizes from 1839 to 1939, some worth £1m ($1.6m) in today’s money. The study found that not only were prize-winners more likely to receive and renew patents, but that even losing contestants sought patents for more than 13,000 inventions."

In addition to the organizations named above, Kaggle, a firm started in 2010 by Australian economist Anthony Goldbloom, runs competitions for companies. "The customer supplies a data set, tells Kaggle the question it wants answered, and decides how much prize money it’s willing to put up. Kaggle shapes these inputs into a contest for the data-crunching hordes. To date, about 25,000 people—including thousands of PhDs—have flocked to Kaggle to compete in dozens of contests backed by Ford, Deloitte, Microsoft, and other companies." ["Kaggle's Contests: Crunching Numbers for Fame and Glory," by Ashlee Vance, Bloomberg BusinessWeek, 4 January 2012]. Vance reports:

"By far the most lucrative prize on Kaggle is a $3 million reward offered by Heritage Provider Network to the person who can most accurately forecast which patients will be admitted to a hospital within the next year by looking at their past insurance claims data. More than 1,000 people have downloaded the anonymized data that covers four years of hospital visits, and they have until April 2013 to post answers."

The dream of Jeremy Howard, Kaggle’s chief scientist, is to make its best participants wealthy if not famous. "These guys should be earning as much as hedge fund managers and golfers," he says. Another outside-in strategy, one that HCL Technologies’ Nayar alluded to in the comment quoted above, is crowdsourcing.

Rachel Emma Silverman explains, "Crowdsourced labor usually involves breaking a project into tiny component tasks and farming those tasks out to the general public by posting the requests on a website. Many firms that use crowdsourcing pay pennies per microtask to complete projects such as tagging or verifying data, digitizing handwritten forms and database entry." ["Big Firms Try Crowdsourcing," Wall Street Journal, 17 January 2012] Silverman continues:

"Companies that have assigned work to the crowd say it is generally cheaper and faster than hiring temps or traditional outsourcing firms. Crowdsourced labor can cost companies less than half as much as typical outsourcing, says Panagiotis G. Ipeirotis, an associate professor at New York University's Stern School of Business, who studies crowdsourcing. Some individual microtasks can take just a few seconds and pay a few cents per task. More complex writing or transcription tasks might pay $10 or $20 per job, while some highly skilled work, such as writing programming code, commands higher rates."

Silverman points out that crowdsourcing has caused some concerns. She notes that "workers may sign up for tasks unaware of what their labor may be used for." Research has revealed that some crowdsourced laborers have actually been used to help create spam. Silverman continues:

"Another concern is that crowdsourced labor risks creating what Harvard Law School professor Jonathan Zittrain calls "digital sweatshops," where workers who may be underage work long hours on mind-numbing tasks for very little money—or, if the work is structured as a game, for no money at all. Crowdsourcing sites often pitch their work to stay-at-home moms or students who can pick up a few tasks to do during short breaks."

Regardless of the strategy companies use, John Yuva, editor of Inside Supply Management, insists, "In today's marketplace, innovation is the cornerstone of competitive survival. Few companies can go it alone and remain viable." ["Share and Share Alike," Inside Supply Management, Vol. 22, No. 7, September 2011] Yuva obviously believes this is as true for the supply chain sector as it is for other business sectors. He continues:

"Fostering innovation with suppliers through collaborative support and trusted relationships is no easy task. Innovation for any company represents potential revenue and leverage over competitors. It's time to dismantle the barriers to collaborative innovation and consider the value proposition and return on investment that is attainable. It's going to require giving and taking from both sides. However, the results may transform how the supply chain collaborates, and bring solutions to the marketplace."

Yuva makes a number of interesting points which I will discuss in a future post. Even if a company isn't a good fit for contests or crowdsourcing, it still needs outside-in help. Often that help comes from customers. Ravi Mattu asserts, "Being innovative today means finding how to engage with your customers as early in the process and as deeply as possible." ["Innovation is all about the customer," Financial Times, 14 November 2011] Mattu continues:

"Context matters. ... The smartest businesses, from retailers to fashion brands to credit card businesses and manufacturers, are trying to engage their customers to drive innovation in their offerings. ... B.J. Emerson, head of technology [Tasti D-Lite, a US low-calorie frozen dessert chain], says a failure to engage with customers who are already using [social media] technologies is a form of 'social negligence'. ... Using customers to inform your decisions directly is not just about the clever use of technology. Good designers such as Michael Bierut, a partner at Pentagram and co-founder of the Design Observer weblog, understand this better than most. One of his rules for being innovative ... is to 'shut up and listen'. ... For big organisations, the big challenge is how to create a culture that encourages customer-led innovation. The best answer I have heard was from R. Gopalakrishnan, a senior executive at Tata. He said that companies need to be willing to look beyond market research and focus groups to understand the context of what their customers' needs are."

Gopalakrishnan's advice sounds very much like advice you might hear from designers at IDEO or from Harvard Professor Clayton Christensen. Designers at IDEO like to take a "Deep Dive" so that they can understand the context in which consumers will be using the products they are designing. Christensen advises companies to look beyond what customers are saying to discover what they actually need. Henry Ford is credited (some claim incorrectly) with saying, "If I'd asked my customers what they wanted, they'd have said a faster horse." To paraphrase an old adage: a company wrapped up in itself makes a very small package. Outside-in strategies offer companies lots of advantages; but the greatest advantage is a fresh perspective.

February 16, 2012

The Other Side of the Supply Chain Street: Reverse Logistics

Steve Sensing, Vice President and General Manager for Ryder’s Hi-Tech/Electronics vertical, believes "reverse logistics has become an area of high priority for companies looking to reduce costs, add efficiencies and improve the customer experience." ["Reverse Logistics: The Untapped Revenue Stream," Logistics Viewpoints, 4 August 2011] Clyde Mount, President of 3PL Worldwide Inc., agrees with Sensing. He writes, "In these days of liberal return policies, satisfaction guarantees and customer rights, there are increasing opportunities to build a relationship with your customer, save the sale and even increase order values through creative up-sell and cross-sell programs." ["The Supply Chain is Not a One-Way Street," SupplyChainBrain, 1 February 2012] Mount continues:

"For most supply chain professionals, the main focus on the flow of goods lands somewhere between the procurement of raw materials and their delivery to the customer. Lately there has been considerable attention paid to modeling, simulation and analytics within the supply chain, ensuring initial deliveries will meet demand and that replenishment can be timely and achievable at a reasonable cost. But an often overlooked part of the process relates to reverse logistics—the process of managing returns, processing refunds and the various customer touch points visited along the way."

Mount indicates companies should leverage their reverse logistics processes. Doing so presents an opportunity "that directly affects revenue and the bottom line, but is often overlooked within the supply chain industry." He offers a "few basic rules" for providers, beginning with policy. He writes:

"Build a fair return policy into your P&L and reserves. Quality customers should always be top of mind when writing your return policy. Consumers want easy returns, including reasonable time frames, the ability to receive credit and no penalties for making the return or refund. How quickly should customers expect to receive credit or a new product? What condition does merchandise need to be in for it to be returned? These questions and more must all be detailed in your policy."

In today's world of multi-channel commerce, building customer loyalty is becoming more difficult. Fair return policies foster such loyalty. Mount's next recommendation is to communicate your return policies clearly and broadly. He writes:

"Clearly state your policies in all your sales channels. Articulated standards around returns and refunds should be laid out clearly for customers and potential customers wherever the company has had a presence – online, in print, on sales collateral, and so forth. In general, a 'click-to-agree' or other affirmative button signifying acceptance of your policy should be displayed on any online checkout screens. Clearly communicating your return policy to customers prior to a sale can help prevent chargebacks and reinforce your case if a dispute does occur – and it is simply good customer service, too. Create your policy in as many places as possible and enforce without exceptions."

I agree that a "click-to-agree" button is a good thing to have for on-line purchases; but don't require your customers to make multiple clicks to find out what they are agreeing to. A short, clear policy (like the signs used in brick-and-mortar stores) should be right above the "click-to-agree" button. Mount's next recommendation is to make sure you fully understand the system you are using.

"Understand how your payment processor and merchant bank view returns, refunds, and calculate their reserves. Most merchant banks require that return and refund policies are made conveniently available to customers to prevent misunderstandings. All merchant banks maintain a chargeback threshold. For instance, if a bank's threshold is 2 percent, it may sever ties with you once your chargebacks reach 2 percent of your sales for a given period. When shopping for a bank to partner with, take their section on thresholds into consideration. Ask if the bank has a chargeback management staff to aid in customer disputes. Most importantly – especially if you are doing most of your business online – ask whether or not your business would be considered a high-risk merchant account. If so, this would likely necessitate a reserve account being set up in your name, which can in some cases severely inhibit company growth, including funds for marketing, capital investment and cash flow."

Business executives shouldn't have to be told that the devil is in the details; but, I guess we all have to be reminded occasionally of that age-old adage. The details about cash flow, working capital, cash reserves, etc. are always important for a business. Mount's next recommendation involves processing returns. He writes:

"Process returns and refunds promptly. Within the supply chain, everyone is affected by customer dissatisfaction, returns and refunds, which can cost two to three times a standard order depending on the product – but merchants are affected the most. The actual return is not the risk; rather it's the lack of issuing timely credits that can negatively affect the merchant. The smartest supply chain providers utilize advanced order management systems that benefit merchant account relationships by simplifying a return and refund process, preventing chargebacks and maintaining high levels of customer satisfaction. When a customer is unhappy, the merchant runs the risk not only of losing the customer but their merchant account altogether if too many refund requests pile up and are not processed in a timely manner. Streamlining the return and refund process and cutting risk is an essential and attainable goal – one any successful marketer must drive to achieve."

Cash flow is no less important for your customers than it is for you. Whether that customer is the end-user or the merchant selling a product, refunding owed money sluggishly is going to create irritation if not anger. On the other hand, as Mount implies, prompt refunds can foster customer loyalty and satisfaction and, in the long run, save you money. Mount's final recommendation involves saving the sale and building relations by maintaining a good reverse logistics process. He writes:

"Use each of these customer touch points as an opportunity to save the sale and build a relationship. It is considerably less expensive to build on an existing customer relationship than acquire a new one; it is also a missed opportunity when you do not transform the return process into a moment for customer satisfaction. It is also beneficial to offer the customer free shipping, special customer discounts or delayed payments on a case-by-case basis. The opportunity to up-sell and cross-sell during the refund and return process is an important consideration to drive revenue and profits up as well. Following up with an e-mail thanking customers for contacting you with their issue is also a great way to drive sales and continue to build the relationship."

When Mount states that the supply chain is not a one-way street, he has more in mind than just the logistics of getting products returned. As his comments clearly indicate, he is also thinking about customer relations. He concludes:

"It is generally thought that good customer relationships are built in the beginning – when a customer orders a product of their interest and is satisfied upon delivery – but the truth is this relationship is strengthened when the marketer deals well with issues after the order, and the customer sees how effortless the return process can be. So put forth the energy to make yours as efficient as possible, advertise your policy clearly, train your employees to use these interactions as moments to not only satisfy customers but boost sales, then stand back – and watch your business thrive."

Not all companies are equipped to handle reverse logistics adequately on their own. If that is the case, Steve Sensing recommends considering a 3PL solution. He writes:

"Once a supply chain afterthought, reverse logistics has evolved into a highly complex endeavor. This is especially true in the hi-tech/electronics sector, where product lifecycles have dramatically shortened, global service networks create more supply chain complexity, products are highly customized to consumer preferences and sustainable practices are increasingly required. For many companies, the whole reverse logistics/returns management process has been a kind of black hole; a cost center that offers little visibility into which products are in the pipeline, whether they should be repaired, repackaged, restocked, recycled or disposed of in some other way, or whether they belong in the reverse channel at all. However, effectively managing the reverse supply chain has increasingly become more important to the operational and financial performance of companies."

On that last point, Mount and Sensing are in complete agreement. In fact, as you will see, they are in agreement on many points concerning reverse logistics. Sensing continues:

"More than ever, companies are using robust, efficient reverse logistics networks to:

  • Increase velocity
  • Reduce costs (transportation, administrative, aftermarket support)
  • Gain service market share
  • Improve customer service and retention
  • Meet sustainability goals"

Although Mount didn't discuss what is done with products once they're returned, I suspect he would agree with Sensing that dealing with them responsibly enhances a company's reputation and bottom line. Sensing discusses how reverse logistics can help a company meet its sustainability goals. He writes:

"Reverse logistics is ... intrinsically aligned with sustainability. Instead of carting products to landfills, companies are recovering the value of the assets through a variety of other paths, such as returning to stock, donations, secondary market sales and recycling. When companies maximize tons per mile, consolidate shipments, reduce returns and optimize product disposition/asset recovery processes, they are simultaneously reducing harmful emissions and energy usage, while increasing profitability and asset utilization. In today's markets, the total cost of logistics is increasingly defined in terms of carbon impact. As 'going green' becomes a standard business practice, consumers are asking for measurements around climate change impacts, energy consumption and emissions."

Although I agree that environmental and stockholder activists are worried about carbon impact, most consumers aren't. Once they've returned a product, few consumers think about what happens to it. That's why Sensing's arguments about recovering the value of returned assets will likely have greater impact on a company's policies than environmental considerations. Fortunately, as he points out, one benefits the other. As Sensing puts it: "The synergy is obvious: end-to-end reverse logistics/product lifecycle management solutions translate into energy savings, provide economic value and strengthen customer relationships."

Since Ryder advertises itself as a "provider of leading-edge transportation, logistics and supply chain management solutions," it's only natural that Sensing should discuss the "advantages of outsourcing reverse logistics." He writes:

"Companies that outsource some or all of their logistics services are looking for better control of their supply chains to drive quality, reduce costs, increase visibility and improve inventory management. For reverse logistics, this means increasing the speed and efficiency of recovering, inspecting, testing and dispositioning returned products. A growing number of companies are turning to 3PLs to meet those goals. Reverse logistics is well-suited for outsourcing. Unlike forward logistics, it is characterized by uncertainty of supply; no one can easily predict which products are coming back, when they're coming back or in what condition they'll arrive in. Adding to the complexity is the customized nature of reverse logistics supply chains, which operate under company-specific rules that can vary for thousands of different SKUs."

When commentators discuss supply chain complexity, most of them are thinking about forward logistics and processes. Sensing makes a great point when he notes that there is considerable complexity involved in reverse logistics as well. He notes that "effective reverse logistics management requires a broad range of operational, technical and strategic capabilities." He provides a list of those capabilities, which include:

  • Scale and flexibility to meet changing business needs
  • Industry and geographic expertise
  • Visibility into the full product life cycle
  • Refurbishment/distribution center management
  • Web-based technologies and data integration

If you don't think your company is up to snuff on those capabilities, then outsourcing reverse logistics might be a good option. Sensing asserts that if you are considering outsourcing your reverse logistics operation, you should ask a few important questions about the providers you are considering. They include:

  • Do they have measurable performance standards?
  • Do they have the ability to shift fixed costs to a transaction-based environment?
  • Can they integrate forward and reverse logistics with overall supply chain strategies?
  • Do they have leverageable infrastructure and transportation resources to move products into secondary markets, e-waste streams or back into the forward supply chain?

Obviously, Sensing believes that 3PL reverse logistics providers can often generate a lot of added value for companies. He concludes:

"Companies ... should keep in mind the cost reductions, supply chain efficiencies and improved asset recovery rates that a robust reverse logistics network can provide. In the face of ongoing competitive and economic pressures, companies should carefully weigh the benefits of working with trusted supply chain partners to navigate the complex world of reverse logistics/product lifecycle management. By doing so they can establish themselves as leaders in sustainable supply chain management, while at the same time, unlocking the hidden value of reverse logistics, one of the supply chain’s last untapped revenue streams."

Mount would add that the other main value of a good reverse logistics process, one that is not so hidden, is improved customer relations and loyalty. When companies recognize that supply chains aren't one-way streets, they come to appreciate the opportunities that exist on the other side of the road.