Archive for October, 2009

Beware Sharp Edges!

October 23, 2009


I am sometimes chastised for saying it out loud, but engineers have a hard time with context.  Every physics homework problem that advises, “ignore the effects of gravity and friction” adds another brick to the wall that separates solutions to technical problems from solutions that are meaningful to customers.  I am not making a value judgment.  In fact, we would never make technical progress at all if every possible real-world variable had to be taken into account at the outset of a project.  An engineer once worked for me who insisted on starting every engagement with “What do we mean by reliability?”  before listing all of the possible ways that a system – any system, not necessarily just the one we were supposed to be talking about – could be unreliable.  None of those discussions ever came to a satisfactory conclusion.

However, as we saw in “Well, what kind of fraud is it?”, worlds collide when there is confusion about context.  The collisions are damaging to business, and sometimes it is impossible to recover from them.  It may be a technical feat to hone the edges of a warning sign to lethal sharpness, but that is not the purpose of the sign.

Corporate culture can make it hard to blend context, and it is especially hard for companies with strong engineering roots to draw the line between valued technical advice and technical value that can be delivered to customers.  There was an internal joke at HP:

How can you tell where the sales meeting is?  Look for a dozen white Ford Tauruses in the  visitor parking lot.

The typical HP company car was a white Taurus, and it was common to hold customer meetings in which HP engineers outnumbered customers by five to one or more.

There is one sure-fire way you can tell that engineering culture is driving the business operations to a destructive collision.  I call it the catalog rule.  Imagine a sales meeting with N salesmen and M customer representatives.  One of the salesmen should be able to arrive with all of the sales material and, regardless of how large N is, there should be only M sales packets on the table — one for each of the customers.  It happens so often that there are M times N catalogs on the table that you sometimes scarcely notice it.  A customer wants to buy a solution to a complex problem.  At the first customer engagement, glossy specifications for all of the carefully engineered component parts are dumped on the table.  This is the point in the meeting where the customer is supposed to have a flash of insight, leap to his feet, and start congratulating the engineers.  In the real world, however, the reaction is a little different.  Very few customers want to be their own system integrators.  My former Telcordia Applied Research colleague Dennis Egan puts it this way: “Our engineers just want to see their stuff used.”  It seems like a simple thing to ask for, but sometimes this urge for appreciation trumps all other concerns.  In particular, it can confuse the true business context, although you might have to look hard to find it.

It wasn’t that long ago that choosing a data communications service was a confusing and expensive task.  Many telecom customers chose the safe path and called their traditional voice telephony service providers, although it was frequently a big mistake to do that.  Data services in 1995 were a jumble of software and hardware standards, confusing pricing models, and regulatory inconsistencies.  A phone call to Bell Atlantic in 1995 inquiring about ISDN service inevitably led to questions that few commercial customers and almost no residential customers could answer.  The question “How far are you from the Central Office?” would usually be met with: “What’s a Central Office?”  Because maps and engineering diagrams were frequently inconsistent, an ISDN customer would sit patiently through explanations of load coils and why the service probably would not perform as advertised anyway.  A thick reference book titled Engineering and Operations in the Bell System, published by Bell Labs, was given to every engineer in the company.  Later, after the 1984 divestiture of the regional phone companies put the physical plant in the hands of seven independent regional operators, Bellcore maintained Engineering and Operations as the network engineering manual for all telephone infrastructure in the country.  By the time DSL service became widely available in 1997, Engineering and Operations specified a work flow diagram for providing DSL service to a single customer with steps that could only be completed after a hundred other independent steps were all completed.

These were the early days of e-commerce, and a clever group of entrepreneurs formed a company with the wonderful name Simplexity to simplify the life of telecom customers in the new age of data.  They had been buoyed by Michael Dell’s brilliantly simple business plan for the company that was to be Dell Computer™: four pages that said in plain language that it was a hassle to buy computers and that virtually every potential buyer would choose to make a single phone call directly to a manufacturer if it would cut the hassle.  Buying data service was a hassle, too.  Simplexity’s founders reasoned that the 1997 equivalent of Dell’s single phone call for telecom services was a simple website.


By negotiating with service providers for a percentage of all subscription fees – a process that was well understood in the industry because resellers of voice and data services were common – Simplexity was able to project a steady growth in revenue as data customers chose the Dell direct-sales shopping model.  Their first few customers apparently verified the market hypothesis, and Simplexity was one of the start-up successes of 1997, raising substantial venture funding and positioning itself for a successful IPO.

The engineering was flawless.  Simplexity’s Virginia-based development lab looked a lot like a Silicon Valley start-up: an open floor plan with ping-pong tables, bean bag chairs and board games scattered everywhere.  Java programmers seemingly fresh out of high school chattered excitedly about the next generation of services that would be marketed through the site.

Then Simplexity’s revenue growth stalled.  The large number of smaller contracts that investors had anticipated did not follow the small number of large, early contracts.  In fact, new revenue began to decline even as data services began to explode.  Surprisingly, reseller revenue continued to rise as new customers shopped around and additional data service contracts were added to existing customer accounts in record numbers.  Simplexity began cutting its technical staff and adding traditional sales staff to compete head-to-head with the resellers.  This undercut the cost savings as Simplexity found itself paying more in commissions to order-book-carrying salesmen.  By early 2000, Simplexity had run out of cash, and, shortly after that, the company ceased operations.

In my discussions with company executives it was clear that they understood only too late that Michael Dell’s model did not work in telecom.  Customers had been purchasing voice and data services from human salesmen for years and the inherent inefficiency in doing that was more than offset by the personal relationships that drove sales.  A website – no matter how efficient – could not replace the long-standing social ties between buyers and sellers.  Simplexity was a great technology in a marketplace that did not need it.   The Dell model was a red herring.  Dell worked in the PC marketplace because there was no longstanding and trusted way of buying computers that had to be displaced.

Why didn’t Simplexity’s market research expose such a basic flaw in their business model?  I attended Simplexity’s early customer briefings – meetings for engineers aimed at selling their technical advantages.  They went out of their way to avoid positioning themselves as just another vendor.  Meanwhile their bricks-and-mortar competitors were fighting it out over who would get the next order.  It was “just another vendor” who got the order.

This is the message that I give to new start-ups: if it’s a choice between an exciting technology meeting and a boring sales meeting at which you are just another vendor, choose boring.  Your customer may not understand it, but if your product is really that good it will outshine the competition anyway.  And, if you are in a vendor meeting, chances are someone is interested in buying.  It may be more exciting to warn everyone about your sign’s incredibly sharp edges, but that’s not the real reason it’s there.



Guess Who’s Coming to Dinner, Part 3

October 19, 2009

Note: This is a continuation of my Guess Who’s Coming to Dinner posts about the power of including innovators in strategic decision-making.

It took George Heilmeier an afternoon to convince Secretary of Defense James Schlesinger of the value of DARPA’s six “silver bullets”, capability-changing technologies that could guide system designers for the next decade:

  • Create an “invisible aircraft”.
  • Make the oceans “transparent”.
  • Create an agile, lightweight tank armed with a tank killer “machine gun”.
  • Develop new space based surveillance and warning systems based on infrared focal plane arrays.
  • Create command and control systems that adapted to the commander instead of forcing the commander to adapt to them.
  • Increase the reliability of our vehicles by creating onboard diagnostics and prognostics.

“Invisible aircraft” refers to the stealth technology that led directly to the F-117A Nighthawk and is a good illustration of how innovators can influence events by focusing on business objectives.  In those days, half of the aircraft in a strike mission were there, not to fire weapons, but to detect and disrupt enemy radar.  Reducing aircraft radar cross-sections by a factor of 10,000 would lead to a ten-fold reduction in radar detection range and a corresponding increase in mission effectiveness.  Classified research in stealth technologies – mainly materials science – had been under way since the 1950’s, but DARPA’s idea was to use stealth as the primary criterion for aircraft design.  Performance and stability are the first casualties in this kind of design, so George knew that, not only would he have to integrate all of the component technologies it would take to produce a flyable, battle-worthy airplane, he would also have to convince the Air Force – run by and for pilots – of the usefulness of this way of designing an airplane.  Pilots understandably wanted to think that aerodynamics would be uppermost in the minds of designers, but DARPA wanted to turn that principle upside down.
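The ten-fold figure is not arbitrary: in the standard radar range equation, maximum detection range scales as the fourth root of the target’s radar cross-section.  A minimal sketch of the arithmetic (the function name is mine, for illustration only):

```python
# Radar range equation: the echo travels out and back, so received power
# falls off as range**4.  Maximum detection range R_max therefore scales
# as the fourth root of the target's radar cross-section (RCS), sigma.

def detection_range_factor(rcs_reduction: float) -> float:
    """Factor by which detection range shrinks when RCS is divided by rcs_reduction."""
    return rcs_reduction ** 0.25

# A 10,000x reduction in RCS cuts detection range by a factor of ten.
print(detection_range_factor(10_000))
```

Put differently, shaving four orders of magnitude off the cross-section buys only one order of magnitude of detection range, which is why stealth had to dominate the entire airframe design rather than be bolted on afterward.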

The world changed after that.  By the 1980’s many high-performance military planes operated so close to their performance envelopes that they were difficult or impossible to control without computerized assistance.  There was, in fact, a sort of dark murmur among military pilots who understood both avionics and computers.  I was directing software test and evaluation oversight projects for the Director of Defense Test and Evaluation at that time.  One of our systems was an advanced fighter aircraft that was being retrofitted with computerized flight controls.  Some of the test pilots had done graduate work in computer science, and were clearly comfortable shifting between flying jet fighters and thinking about computer software.  One of them had a poster taped to the wall of his cubicle.  It showed a mocked-up pilot’s eye view from the cockpit of a military airplane that was clearly spiraling into the ground.  On the heads-up display was a graphic that looked something like this:



>>PROGRAM ABEND AT LOCATION 001001010111011.


We were talking about an operational test that he would be flying the next day, but all I could do was stare at the poster. I was a software tester.  I knew that fatal error messages like this were common. They came bundled with the price of the software. Most graduate students knew it, too. I thought to myself “This is the bravest guy I have ever met.”

In approving the silver bullets Schlesinger had promised to keep Pentagon staff off  Heilmeier’s back, but the Air Force resisted DARPA every step of the way:

During this period, the Air Force was not at all supportive of DARPA designing and building aircraft and would not cooperate with us.  We needed their help but received none.  As a last resort, I went to see AF Chief of Staff, Gen. David Jones to plead the case. When I entered his office, I was shocked to see that the general, who had refused to help us in no uncertain terms, was present.  I thought that the program was dead and me with it.[1]

Jones listened to George’s pitch, turned to his reluctant general and said, “We’re going to help these guys.”  It was not a question.  Whether this was a directive from Schlesinger or a result of George’s powerful presentation is not really important.  The Air Force cooperated from that point on, and on the morning of December 1, 1977, George watched from the end of a runway at Edwards Air Force Base as the first prototype of a stealth aircraft took off.

Tying a technology agenda to business goals empowers both sides, and it puts both the passive and active resisters in an organization in a bind.  The cost of resisting change is to put their own goals at risk, often with unpleasant career consequences.  It also allows technology leaders to form new agendas that bypass an unmovable bureaucracy.  Here is how Heilmeier summarizes these lessons:

  1. When you really believe in a concept and the people involved, practice “no excuses” management.  The meaning of this is that you must remove all of the bureaucratic impediments to success.
  2. “Breaking glass” and going around the bureaucracy can be done if you believe in your cause and refuse to quit.
  3. In a game changing initiative, a small group must take on a larger group who won’t always “play fair”.

The danger in this approach is  that success depends almost entirely upon personal commitments, and those commitments can easily be undermined by a change in leadership.  When that happens — as I know from personal experience —  entrenched interests  come roaring back, hell-bent on toppling whatever was achieved.  The time frame for achieving goals has to fit within the tenure of the “small group” because worlds will inevitably come crashing together.

I will have more about this in a later post.

[1] George H. Heilmeier, “A Moveable Feast – Kyoto Prize Lecture (SD Version)”, 2005

Edupunk: It’s Alien vs Predator With Relevance of Universities at Stake

October 14, 2009

Much to my daughter’s dismay, I like Green Day.  Maybe they’ve mellowed since the early ’90s.  Maybe I just need overdriven guitars and liberally sprinkled f-bombs to balance my iTunes™ playlists.  There is no doubt, however, that there was much concern in the family when I proclaimed 21st Century Breakdown album of the decade: “My Dad can’t like my favorite band!”  I’ll admit I was slow to come around.  Back in the days before Georgia Tech had a College of Computing, the School of Information and Computer Science had a punk rock band with a marginally offensive name, and it didn’t catch my fancy.  Band members are now highly regarded professors at Georgia State, Vanderbilt and Clemson.  It took me twenty-five years but I’m starting to see the point.

On the other hand, I got the point of Edupunk right away:

[it] is about the utter irresponsibility and lethargy of educational institutions and the means by which they are financially cannibalizing their own mission.[1]

According to Jim Groom, the educational technology specialist at Virginia’s University of Mary Washington who invented the term “Edupunk”, “The whole idea is a reaction to the over-engineered, badly designed and intellectually constraining technology that has been foisted onto the American higher education system as a substitute for deep reflection about what universities should be evolving into.”  Just as the early punk rockers invented forms for themselves, Edupunk is a catchy — and cheerily anarchistic — way of thinking about DIY in educational technology.  Like the punk rockers, Edupunkers don’t mind alienating the establishment.  They are not without adult supervision, though.

There is a growing punk movement among mainstream educators, a reaction to recent trends in American higher education that in their view are taking colleges down a dead-end path.  It is a sentiment that I share.  I’ll have more to say about the Edupunk movement in my book on the Fate of American Colleges and Universities in the 21st Century, but there is an interesting WWC collision at work here, and since I had such a great response to Dancing With the Stars, I thought it was worth mentioning.

No less authority than Clayton Christensen (of Innovator’s Dilemma fame) has noticed that higher education has gone all-in for an organizing principle that equates factory-like efficiency with effectiveness.  His 2008 book with Curtis Johnson and Michael Horn[2] is  a complete and damning analysis of the approach to standardized higher education that fires the Edupunk movement.

I was stuck between worlds when I was Dean of the College of Computing at Georgia Tech.  On one hand, I was a prime customer for technology that would genuinely improve operations in an environment where generating a payroll report or even simple analytics to predict enrollments seemed beyond the organization’s capability.  On the other hand, I watched in horror as expensive, awkward course management systems (CMS) were purchased and deployed – the educational equivalent of the industrial-weight enterprise resource planning (ERP) systems used to connect customer acquisition and financial processes to supply chain systems in large corporations.  You could almost hear Clay Christensen’s “Tut-tut!” as briefing after briefing made it clear that CMS was there to group and chunk and synchronize when, in the classroom, the real need was for specialization and personalization.

Six Sigma has hit higher education, and trends like CMS and outcome-based assessment combined with layer after layer of accreditation and bureaucratic program review – with their focus on documents, processes and repeatability – are exactly what has the Edupunks up in arms.  Edupunk has with increasing frequency attracted the attention of VC’s like Union Square Ventures (think Twitter), whose Hacking Education conference brought together long-tail innovators and others who believe that one-size-fits-all standardized institutions have a real problem.

I’ll let you decide which roles are played by Alien and Predator, but I want to be clear about my vote: factory models have no place in colleges and universities.  There are no statistical control charts for higher education, and models borrowed from manufacturing and social science are leading college administrators seriously astray.  The real disruptors are MIT’s Open Courseware, peer-to-peer tutoring of the sort I talked about in last week’s post, games, social networking sites like Atlanta’s, and online exchanges.  These are the worlds that are colliding, and when they do, the next economic bubble to burst will be American higher education.

[1] “How Web-Savvy Edupunks are Transforming Higher Education” by Anya Kamenetz, Fast Company, September 1, 2009

[2] Clayton Christensen, Curtis Johnson and Michael Horn, Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns, McGraw-Hill, 2008

Guess Who’s Coming to Dinner, Part 2

October 11, 2009

Being “technology driven” is often not the best path to real innovation.  Part 1 of this post was distilled from a conversation with George H. Heilmeier, former director of DARPA, CEO of Bellcore, inventor of the liquid crystal display and winner of the 2005 Kyoto prize. It was based in part on the “Heilmeier Catechism”, an approach to technology strategy that begins, not with the technology but with the business problem to be solved.  It was shared widely with the many younger managers who came under George’s influence over the years, and I have heard from a fair number of them in recent weeks.  All had their own stories to tell about why the approach of “selling to investment bankers” was exactly the right way to think about positioning R&D in a larger organization.  In all of our discussions, George has always been insistent about two things: the negative power of vested interests and the failure of  technology transfer by “throwing technology over the transom”.  Out of this came his notion of an “interdisciplinary team” with representation from R&D, product engineering and manufacturing, where leadership and balance shift as time goes on. This is the dinner table.   As important as these ideas are for day-to-day management of R&D, they are critical when it comes to initiating projects that are transformative, where commitment to change comes from handshakes at the top of the organization.

Shortly after the Regional Bell Operating Companies began divesting themselves of Bellcore, but before George stepped down as CEO, the appetite for applied research began to change.  To some extent, this was part of a natural evolution of the company from a captive R&D lab to a stand-alone corporation whose owners – eventually the employee-owned defense systems integrator, Science Applications International or SAIC – demanded not only profitability but also growth in a market that was already growing at 15% per year.  The “30/30 Frontier” (30% revenue growth with 30% operating margins) was a wake-up call for all R&D managers in the company and it was a personal lesson for me in how to engage corporate management with initiatives that were tied to bet-your-job objectives.

I was in charge of computing research at the time, and three things were important to me.  First was Heilmeier’s commitment to funding forward-looking work at the corporate level, which meant that annual spending goals had to be set by reaching a consensus among product, research, sales, and marketing teams.  Second was the freedom that Bob Lucky — Bellcore’s senior VP of Research — gave to his senior leaders to push the boundaries of the business.  Third was the collaborative but demanding relationship that I had with Chief Operating Officer Sanjiv Ahuja, who was himself a veteran software development manager.

Sanjiv was in turn looking for three business advantages that at first blush seem to be mutually contradictory.  The first two were obvious: near-term competitive advantage for the company’s large software systems and  game-changing inventions that would shake up the marketplace in the long run.  The third was revenue against which corporate R&D investments could be scored.  Near-term objectives were rolled up into product R&D costs while long-term objectives were used in 3-5 year investment planning.  Scoring R&D spending against revenue hardly seems like a competitive advantage but in my view it was the critical piece of the puzzle because it forced us to run a business.  It forced us to operate a business unit with profit and loss goals, not just another corporate cost center (which tend to develop unhealthy  entitlement cultures).   It also forced us to be very hard-nosed about tracking research contributions that led to revenue in existing product lines.  I would like to think this is a classical WWC strategy because it made us  focus externally on business objectives that affected the entire company.

I’ll have a lot more to say in later posts about some of the tools we used to do this, but the example that Heilmeier kept in front of us – because it took some convincing to make sure the lessons stuck – is for me the most compelling part of this story and the one I returned to time and again as I found myself inventing new frameworks in other organizations.

As DARPA Director, George reported to Nixon’s Secretary of Defense James Schlesinger.  Schlesinger himself had impressive academic and technology credentials.  He had served as head of the Atomic Energy Commission and Director of Central Intelligence.  Schlesinger’s DARPA operated like a technology incubator full of “technology entrepreneurs”, as Heilmeier called his staff.  Under Heilmeier, DARPA settled on six over-arching themes, all of them aimed at somehow changing the nation’s military posture in ways that would be understandable not only to the Secretary but also to the staff and line officers who were frequently unhappy with DARPA’s “help”:

  • Create an “invisible aircraft”.
  • Make the oceans “transparent”.
  • Create an agile, lightweight tank armed with a tank killer “machine gun”.
  • Develop new space based surveillance and warning systems based on infrared focal plane arrays.
  • Create command and control systems that adapted to the commander instead of forcing the commander to adapt to them.
  • Increase the reliability of our vehicles by creating onboard diagnostics and prognostics.

Each of these “silver bullets” was so directly tied to a military objective that it took only a single meeting with Schlesinger to get his buy-in on the entire agenda.  In my next post I will describe how these technology challenges were turned into military capabilities and why it’s an important lesson for today’s climate where innovation and execution often seem to be at odds.

Dancing with the Stars (of Pure Math)

October 5, 2009

Even casual iTunes™ users know about iTunesU™, the increasingly rich video-taped course offerings from universities as great as Stanford and Oxford and as humble as the dozens of community colleges and adult education programs that make their curricula available for free downloading. I should have seen it coming in the spring of 2001 when Charles Vest – then president of MIT – paid me a visit at HP to tell me of his plans to make MIT’s entire course catalog available for download on the internet, but I was not thinking much about Higher Education as a market in those days.

Things changed in late 2002 when I started to draw a paycheck from a university and began to think hard about the fate of American colleges and universities in the 21st century.  What Chuck Vest predicted one afternoon in my Palo Alto office is now being played out in what I believe is the next economic bubble.  This is quite literally the collision of that half of the earth’s population that has in the last decade joined the free market economy with the inwardly focused world of American higher education, which – unless there are some dramatic changes – is destined to be a marginalized bystander to events that it is ill-equipped to understand.  Here is the stark reality: enhanced technology means that the market for higher education now has many suppliers, and the hundreds of millions of people who all of a sudden want a university education also find that they have abundant choices, often with lower cost and high quality.  In any market with abundant choices, the winners are inevitably those with compelling brands, price, or value.  There are about 3,500 accredited colleges and universities in the US, and, except for the handful (fewer than a hundred) who have global brands, most of them have not figured out how to deliver their value at an acceptable price.  In fact, an alarmingly large number of them cannot even articulate their value to the world that is rushing toward them.  That spells trouble.  I will have much more to say about WWC and higher education in later posts.

I am working on a book on this topic so these problems are much on my mind these days, but an email message from a colleague prompted me to write that there may be a series of smaller collisions rather than a single cataclysm.

There is a lot of criticism about the quality of iTunesU lectures and online courses.  Some criticism can be dismissed as an “innovator’s dilemma” confusion of the current state of the technology – much of it admittedly primitive – with its disruptive power.  I find this criticism easy to dismiss because you can see the quality of online instruction improving month by month.  Never underestimate the power of technology curves.  The more difficult question is how exactly the technology can replace a skilled human mentor who has the ability to interact directly with her students.

Then two e-mails from my friend Dick Lipton showed up.  “Hit 7,000 page views today!” said the first one.  A few hours later: “We were number 20 on WordPress!”  That’s 20 out of roughly 3 million WordPress posts.  Dick is a world-class computational theorist, a member of the National Academy of Engineering and one of the best teachers I have ever known.  He is a star.  He has been blogging pure math for the last year at a website called “Gödel’s Lost Letter”.  Not exactly the stuff you would expect to be in the top .0007% of all of those posts about Michael Jackson, Death Panels, and the 2016 Olympics.  His latest series “Reasons for Believing P = NP” has been exceptionally popular, drawing hundreds of comments from experts, novices, interested amateurs, and a few cranks.  We have been collaborators for many years.  Our offices used to share a common wall.  I know Dick’s voice when he is engaged with his students.  It has a distinctive rhythm and is louder when he is trying to extract a missing argument from a reluctant pupil.  It was the voice I heard when I read his blog, and as I thought about his 7,000 viewers it occurred to me that Dick’s seminar was no longer 10 or 15 graduate students crowded around a white board.  This is not an on-line lecture or an iTunes™ video.  I thought, “This is what the teacher-mentor relationship is like when the technology enables a classroom of 7,000 students.”  When there are abundant choices, students will choose this.