Two recent comments, posted at Mini-Microsoft, have given me some food for thought. Please read through and ...
- Oh by the way, there is the SmartWatch to look out for. You know - BillG's innovation that made the IEEE Spectrum list of the ten dumbest ideas of the year? LOL, that's the kind of innovation Microsoft is famous for!
- The sad fact is that our execs are now so very disconnected from the real morale/culture problem. They think that somehow just shipping more product will help. NOOO, that's not it. It's the fact that our execs don't understand what it's like to be an IC, to want to just get something done, get credit for it and get paid well to do it...and NOT be getting micro/process managed to death while you're doing it. That's it, plain and simple.
... answer: Could multi-(b/m)illionaires innovate on behalf, and for the benefit, of your average JOE?
Since the collective intelligence is more apt to tackle such a question, I invite you, the reader, to think about this. Your comments will be appreciated!
17 comments:
Steve Jobs has proved he can. The difference between Jobs and Gates is that Jobs could innovate before he became a zillionaire.
Gates is trying to go from an imitator to an innovator, even as he has become ever more disconnected from those average Joes. Low probability of success.
I think that the answer to your question is yes. If the question is: Does being a multi-(b/m)illionaire provide an advantage for someone to innovate for the average Joe? The answer to this question is no, in my opinion. The key component to creativity and innovation lies in the individual, not in his financial circumstances. There is no argument that having money helps a creative person spread his innovations to the common man, but the money is not the source of his creativity.
I think that the basis for your question has more to do with business organization. In a business that is inherently creative, how do you provide the maximum productivity for your organization? You certainly do not do it by running most ideas from the top in a traditionally hierarchical management system like Microsoft's. That is a waste of the talent that you have spent so much time and money to recruit. My own view is that in a business based on creativity and innovation you have to be entirely focused on the individuals. It's not as easy as having a cookie-cutter management scheme, with all the inconsistencies and paradoxes that come with it. Despite all the work involved, the most important thing for managers to be concerned about in a company based on creativity is the people doing the work. Not the schedules. Not the product features. Not the competition. Concentrate on the people and everything good will follow.
"Steve Jobs has proved he can."
Actually, I am not sure how many of the latest products at Apple have come from Jobs vs. his employees. It might be that Jobs just saw more clearly how a collection of good ideas could become a successful product in the market.
"I think that the answer to your question is yes. If the question is: Does being a multi-(b/m)illionaire provide an advantage for someone to innovate for the average Joe? The answer to this question is no, in my opinion. The key component to creativity and innovation lies in the individual, not in his financial circumstances. There is no argument that having money helps a creative person spread his innovations to the common man, but the money is not the source of his creativity."
The damn SmartWatch idea doesn't seem to come from somebody who's tuned into the lifestyles and aspirations of Generation Y. Now, may BillG try his hand at something like this? Sure, but I cannot call it the future!
"I think that the basis for your question has more to do with business organization."
It definitely has an organizational aspect. I don't know the organizational details of Microsoft, but if one needs to go all the way up to Mr. Gates (and compete with his SmartWatch) for resources, that doesn't seem conducive to innovation. Does it? Not to mention the waste of time on Mr. Gates' part.
Look at this young employee (she even posted @ minimsft) who's in charge of Smart.com. She sounds as if she needs to meet BillG for some reason. It is as if her unit manager cannot appreciate the merits of what she's trying to do, equip her with resources and responsibility, and get that thing running... I hope you've gotten my point by now: innovation at Microsoft should come from each and every young, bright, and ambitious individual contributor.
On a different level, too many references at minimsft are being made to BillG (and SteveB), as if one (+1) person can make it through a 60,000-person organization. This might have something to do with how the organization was initially set up and run, but it's high time power and responsibility got delegated to lower levels. If the (former) 7 units were individualized enough to report their own financial statements, why aren't their heads responsible? On the other hand, it's also about people who need to be more vocal/responsible about making things right, at their levels and up in the Company.
"The damn SmartWatch idea doesn't seem to come from somebody who's tuned into the lifestyles and aspirations of Generation Y. Now, may BillG try his hand at something like this? Sure, but I cannot call it the future!"
Yes, but that's BillG's problem whether he has money or not. He's just not that creative of a guy in any useful way. He can't help it - he's a geek after all! It's just that your original question, "Could multi-(b/m)illionaires innovate on behalf of, and for the benefit of, your average JOE?," went too far is all. Money doesn't preclude someone from being innovative - in fact I'd argue that the two are independent attributes. Other than my nitpicking (I firmly believe that the Devil is in the Details!), we are in agreement here.
I said:
"My own view is that in a business based on creativity and innovation you have to be entirely focused on the individuals."
You said:
"I hope you've gotten my point by now: Innovation at Microsoft should come from each and every young, bright, and ambitious IC."
I think that we are in complete agreement here. As a general management principle for a company that requires creativity, the key focus is on helping ICs deliver the maximum value of their efforts to the company, and you don't do that with a hierarchical management system like Microsoft's, which is extremely traditional in every way despite its claims otherwise.
That said, one aspect of software development that no one who hasn't spent a lot of time writing code will ever really understand is the incredible level of attention to detail that it takes to do a good job - especially on large projects. It isn't too hard to write code, but it is very, very hard to write really good code that is secure, reusable and maintainable in the future. Young people in the business tend to have a problem with this because they haven't had the chance to really screw up good yet. This is why a company that needs creativity in the software domain needs to focus on its ICs even more, not less. Real process in the software world shouldn't be a sudden sledgehammer blow of one-size-fits-all mindless and mechanical procedure. Real process develops ICs so that mechanical procedures are not necessary, and better, cleaner code results. You can't get that without being completely focused on helping the ICs develop their full potential professionally, for themselves and thus for the company.
"It isn't too hard to write code, but it is very, very hard to write really good code that is secure, reusable and maintainable in the future. Young people in the business tend to have a problem with this because they haven't had the chance to really screw up good yet. This is why a company that needs creativity in the software domain needs to focus on its ICs even more, not less."
Hell yes! This is the best thing I've read in a while.
The way to get awesome software -- along all axes: creativity, security, robustness, aesthetics, etc -- is to invest the ICs with a sense of pride, the tools to constantly improve, and a work environment which allows for creativity and a degree of spontaneity.
Quality gates and process-as-a-crutch drive towards well-defined mediocrity. Craftsmanship and pride lead to true quality.
Software is unique in its plasticity and the amount of creativity its design necessitates, as well as in its staggering complexity if it's allowed to grow out of hand. Other companies have a tremendous advantage over Microsoft because they've sidestepped the complexities that are killing Microsoft while Microsoft is busy tripping over its millions of metaphorical feet.
"...while Apple markets a "portable music player." The iTunes Music Store doesn't even mention the word "AAC" or "bitrate" anywhere--you just download music tracks. God, just compare the iTunes interface to Windows Media Player."
That's because it's all about emotions. Simple design and such are a posteriori rationalizations. I would not give Apple credit for how dumb their customers are--just look at the (pink) leather covers that sell for $70. Anybody recall the Razor craze of the late '90s?
At Sony, the top guy is new (Stringer). He may be able to put the whole industry in order--some steps must have already been taken; just look at Sony's latest MD players/recorders. For example, you could get for $150 a recorder that uploads (alas, only once) the content you take in via a mic line. Better battery life/cost, and the ability to burn 8 CDs onto a minidisc at higher quality than top MP3. Now, the icing on the cake is that it natively plays WMA and MP3! I would not even go into speeds and feeds, but only mention its hi-fi amplifier and 6-band equalizer. Sorry for all this story; SNE is not even my customer...
On the other hand, if servers sold like shampoos, Cindy Crawford would have already been on MSFT's payroll!
"Apple is steering digital media and has therefore threatened Microsoft's push into the living room, so some bitterness is to be expected."
MSFT messed it all up with its DRM (and so did others) and has never had a clear digital strategy going forward. Or, if it has one, it is probably the result of a thousand constraints placed by industry "partners". Have a look at the FOO media player bar, and you'll get my point. But that's Robin Hood, I know ;-)
Nice comment about the SmartWatch.
When I first saw that thing, I said to those around me, "Has anyone ever been into a bedroom and looked at what people put there - do they want a watch charger on their Ethan Allen dresser?"
The watch was HUGE and clunky and shouted "I AM A GEEK WHO WILL WEAR ANYTHING."
I don't know if it's failed yet. But what's it done recently?
It's the MS Bob of hardware, I suppose.
"MSFT has messed it all up with its DRM (and so have others) and has never had a clear digital strategy going forward. Or, if it has one, it is probably the result of a thousand constraints placed by industry "partners". Have a look at the FOO media player bar, and you'll get my point. But that's Robin Hood, I know ;-)"
You can actually insert (almost any MS tech) in place of DRM. It's not that the talent isn't there, it's that bureaucratic necessity requires that your mgr's buddies have more input into the direction of architecture than the people who were hired to innovate.
And then you have to navigate through cultural and social hurdles that prevent good ideas from being properly explored and/or implemented. All in all, MS is a mess, and it's no one's fault but that of those in charge.
"That's because it's all about emotions. Simple design and such are a posteriori rationalizations. I would not give Apple credit for how dumb their customers are--just look at the (pink) leather covers that sell for $70. Anybody recall the Razor craze of the late '90s?"
1.) It's not about emotions. I just explained why the iPod/iTunes combo is selling--superior function and design. It's that easy. iPods are simply the best music players out there.
2.) Calling over 80% of the market "dumb" strikes me as bitterness over a succeeding competitor. Apple made portable digital music easy, accessible, and fun. Something nobody else was interested in doing. While people market "WMA-based FM-tuner digital media player with Napster-To-Go capability," Apple markets "portable music player." I think you're not recognizing the point of why I mention that. Everyone else is selling crappy geek devices that look ugly and are cumbersome to navigate. Apple was the first with the non-geek music player.
"At Sony, the top guy is new (Stringer). He may be able to put the whole industry in order--some steps have already been undertaken if you look at Sony's latest MD players/recorders. For example, you could get for $150 a recorder that uploads (alas, only once) the content you take in via a mic line. Better battery life/cost, and the ability to burn 8 CDs onto a minidisc at higher quality than top MP3. Now, the icing on the cake is that it natively plays WMA and MP3! I would not even go into speeds and feeds, but only mention its hi-fi amplifier and 6-band equalizer. Sorry for all this story; SNE is not even my customer..."
Haha, sure, Sony's going to put the industry in order, after they let go of 10,000 of their workforce. I'll let you in on a secret--consumers don't care about any of the things you just listed.
I think the fact that competitors continue to believe people buy the iPod only because of its "look" or due to "emotion" illustrates why they continue to fail, and why Microsoft is close to completely losing the battle for the living room.
"1.) It's not about emotions. I just explained why the iPod/iTunes combo is selling--superior function and design. It's that easy. iPods are simply the best music players out there."
I don't think I can lend myself to a quasi-religious war by taking the conversation beyond this point. I respect your perspective, and wish you well and happy iPod-casting!
"2.) Calling over 80% of the market 'dumb' strikes me as bitterness over a succeeding competitor. Apple made portable digital music easy, accessible, and fun. Something nobody else was interested in doing. While people market 'WMA-based FM-tuner digital media player with Napster-To-Go capability,' Apple markets 'portable music player.' I think you're not recognizing the point of why I mention that. Everyone else is selling crappy geek devices that look ugly and are cumbersome to navigate. Apple was the first with the non-geek music player."
You are missing an important point. I have NO stake in any of the players involved in this conversation. My only stakes are as a consumer and a business professional. As a consumer, I started being comfortable with such players only with the most recent iteration of Sony minidiscs. I had not liked ANY of the players on the market before that, and I think that even Sony has some way to go. If you really want to see cool, from looks to speeds and feeds, check out Sony's Qualia! We'll hear more and more about it, I am sure.
In any event, we've strayed away a bit. You think so strongly about these things? Drop me a line and let's capitalize on such ideas in the context of a business!
"If you really want to see cool from looks to speeds and feeds, check out Sony's Qualia! We'll hear more and more about it, I am sure."
Hey fCh, read this:
http://quote.bloomberg.com/apps/news?pid=10000006&sid=ajtFgFy6CNuA&refer=home
Stringer and Chubachi have labeled 15 unprofitable products that they may cut, declining to give details. Chubachi did say that Sony will reduce its research and development at its robot business, and is not developing new models in its high-end Qualia electronics brand.
Still think we will hear more and more about it?
"Still think we will hear more and more about it?"
Anonymous, thank you for the link. It reads something like this: "is not developing new models in its high-end Qualia electronics brand." A few paragraphs down, when talking about Sony's Wega flat-panel TVs, the wording is "phasing out."
It looks like Sony won't extend Qualia (for a while)...
Indeed, considering the cost-controlling measures imposed across the board, it makes sense that no more R&D can be spent on TOP concepts leading to TOP products until better operational efficiency is achieved and investments in more profitable areas (e.g., Cell processors and OLED screens) are made.
Qualia approaches pure craft and will top its category for years to come. Thus, I might qualify my previous statement and say that connoisseurs will talk more and more about it ;-)
To return to Qualia's future: IMO, they'll stay with the handful of models already developed--mighty plenty, if you ask me.
On the other hand, as I wrote here about Ken Kutaragi, instrumental in PlayStation's latest success, an astute business competitor should take this opportunity and make cannot-refuse offers to some of these overruled Japanese heroes.
Always look to the future!
BusinessWeek says:
Sony's Stringer might consider taking a page from Samsung Electronics Co. Yes, the two companies have vastly different portfolios, with Samsung earning most of its profits from chips and Sony owning music and movie studios. And it's true that Samsung remade itself only after a near-death experience, following the Asian financial crisis in the late '90s. Still, the Korean company has taken many of the steps that analysts believe Sony needs to take, ranging from collaborating more with partners to doing a better job taking its cues from the market. In doing so, it has become one of the nimblest players in the business. "When Samsung wants to get something done," says Intel Executive Vice-President Sean M. Maloney, "the decision comes down from the top, and everybody moves at lightning-quick speed to just do it."
Camera-Phone Pioneer
What Samsung has done isn't rocket science -- more like Business 101. For example, the company routinely dispatches designers and engineers to labs in New Jersey, Seoul, and other places to gauge consumers' tastes for new products. Thanks to such research, Samsung was one of the first to pack digital cameras and music players into cell phones, creating instant hits.
Doesn't Sony do similar studies? Of course. Yet, with its gearhead culture, Sony continues to act like the great brand of yore, believing it can dream up products behind closed doors and unleash them on a grateful market at premium prices. This helps explain why Sony stuck with its Trinitron televisions long after flat-panel TVs had won the day. Moreover, Sony has a bias toward its home market. That's why its gadgets often feature complex software that Japanese love but that drives Americans crazy.
Samsung's top managers also come down hard on their units to make sure they're working together to come up with new products. Again, sounds pretty basic -- but over at Sony, factionalism still rules the day. The company's music, movie, and gadget businesses have conflicting agendas and often balk at cooperating. In one famous example, Sony's music division, fearing piracy, kept the consumer electronics side from making digital-music players that would let consumers play the popular MP3 format. Hello iPod; goodbye Walkman.
The last thing Sony can afford to do is miss out on the emerging portable video-player market. Yet once again, Sony's content guys are making sure that consumers can play only pricey Sony-formatted disks on its PlayStation Portable. To win in this business, say analysts, Stringer will have to overcome the qualms of the content side and open Sony's players to other formats.
Birth Of The Nano
Another knock against Sony: It doesn't play well with other industry titans. Samsung has no such qualms. Last February, chip chief Hwang Chang Gyu went to see Steven P. Jobs to try to get him to use the company's flash memory chips in Apple's (AAPL) music players. Jobs wasn't interested at first, but Hwang kept pressing him and eventually Jobs saw the potential. Bingo: The iPod nano was born, and Samsung won a big order for flash chips. How could Sony mimic this approach? By licensing designs or technology, such as its cell chip, to others.
Fch, you wanted me to post my response to your comment on my blog - well here it is ;)
http://investinsearch.blogspot.com/2005/09/will-google-hurt-microsoft-with-free.html
Thanks for your comment!
Cheers
Anders
Richard Sapper's Bright Ideas
His Halley LED lamp is the latest design innovation in a long career crowded with them. In a Q&A, he sheds light on the creative process
In his new book, The Ten Faces of Innovation, Tom Kelley devotes one chapter to the cross-pollinator--the person who borrows a clever solution or material from one industry and applies it to another. The escalator, for instance, was originally conceived as a Coney Island amusement-park ride. Reinforced concrete was created by a gardener aiming for stronger flower pots. Richard Sapper is perhaps the supreme example of the cross-pollinator. Again and again, the German designer has created innovative products by mining the knowledge of far-flung disciplines.
You may never have heard of Sapper. Unlike Michael Graves or Philippe Starck, his name doesn't precede him. But you would more than likely recognize his work, whether it's the Tizio desk lamp, the Melodic kettle for Alessi, the Minitime kitchen timer, the iconic ThinkPad, or any of the countless IBM (IBM) computers produced since he became Big Blue's design consultant in 1981.
Born in Munich in 1932, Sapper has a portfolio few designers can rival. After a wide-ranging education - he studied philosophy, graphic design, engineering, and economics - he joined the design department at Mercedes-Benz (DCX). Since then, he has worked for Gio Ponti, Pirelli, and Fiat; Alessi, IBM, and Knoll; and countless others.
Most recently he collaborated with a lighting startup, Lucesco, to design the Halley task light. Taking the LED semiconductors typically used in traffic lights, he has created a desk lamp that will last some 50,000 hours. And the LEDs aren't the only element he borrowed from the computing industry.
Sapper recently spoke with Jessie Scanlon, BusinessWeek Online's Innovation & Design editor, about his new lamp, where he has been, and where he has always wanted to design. Following are edited excerpts:
How did the Halley project come about?
[Lucesco Vice-President for Sales and Marketing] David Gresham used to be a designer at IBM. Some 20 years ago I met him on a visit to the design center in Tucson, Ariz. He left IBM long ago, but almost two years ago, I got a call from him. He said that he was working for a startup in Silicon Valley making LED lighting, and would I be interested in designing a lamp for them.
Did you say yes right away?
First the Lucesco team came to Italy to meet with me. Then I said yes, O.K. I wanted to meet with them first, because in my profession, it is always important to have good human relations. If you don't, you can't achieve anything.
So what sold you on the project?
I designed the Tizio almost 30 years ago and hadn't designed another desk lamp since then. At the time, the Tizio was revolutionary. It was the first halogen desk lamp. So I thought it might be time to use another new technology to create a lamp that doesn't look like any lamp before.
So the challenge appealed. What were the toughest design problems?
The biggest challenge was heat. Heat ruins LEDs, so you need to find a way of cooling them. I wanted to create a small, very light head, but that would leave no space for traditional cooling. So I thought we might borrow a technique from the computer industry -- the technology used in laptops to cool the chips. The heat is carried by a pipe to a series of thin aluminum fins, which are cooled by a fan. So we managed to do this. We also borrowed another bit of notebook technology: We used the hinges of the screen unit for the arm joints.
For a designer, is there greater value in designing many different kinds of things than in becoming an expert in designing X, despite the learning curve involved in every project?
I'm an impatient person, so once I have designed the same object three or four times, I want to try something else. This is a very good way of acquiring a vast amount of experience. I have designed cars, watches, clocks, kettles. Each new experience naturally is a challenge, but it always gives me the opportunity to draw on solutions that I have used in another kind of product.
A few years ago I designed a folding city bike whose folding mechanism was inspired by aircraft landing gear. If I had designed lamps all of my life, I wouldn't have had the possibility of doing something like that.
You recently created a cheese grater for Alessi. Where did you get the idea for that project?
In my kitchen! In my house, I have to grate the cheese for the spaghetti. And it always takes so long because the cheese graters are so small. With my grater, one downward stroke creates enough cheese for a single serving, so now my job just takes a few seconds.
What should companies do to be more creative?
I found at IBM that, to build a creative atmosphere, you have to have respect and faith. For instance, when I first visited the design center in Japan, there were 10 or 15 designers. In Japan they are very afraid of doing something that can be criticized. They were great designers, but no one came up with any revolutionary ideas. Everyone followed the rules. They were distrustful of a foreign design consultant coming in. It took me five years to earn their faith so that they would show me their ideas.
But ideas are only part of it. Thirty percent of success is having the idea; 70% is working with other people to make it a living product. So again--without good human relations, your idea is completely worthless.
Is there anything you haven't had a chance to design yet that you would like to?
I've always desired to design one of these huge agricultural harvesters that roam the fields and pick up the stuff. They are extremely exciting as a machine, like giant creatures, but they are designed without any thought to how beautiful they could be.
Not All Innovation Is Equal
Technical innovation will earn you lots of adoring fans (think Apple). Business-model innovation will earn you lots of money (think Dell).
Innovate for Cash, Not Cachet
If your cool new thing doesn't generate enough money to cover costs and make a profit, it isn't innovation. It's art.
Don't Hoard Your Goodies
Getting to market on time and at the right price is vital. If that means licensing your idea to an outside manufacturer or marketer, do it.
Innovation Doesn't Generate Growth. Management Does
If you covet awards for creativity, go to Hollywood. Managers get rewarded for results, which come from customers.
Attention Deficit Has No Place Here
Every innovation worth doing deserves your commitment. Don't leap from one new thing to another. If your creation doesn't appear important to you, it won't be important to anyone else.
If He's So Smart...Steve Jobs, Apple, and the Limits of Innovation
The battle over digital music is just another verse in Apple's sad song: This astonishingly imaginative company keeps getting muscled out of markets it creates. So what does Apple have to tell us about innovation?
From: Fast Company, Issue 78 | January 2004 | Page 68 | By Carleen Hawn
--------------------------------------------------------------------------------
Everyone knows Parisians are snobs. So it probably shouldn't have come as a surprise that an unshaven, middle-aged American, speaking English and dressed in cuffed jeans, sneakers, and a worn black T-shirt, was rudely turned away from the bar at a lavish fete inside Paris's Musee d'Orsay on September 16, 2003.
Except that the man was Steven P. Jobs, the cofounder and chief executive of Apple Computer Inc., and it was his party. And some bash it was. For three hours, Apple's guests grazed on foie gras and seared tuna canapes, and sipped champagne while strolling under a massive glass arcade that shelters one of the world's largest collections of Impressionist masters, Rodin sculpture, and art nouveau furniture. In a Baroque salon at the far end of the museum, a raucous jazz band played. As one guest observing the scene intoned, "This is huge."
Not huge enough, it seems, to make room for Jobs. But if the boss was peeved at getting the bum's rush, he didn't show it. Together with his entourage of suited computer executives, Jobs retreated quietly to a bar on a lower level, and the party celebrating the 20th anniversary of Apple's European trade show, Apple Expo, proceeded without further incident. Maybe "Bad Steve" has mellowed at the age of 48.
Then again, maybe Jobs has just gotten used to being tossed out of his own parties. You could say that the personal-computer industry itself began as an Apple wingding when the Cupertino, California-based company introduced the Apple II in 1977. Ever since, Apple has played the role of generous host, spicing up the festivities with one tasty offering after another. Following the PC, Apple served up many of the features that computer users have since come to take for granted, including the graphical user interface, the mouse, the laser printer, and the color monitor. Yet Apple has been forced to watch the celebrations from out in the alley, its nose pressed longingly to the window as others feast: Today, more than a quarter-century after its founding, it commands just 2% of the $180 billion worldwide market for PCs. Almost everyone agrees that Apple's products are not only trailblazers but also easier to use, often more powerful, and always more elegant than those of its rivals. Yet those rivals have followed its creative leads and snatched for themselves the profits and scale that continually elude Apple's grasp.
All of which raises some interesting questions. If Apple is really the brains of the industry--if its products are so much better than Microsoft's or Dell's or IBM's or Hewlett-Packard's--then why is the company so damned small? (Consider that in the last 10 years alone, Apple has been issued 1,300 patents, almost one-and-a-half times as many as Dell and half as many as Microsoft--which earns 145 times as much money.)
TRUTH IS, SOME OF THE MOST INNOVATIVE INSTITUTIONS IN THE HISTORY OF AMERICAN BUSINESS HAVE BEEN COLOSSAL FAILURES.
The Creativity Conundrum
Conventional wisdom has long answered that Apple is the victim of a single, huge strategic error: the decision in the 1970s not to license its operating system. But that was long ago and far away. Apple has since had many opportunities to reverse its infamous decision, but it hasn't done so. And Apple's creativity has produced plenty of other opportunities to compensate for the initial misstep. The company could, for example, have exploited an early beachhead in the $12 billion education market for PCs (it once dominated that market but now can claim only 10% of it), to push its way back into homes. But it failed to develop the aggressive sales force to do so. Apple has missed chances to own new markets, too. It introduced the world to pen-based computing with its Newton mobile device in 1993. Newton had its problems--it was clunky, hard to use, and probably ahead of its time. But it still seems baffling that Apple failed to capture a meaningful stake in the $3.3 billion market for personal digital assistants (PDAs), a business that by some measures is now growing faster than either mobile phones or PCs.
That Apple has been frozen out time and again suggests that its problems go far beyond individual strategic missteps. Jobs may have unwittingly put his finger on what's wrong during his keynote speech earlier that day in Paris. "Innovate," he bellowed from the stage. "That's what we do." He's right--and that's the trouble. For most of its existence, Apple has devoted itself single-mindedly, religiously, to innovation.
But wait. What can possibly be wrong with that? After all, we worship innovation as an absolute corporate good, along with such things as teamwork and leadership. Even more than these virtues, it has come to be seen as synonymous with growth. Political economists have assigned tremendous significance to it since at least the mid-20th century. Innovation is at the heart of Joseph Schumpeter's idea of creative destruction, for example: the process of "industrial mutation" that keeps markets healthy and progressive. Management theorists embraced the notion in the intervening decades, and a stream of academic papers and books promoting innovation as the critical element of business success issued forth from the likes of Peters and Drucker, Foster and Christensen. Innovate or die, we were told. It's the core of excellence and the root of entrepreneurship. It's the attacker's advantage, the new imperative, the explosion, the dilemma and the solution. (You can play this game at home, too, with any of the 49,529 titles that come up for "innovation" on Amazon.) And yet it's hard to look at Apple without wondering if innovation is really all it's cracked up to be.
Nor is Apple's the only case that should give us pause. Truth is, some of the most innovative institutions in the history of American business have been colossal failures. Xerox Corp.'s famed Palo Alto Research Center (Xerox PARC) gave the world laser printing, ethernet, and even the beginnings of the graphical user interface--later developed by Apple--yet is notorious for never having made any money at all. Polaroid, which introduced us to instant images decades before digital photography, collapsed under mismanagement and filed for Chapter 11 bankruptcy protection in October 2001. The Internet boom of the late 1990s, of course, now stands revealed as a sinkhole of economically worthless innovation. ("I know: Let's offer online ordering and free delivery of $1.49 bags of Cheez Doodles!") And Enron was arguably the most innovative financial company ever. So it turns out that not all innovation is equal. Not all of it is even good.
But the paradox of Apple is in many ways more disturbing because its innovations haven't been precommercial, like Xerox PARC's; they haven't been superseded, like Polaroid's; they haven't been frivolous, like those of the dotcom bubble; and they haven't been destructive, like Enron's. They've been powerful, successful, useful, cool. Since its earliest days, Apple has been hands-down the most innovative company in its industry--and easily one of the most innovative in all of corporate America.
Jobs was justly proud as he regaled his audience of 3,700 at the Palais des Congres in September. He prowled the stage for two hours, exulting in the details of Apple's numerous 2003 product launches. Chief among them were the new G5 desktop, the first 64-bit computer and the industry's fastest ever; a new operating system called Panther; a 15-inch laptop that comes with an ambient-lit keyboard for working in the dark; and Apple's first wireless mouse. Even by Apple standards, it was a banner year for snazzy new gear.
And as if that weren't enough, Jobs then reminded the crowd of the year's most important product debut, Apple's digital-music store known as iTunes. When it was launched in late April, iTunes became the first legal, pay-as-you-go method for downloading individual tracks of recorded music. Music fans and the recording industry alike loved it, and by the end of the year, more than 20 million songs had been purchased and downloaded off Apple's site. Soon the trade press was touting iTunes as "revolutionary," "groundbreaking," and a "paradigm shift" for the market. Time magazine recently hailed it as the "Coolest Invention of 2003."
But even in that banner year, Apple's creative energy hasn't amounted to very much in financial terms. For its fiscal year ending September 27, 2003, Apple reported just $6.2 billion in revenues, three-quarters of it from the sale of personal computers. The father of the PC--and, remember, the industry's number-one vendor in 1980--has since sunk to a lowly ninth, behind competitors Dell, Hewlett-Packard, and IBM, just for starters. Sadly, Apple is also behind such no-namers as Acer (seventh) and Legend (eighth). So much for innovation and creativity. These clone-makers, based respectively in Taiwan and China, exist solely to churn out gray boxes at the lowest possible cost. It may very well be that, without its relentless innovation, Apple would have simply ceased to exist long ago, going the way of Commodore and Kaypro in this unforgivingly Darwinian industry. But all its creativity certainly hasn't put it at the top of the food chain.
Where Apple was once one of the most profitable companies in the category, its operating profit margins have declined precipitously from 20% in 1981 to a meager 0.4% today, just one-fifth the industry average of 2%. And it isn't just the hardware manufacturers that are devouring Apple. Its chief competitor in software, Microsoft, earned $2.6 billion in its most recent fiscal quarter (ending September 30). That's nearly 15 times the $177 million in software sold by Apple in its most recent fiscal quarter and roughly equal to the profits that Apple has earned from all of its businesses over the past 14 years. In just three months.
With such examples as Apple in mind, a number of skeptics are beginning to ask whether our heedless reverence for innovation is blinding us to its limits, misuse, and risks. It's possible, they say, to innovate pointlessly, to choose the wrong model for innovation, and to pursue innovation at the expense of other virtues that are at least as important to lasting business success, such as consistency and follow-through. When it comes to economic value, Schumpeter's creative destruction may have an evil twin: destructive creation.
James Andrew, of the Boston Consulting Group, for example, argues that too many companies presume that they can boost profits merely by fostering creativity. "To be a truly innovative company is not just coming up with great new ideas, or products and services," he says. "It is coming up with ones that generate enough cash to cover your costs and reward your shareholders."
Andrew says companies can boost the odds of their success by choosing the most appropriate of three innovation models. The first and most traditional is the integrator model, in which a company assumes responsibility for the entire innovation process from start to finish, including the design, manufacture, and sale of a new technology. In general, large, well-heeled companies--Intel, for example--do best with this model. Second is the orchestrator approach, in which functions such as design are kept in-house, while others, including manufacturing or marketing, are handed off to a strategic partner. This model works best when speed is of the essence, or if a company wants to limit its investment. When Porsche couldn't meet demand for its popular Boxster sports coupe in 1997, for example, it turned to Finnish manufacturer Valmet rather than open another costly plant. Finally, Andrew says, there's the licensor approach, in which, for example, a software company licenses a new operating system to a series of PC manufacturers to ensure that its product gets the widest distribution at the lowest possible investment cost. That's you, Microsoft.
From the beginning, Apple appears to have employed the integrator approach--the model with both the highest costs and highest risks. On the one hand, it was the least appropriate choice for a startup with scant financial resources and a nonexistent customer base. But it was probably the inevitable choice for Apple's innovation-venerating culture, which demanded something akin to absolute artistic control. Jobs declined to comment for this story, but he has expressed an almost mystical reverence for the power of innovation over the years. In 1995 remarks to the Smithsonian Institution, for example, he compared innovation to "fashioning collective works of art" and said it afforded "the opportunity to amplify your values" over the rest of society.
The ambition to build the "perfect machine" drove Jobs and his cofounders, A.C. "Mike" Markkula and Steve Wozniak, to strive to build everything, from hardware to software, in-house regardless of cost. Even in those early days, peers like Microsoft were moving to specialize in one dimension of computing or another. (Apple now farms out much of its manufacturing, but won't say how much.)
This pursuit of perfection also led Apple's founders to opt for a closed operating environment on the early Macintosh computers. A closed computing environment is easier to control than an open one. Applications can be written to integrate with one another seamlessly, making the system less buggy. A better user experience!
"There was a lot of elitism at the company," says engineer and Apple alum Daniel Kottke. A Reed College classmate of Jobs who later traveled with him to India, Kottke became Apple's first paid employee in 1976. "Steve definitely cultivated this idea that everyone else in the industry were bozos. But the goal of keeping the system closed had to do with ending the chaos that had existed on the earlier machines." Kottke left Apple in 1984, a year before Jobs himself was forced out.
Apple's purist approach may well have made certain early innovations possible--networking, for example, which it introduced on the first Mac machines in 1984. Windows PCs didn't have printer networking until the mid-1990s. But time and again, Apple's obsession with controlling the entire process of innovation has also demonstrated the truth of Voltaire's dictum that the perfect is the enemy of the good. Today, the company has just 300,000 independent and in-house developers writing programs and making products for its operating systems, including the latest, OS X. More than 7 million developers build applications for the Windows platform worldwide.
APPLE'S DEMAND TO CONTROL THE ENTIRE PROCESS OF INNOVATION SHOWS HOW THE PERFECT CAN BE THE ENEMY OF THE GOOD.
Fewer developers mean fewer new products to run on Apple machines. That means fewer options for end users, which influences purchasing decisions, and therefore sales and profits. One example of a popular product not easily available for the Mac is the personal video recorder, or PVR. That's the TiVo-like device that lets users pause, rewind, and record live television programs on their PCs. Only two developers offer a PVR for the Mac: Elgato Systems' EyeTV and Formac's Studio TVR, retailing at $199 and $299 respectively. At least six software or consumer electronics vendors produce Windows-compatible PVRs, and Microsoft itself gives away PVR capability as a standard feature on Windows XP Media Center.
Apple has consistently rejected opportunities to adjust its innovation strategy to another model. Licensing its operating system to hardware manufacturers would have been an obvious choice. Yet when Jobs returned to Apple in 1997, he terminated the first and last licensing program, championed by former chief executive Gilbert Amelio. Jobs is reported to have told Apple managers that he feared "Mac knockoffs" would dilute the Apple brand.
Spurning The "Gee Job"
At the heart of Apple's innovation conundrum also lies a powerful cultural bias: the lionization of purely technical innovation. Ours is a material society. So it's natural that when we think of innovation, we are more inclined to think of objects, things that we can see, touch, and feel, and of inventors such as the Wright brothers and Thomas Edison. It turns out, though, that the most economically valuable forms of innovation often aren't the tangible kind. Instead, they are forms of innovation that we might belittle as less heroic, less glamorous: the innovation of business models. Don't think innovator-as-hero; think innovator-as-bureaucrat. Even Edison--who held 1,093 patents (more than anyone else in U.S. history) and who invented such doodads as electric light, the phonograph, and the motion picture--fared pretty badly when it came to choosing business models. He waged and lost one of the world's first technology-format fights, between alternating and direct currents. And he abandoned the recording business after, among other things, insisting that Edison disks be designed to work only on Edison phonographs. Sound familiar?
In virtually any industry, business-model innovators rather than technical innovators have reaped the greatest rewards in recent decades, argues Gary Hamel, the chairman of Strategos, an international consulting company that focuses on helping businesses innovate successfully. Hamel points to Amazon, eBay, and JetBlue. Each company either delivered goods and services differently (by bringing distribution of books or secondhand goods to the Web) or more cheaply (by becoming a sort of Wal-Mart of the skies). Dell has done both. "Dell hasn't done anything to make PCs more attractive, more powerful, or easier to use. To the extent that there is innovation there, it has come from other companies," like Apple, Hamel says. "All of Dell's contributions have been in providing [other companies' technical] innovations to a wider audience at lower cost."
In some cases, innovation that we might think of as technical is actually business-model based. Henry Ford, for example, didn't invent the automobile--but he did develop the production process that drove costs down and enabled him to pay his assembly workers enough that they could afford cars of their own. "You can be tremendous at innovation on the technical side," Hamel says. "But if you can't wrap that innovation into a compelling value proposition, with a dynamic distribution strategy and attractive price points, then the innovation isn't worth much at all."
And it turns out that such value-driven business-model innovation is precisely the sort of thing that Apple is lousy at. Even back in 1989, for example, when the company still commanded a healthy 10% of the global PC market, some internal developers worried that the company couldn't stay competitive without expanding its customer base. And that, they felt, meant bringing down the cost of the Mac, which made its debut in 1984 at $2,500. That's more than $4,300 in today's dollars, which is why the Mac was first marketed to high earners and early adopters of technology. A group of those developers launched an unsanctioned project some called a "gee job," as in "Oh, gee, I'll do that in my spare time," to design a lower-cost Mac for schools.
Moonlighting for about a year, the team found ways to take costs out of the Mac, such as cheapening the floppy drive and using a less expensive, smaller power supply. In the end, they produced a fully functional Mac with a parts cost of about $340. Even with the typical 60%-plus gross margin on Macs at the time, the computer could have retailed for $1,000--far less than the standard Mac. But when the team presented the Mac LC (for low cost) to management, the marketing department nixed it.
"They said things about the computer weren't Mac-like enough, that it made the machine feel cheap," says Owen Rubin, a former Mac software developer who was on the team. Apparently, one sticking point was the floppy drive, which didn't inhale disks the way the original Mac did. Such subtle conventions cost money. Rubin and his team were sent back to the drawing board. The Mac LC hit the market in 1990, at $2,400. Adjusted for inflation, that's more than $3,300 today, meaning that the Mac LC really wasn't low cost after all.
AS BIG RIVALS SWARM, iPOD AND iTUNES MAY HAVE STARTED ONE MORE PARTY THAT APPLE WILL END UP GETTING TOSSED OUT OF.
A Digital-Music Donnybrook?
There's one last essential element to successful innovation that has often been missing at Apple: follow-through. As Howard Anderson, founder of both the consulting firm Yankee Group and the Boston-based venture capital firm Battery Ventures, puts it, "Innovation isn't the key to economic growth. Management is the key to economic growth." In practice, that means supporting product innovation with such things as a solid sales force, a strategy for collaborating with developers and makers of complementary products, and a strategy for customer service. "Companies that rely too heavily on creativity flame out," Anderson says. "In many ways, execution is more important. Apple is innovative, but Dell executes."
Apple's dismissal of such mundane pursuits is another paradoxical by-product of its restless, driven culture of creativity. Things such as sales and service are gritty, not cool; plodding, not imaginative; boring, not sexy. Standing in a darkened hallway just outside the jazz-filled salon of the Musee d'Orsay, technology consultant and Apple fan Anthony Knowles puts his finger on it. "By the time their products hit the market," he says, "they're on to the next thing."
The current focus of Apple's marketing efforts is clear to anyone walking the streets of Paris (or driving up Highway 101 in San Francisco, for that matter). Brightly colored silhouettes of hipsters dancing to their iPods are plastered on bus stops and billboards and flapping against the sides of buildings.
It makes sense that Apple would make so much fuss over the gadget. Since it was first introduced in October 2001, Apple has sold more than 1.5 million iPods, or about 300,000 per quarter today. This means that in two years, Apple has achieved roughly the run rate for the iPod that it took 25 years to achieve with its home PCs.
No one knows the cost to Apple to manufacture and market the iPod, and estimates of its operating margin range widely: 2.5% to 18%. But even at iPod's lowest list price of $299--and using a conservative margin estimate of 8%--it's clear that the iPod contributed substantially all of Apple's 2003 estimated operating income of $24.8 million, excluding onetime charges. Without the iPod, Apple is in trouble.
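For the curious, that back-of-the-envelope claim checks out. Here is a rough sketch of the arithmetic in Python, using only the estimates quoted above (the unit run rate, the lowest list price, and the conservative 8% margin); none of these are official Apple figures:

# Rough check of the iPod margin claim, using the article's own estimates.
units_per_quarter = 300_000   # approximate iPod run rate cited above
lowest_list_price = 299       # USD, cheapest iPod model
operating_margin = 0.08       # conservative pick from the 2.5%-18% range

annual_revenue = units_per_quarter * 4 * lowest_list_price
annual_operating_income = annual_revenue * operating_margin

print(f"Estimated annual iPod revenue: ${annual_revenue / 1e6:.1f}M")                    # ~$358.8M
print(f"Estimated annual iPod operating income: ${annual_operating_income / 1e6:.1f}M")  # ~$28.7M
# ~$28.7M is on the order of Apple's entire estimated FY2003 operating
# income of $24.8M (excluding one-time charges) -- hence "substantially all."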
That's why recent releases of competing portable music players take on great significance. Selling for as little as $299, the Dell DJ is about $100 cheaper than the iPod with the same 5,000 song capacity. (A $500 iPod holds 10,000 songs). A third product, a 20-GB unit made by Samsung to work with Napster 2.0, costs $100 less than the 20-GB iPod, or about $300, and boasts a lot more features, including a built-in FM transmitter--to play songs on a car radio--and a voice recorder.
In terms of its innovative legacy, the iPod and iTunes together probably represent Apple's greatest achievement since the introduction of the Apple II in 1977. First, because they mark an important evolution inside Apple as it moves further away from its roots as a PC company and closer to a new role as a consumer-electronics and entertainment shop. Promoting the Mac as the "hub of a digital lifestyle" certainly indicates recognition that Apple may do better to cut its losses in the PC business. In this arena, Apple may benefit from its consumer focus, artful design, and strong brand equity.
iTunes also deserves recognition as Apple's first foray into business-model innovation. It is, after all, nothing but a novel distribution and pricing arrangement. Apple's ability to get users to pay for songs, rather than steal them, also convinced the recording industry that digital-music delivery was worth supporting. Without this leadership, Roxio Inc.'s Napster 2.0 and Dell/Musicmatch might never have negotiated their own digital-rights agreements.
Still, Apple may have learned these important lessons only partially, and too late. The iPod works only with the iTunes service, and has a $0.99 fee-per-song pricing structure. Dell/Musicmatch and Napster offer consumers more choice. Their Windows-based players and services are interchangeable; they sell individual songs and let users listen to (but not keep) as much music as they want for flat fees of less than $10 per month. Meanwhile, the $15 million or so that iTunes has generated in revenue thus far is statistically meaningless even for Apple. And after it has paid the music labels and covered its costs, Apple is left with just pennies per song. Even using a generous operating margin estimate, iTunes won't turn a meaningful profit until it hits Jobs's stated goal of 100 million songs sold. Jobs has said he hopes to do so by April, but at the current rate of 1.5 million songs sold per week, that is more than a year away.
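As a quick sanity check on that last estimate, here is the same arithmetic in Python; the 20-million starting point is an assumption taken from the cumulative figure quoted earlier in the article, not a number the article states at this point:

# How far away is the 100-million-song goal at the quoted sales pace?
goal = 100_000_000
sold_to_date = 20_000_000   # "more than 20 million songs" by end of 2003
weekly_rate = 1_500_000     # current rate cited above

weeks_left = (goal - sold_to_date) / weekly_rate
print(f"Weeks to 100M songs at 1.5M/week: {weeks_left:.0f}")  # ~53 weeks
# Just over a year at the current pace -- well past the hoped-for April milestone.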
And the competition is swarming. Dell and Samsung are challenging enough, but this business is about to turn into a battle of the titans. Wal-Mart is launching a cut-price online music store of its own--and now Microsoft and Sony, no less, are joining the fray. So Apple's venture into online music is beginning to look like yet another case of frustration-by-innovation. Once again, Apple has pioneered a market--created a whole new business, even--with a cool, visionary product. And once again, it has drawn copycats with the scale and financial heft to undersell and out-market it. In the end, digital music could turn out to be just one more party that Apple started, but ultimately gets tossed out of.
Sidebar: Tuscan Stone?
Early on a sunny Saturday morning in July, a line of eager people stretches down the block and around the corner. Sleeping bags, camping chairs, and food are scattered evidence of a long night on the sidewalk. These eager fans--groupies, really--aren't faithful followers of the latest cult rock band: They're here for the opening of the newest Apple store, in the sleepy Bay Area town of Burlingame, California.
When the doors finally open at 10 a.m. to the rousing strains of U2's "Beautiful Day," a double line of 35 Apple Store employees clap to the music and high-five the cheering sidewalk sleepers as they pour into the store. Ron Johnson, Apple's vice president of retail, enters a few minutes later with his two children and is greeted like a rock star. It's quite the reception for an opening that's hardly novel: Burlingame marks the 63rd addition to Apple's retail chain--and the fifth in an already crowded Bay Area market. But these folks don't seem to notice.
The store is done in iPod shades of white. "We chose hand-selected Tuscan stone for the floors--a stone that's somewhere between sandstone and limestone," Johnson says. "It's the same stuff Florence was built on." Each store boasts a Genius Bar, where customers can get technical support from Mac experts. Apple products are laid out on broad tables that are grouped by category: photo, music, movies. They're configured with speakers, iPods, and other peripherals so that users can see how an ideal "digital media hub" works. "The real reason we're here is to drive market share, so we devote most of the space in the front of the store to our products and the experience of them," Johnson says. "I'd love to see Apple get back to 15% market share someday."
Well, sure. But two years since the launch of Apple Stores, it's still unclear whether the strategy has moved the needle at all. Apple claims that 50% of all its retail-store buyers are new to Macs (some buying their first computer, others switching from Windows machines), but analysts such as Roger Kay, from the technology market research firm IDC, dismiss any notion of progress. "They're losing as many to Windows as they're gaining."
The real problem, according to Kay, is the enormous investment that Apple must make to open each retail location. Apple is holding leases on some of the most expensive real estate in the country, in places such as tony Michigan Avenue in Chicago and New York's trendy SoHo. And then there are those Tuscan stone floors. "Apple is creating a boutique environment, and they're doing it in a very expensive way," says Kay. "It doesn't seem very reliable as an approach for selling large quantities of goods."
Johnson argues that it's far too soon to claim success or failure on the issue of overall market-share growth--but that the retail stores are a very definite success. "Wal-Mart has taken 40 years to get where it is today," he says. "We've been open for just over two years. As we continue to add more and more stores, you'll see. We will move the needle." -Alison Overholt