As we commemorate the history of Apple Computer, books, articles and blog posts often prioritize the mercurial genius of Steve Jobs and the engineering brilliance of Steve Wozniak. However, any serious technical historian knows that without the “adult in the room,” Apple might have remained a footnote in a 1970s hobbyist magazine. That adult was A.C. “Mike” Markkula Jr. While the two Steves provided the spark and the fuel, Markkula provided the engine, the steering, and—most importantly—the map.
The Millionaire Who Came Out of Retirement
In 1976, Mike Markkula was already a success story. Having made a fortune from stock options at Fairchild Semiconductor and Intel, he had retired at the age of 32. His plan was simple: spend his days tinkering and consulting. That changed when a mutual contact (the venture capitalist Don Valentine) pointed a young, scruffy Steve Jobs in his direction.
Unlike the traditional bankers who saw two college dropouts in a garage, Markkula saw a nascent industry. He didn’t just give Apple advice; he brought a $250,000 investment ($92,000 as equity and $157,000 as a loan). In exchange, he became a one-third owner of the company. It was the moment Apple transitioned from a partnership of friends into a legitimate corporation.
The “Apple Marketing Philosophy”: A Legacy of Three Words
Markkula’s most enduring contribution wasn’t just the capital—it was the culture. He authored a one-page document titled “The Apple Marketing Philosophy,” which outlined three core principles that still dictate Apple’s DNA today:
Empathy: A deep, almost obsessive understanding of the customer’s feelings and needs.
Focus: To do a few things well, one must eliminate all unimportant opportunities.
Impute: The belief that people judge a book by its cover. This led to Apple’s legendary obsession with high-quality packaging and industrial design. Markkula understood that if a computer was presented in a sloppy way, the user would perceive the technology inside as sloppy.
The Professional CEO and the “Steves”
Markkula served as Apple’s second CEO from 1981 to 1983, bridging the gap between the chaotic early days of Mike Scott and the corporate era of John Sculley. He was a mentor to Jobs, teaching him the nuances of business, and a protector of Wozniak, ensuring the engineering side had the resources it needed.
However, his tenure was not without friction. Markkula was a proponent of the Apple III and the Lisa, projects that struggled commercially. Perhaps his most controversial historical moment was his support for the board’s decision to side with John Sculley in 1985, leading to Steve Jobs’ famous departure. Markkula believed in the institution over the individual—a pragmatic view that was necessary for Apple to survive its turbulent adolescence.
The End of an Era
Markkula remained on Apple’s board for two decades, serving until Steve Jobs returned in 1997. As Jobs “cleaned house” to start the “Think Different” era, the old guard—including Markkula—stepped aside.
While he rarely seeks the spotlight today, his influence is visible in every minimalist Apple Store and every “designed in California” box. He was the venture capitalist who actually understood the product, the marketer who understood the human psyche, and the strategist who knew that for a computer to change the world, it first had to look like it belonged in the world.
Historical Assessment: The Essential Third Pillar
In the history of technology, we often romanticize the lone inventor. But the Apple I and II would likely have been overtaken by Commodore or IBM if Markkula hadn’t professionalized the operation. He transformed Apple into a “real” company that could secure credit lines, manage supply chains, and build a brand. If Jobs was the heart and Woz was the brain, Mike Markkula was the backbone of Apple.
| Year | Milestone | Impact on Apple |
| --- | --- | --- |
| 1976 | The Meeting | Retired at 32, Markkula is introduced to Jobs and Wozniak. He sees the potential they couldn’t yet articulate. |
| 1977 | The Investment | He invests $250,000, writes the business plan, and officially incorporates Apple Computer Co. |
| 1977 | The Manifesto | Writes “The Apple Marketing Philosophy,” establishing the brand’s psychological foundation. |
| 1981 | The CEO Seat | Takes over as CEO following Mike Scott’s departure, guiding the company through its IPO and early growth. |
| 1983 | The Sculley Era | Steps down as CEO to make room for John Sculley but remains a powerful Chairman of the Board. |
| 1985 | The Schism | Sides with John Sculley during the power struggle with Steve Jobs, leading to Jobs’ 12-year exile. |
| 1997 | The Departure | Resigns from the Board of Directors as Steve Jobs returns, marking the end of the “original” Apple leadership. |
Source: Historical Records of Apple Computer Leadership (1976-1997).
It’s a story of corporate courtship, a brief period of “magical” synergy, and a cold-blooded boardroom coup that fundamentally reshaped the trajectory of personal computing. At the heart of this drama stood two men: the visionary co-founder Steve Jobs and the seasoned marketing executive John Sculley.
The Seduction: From Sugar Water to Silicon
By 1983, Apple was no longer a hobbyist’s garage project; it was a public company facing fierce competition from IBM. Steve Jobs, while brilliant, was perceived by the board as too young and volatile to lead a multinational corporation. They wanted “adult supervision.”
Jobs set his sights on John Sculley, the then-president of PepsiCo. Sculley was a marketing prodigy responsible for the “Pepsi Challenge.” The recruitment process was a months-long pursuit that culminated in one of the most famous lines in business history. When Sculley initially hesitated, Jobs challenged him:
“Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” Steve Jobs challenges John Sculley
Jobs wanted Sculley to share his excitement about the Macintosh and showed him a prototype. “This product means more to me than anything I’ve done,” Jobs said. “I want you to be the first person outside of Apple to see it.” He dramatically pulled the prototype out of a vinyl bag and gave a demonstration. Sculley found Jobs as memorable as his machine. “He seemed more a showman than a businessman. Every move seemed calculated, as if it was rehearsed, to create an occasion of the moment.”
Jobs had asked software developer Andy Hertzfeld and the gang to prepare a special screen display for Sculley’s amusement. “He’s really smart,” Jobs said. “You wouldn’t believe how smart he is.” The explanation that Sculley might buy a lot of Macintoshes for Pepsi “sounded a little bit fishy to me,” Hertzfeld recalled, but he and Susan Kare created a screen of Pepsi caps and cans that danced around with the Apple logo. Hertzfeld was so excited he began waving his arms around during the demo, but Sculley seemed underwhelmed. “He asked a few questions, but he didn’t seem all that interested,” Hertzfeld recalled. Hertzfeld never ended up warming to Sculley. “He was incredibly phony, a complete poseur,” he later said. “He pretended to be interested in technology, but he wasn’t. He was a marketing guy, and that is what marketing guys are: paid poseurs.”
Jobs brushed aside his team’s concerns and went ahead with hiring the marketing specialist: Sculley joined Apple as CEO in April 1983, bringing the corporate discipline and marketing prowess Apple’s board craved.
John Sculley and Steve Jobs: The “Dynamic Duo” Phase
Apple’s Dynamic Duo
Initially, the partnership was remarkably harmonious. The media dubbed them the “Dynamic Duo.” Jobs was the product visionary, dreaming up the Macintosh; Sculley was the operational expert who knew how to scale a brand.
The Bond: They were nearly inseparable, often finishing each other’s sentences in interviews. Sculley gave Jobs the professional validation he sought, and Jobs gave Sculley a sense of higher purpose beyond consumer goods.
During this honeymoon period, they successfully launched the Macintosh in 1984, backed by the iconic Super Bowl commercial. It seemed, for a moment, that the marriage of counter-culture innovation and Madison Avenue marketing was invincible.
The Apple Macintosh was not a success from the outset. The hardware left little headroom for the demands of a graphical user interface: main memory, in particular, had been specified very tightly. Moreover, no hard disk was available for the Mac at the time.
“The original 128K Mac had too many problems to list,” wrote Jack Schofield of the Guardian in an article marking the 20th anniversary of the Apple Macintosh. “It had too little software, you couldn’t expand it (no hard drive, no SCSI port, no ADB port, no expansion slots), it was horribly underpowered and absurdly overpriced. The way MacWrite and MacPaint worked together was brilliant, but producing anything more than a short essay was a huge struggle. Just copying a floppy was a nightmare.” In addition, there was a lack of appropriate business software.
The Mac lacked the kind of applications that, in IBM’s advertising spot for the PC, the Charlie Chaplin figure dragged across the screen box by box. Guy Kawasaki and Apple’s other “software evangelists” therefore worked to convince developers at other software companies to write programs for the Mac. The Mac’s main memory, specified far too tightly at 128 kilobytes, did not make this a simple task. Only when the “Fat Mac” with 512 kilobytes launched a year after the first Macintosh was this bottleneck removed.
The problem came to a head in early 1985, when the Macs that had not found buyers during the 1984 Christmas season piled up in storage. Apple had to report the first quarterly loss in the company’s history and lay off a fifth of its staff. During a marathon meeting on April 10 and 11, 1985, CEO John Sculley demanded that Steve Jobs be relieved of his position as an Apple vice president and general manager of the Macintosh division.
According to Sculley’s wishes, Steve Jobs was to represent the company externally as a new Apple chairman without influencing the core business. As Jobs got wind of these plans to deprive him of his power, he tried to arrange a coup against Sculley on the Apple board. Sculley told the board: “I’m asking Steve to step down and you can back me on it and then I take responsibility for running the company, or we can do nothing and you’re going to find yourselves a new CEO.” The majority of the board backed the ex-Pepsi man and turned away from Steve Jobs.
On May 31, 1985, Jobs lost his responsibilities and was shuffled off to the chairman position. In September, the Apple co-founder left the company with a few people in order to found NeXT Computer. “I feel like somebody just punched me in the stomach and knocked all my wind out. I’m only 30 years old and I want to have a chance to continue creating things. I know I’ve got at least one more great computer in me. And Apple is not going to give me a chance to do that,” Jobs wrote to Mike Markkula on parting. Ten years later, Steve Jobs also commented on his disempowerment with bitterness in the TV documentary “Triumph of the Nerds” (1996):
Excerpt from the TV documentary “Triumph of the Nerds” with Robert Cringely
Jobs: What can I say? I hired the wrong guy.
Cringely: That was Sculley?
Jobs: Yeah, and he destroyed everything I spent ten years working for. Starting with me, but that wasn’t the saddest part. I would have gladly left Apple if Apple would have turned out like I wanted it to.
Apple’s Heart and Soul
Andy Hertzfeld, one of the Macintosh’s fathers, later recalled the events:
Andy Hertzfeld
The conflict came to a head at the April 10th board meeting. The board thought they could convince Steve to transition back to a product visionary role, but instead he went on the attack and lobbied for Sculley’s removal. After long wrenching discussions with both of them, and extending the meeting to the following day, the board decided in favor of John, instructing him to reorganize the Macintosh division, stripping Steve of all authority. Steve would remain the chairman of Apple, but for the time being, no operating role was defined for him. John didn’t want to implement the reorganization immediately, because he still thought that he could reconcile with Steve, and get him to buy into the changes, achieving a smooth transition with his blessing. But after a brief period of depressed cooperation, Steve started attacking John again, behind the scenes in a variety of ways. I won’t go into the details here, but eventually John had to remove Steve from his management role in the Macintosh division involuntarily. Apple announced Steve’s removal, along with the first quarterly loss in their history as well as significant layoffs, on Friday, May 31, 1985, Fridays being the traditional time for companies to announce bad news. It was surely one of the lowest points of Apple history.
Hertzfeld openly mourned Jobs’s departure: “Apple never recovered from losing Steve. Steve was the heart and soul and driving force. It would be quite a different place today. They lost their soul.” In contrast, Larry Tesler, who had come to Apple from Xerox, recalled mixed reactions among the Apple staff: “People in the company had very mixed feelings about it. Everyone had been terrorized by Steve Jobs at some point or another, and so there was a certain relief that the terrorist had gone. But on the other hand I think there was an incredible respect for Steve Jobs by the very same people, and we were all very worried – what would happen to this company without the visionary, without the founder, without the charisma…”
The Sculley Era: From Visionary Disruption to Corporate Precision
The departure of Steve Jobs in 1985 marked a fundamental shift in Apple’s DNA. While the “Jobsian” approach was rooted in creating “insanely great” products regardless of immediate market demand, John Sculley’s leadership transitioned the company toward a market-driven, high-margin business model.
Embracing the “Open” Mac
One of the most significant shifts was the reversal of Jobs’s “closed box” philosophy. Jobs famously insisted that the Macintosh should be a sealed appliance, devoid of expansion slots, to maintain total control over the user experience. Sculley, listening to corporate customers, greenlit the Macintosh II (1987). It featured color graphics and expansion slots. This move was a massive commercial success, as it allowed the Mac to compete with IBM in the high-end workstation market.
Screenshot of PageMaker 1.0 (French version)
The Desktop Publishing Revolution
Under Sculley, Apple stopped trying to sell the Mac as a general-purpose “appliance for the rest of us” and found its “killer app”: Desktop Publishing (DTP).
By pairing the Macintosh with the LaserWriter printer and Aldus PageMaker software, Sculley targeted a specific niche—graphic designers and publishers. This strategy saved the company, creating a loyal, high-paying user base that allowed Apple to charge premium prices during the late 80s.
Milking the Apple II “Cash Cow”
While Jobs had viewed the Apple II as an obsolete distraction, Sculley recognized it was the company’s financial lifeblood. He continued to iterate on the line (notably the Apple IIGS), using the profits from the aging platform to fund the expensive research and development of future Macintosh models. This pragmatism provided the stability Apple needed to survive the mid-80s tech slump.
The PowerBook Triumph
In 1991, Apple released the PowerBook, arguably the most successful product of the Sculley years. While Jobs’s earlier attempt at a portable (the 1989 Macintosh Portable) was a 16-pound failure, the PowerBook was a masterpiece of industrial design. It introduced the ergonomic layout we still see in laptops today—the keyboard pushed back to provide palm rests and a centered pointing device (the trackball). It captured 40% of the laptop market at its peak.
Strategy by Proliferation (The Downfall)
However, the post-Jobs era also saw the beginning of “product sprawl.” Without Jobs’s obsessive focus, the product line became bloated. Apple began releasing dozens of confusingly named models—Performa, Centris, Quadra—often with nearly identical specs. This led to:
Customer Confusion: It was impossible for a buyer to know which Mac was right for them.
Inventory Bloat: Managing so many different hardware configurations became a logistical nightmare.
Brand Dilution: Apple started to look like just another PC manufacturer, losing its “special” status.
The Newton: A Visionary Leap Too Soon
Sculley’s final major push was the Apple Newton (MessagePad). He coined the term “Personal Digital Assistant” (PDA) and envisioned a world of handheld computing. However, without Jobs’s perfectionism, the product was launched prematurely. The handwriting recognition—its core feature—was unreliable, turning the device into a punchline in popular culture.
The immediate post-Jobs era was characterized by professionalization. Sculley turned Apple into a highly profitable, organized, and respected corporate entity. However, the cost was the loss of a singular, coherent vision. By the early 1990s, Apple was a company that made great hardware but had lost its way. Even before Microsoft introduced Windows 95, Apple sales were under heavy pressure. In 1993 Sculley had to step down.
Searching for a Strategy of Survival
Michael Spindler, known as “The Diesel” for his work ethic, took the helm after Sculley. His tenure was marked by a desperate attempt to keep Apple relevant in a world increasingly dominated by Microsoft’s Windows. Spindler successfully oversaw the architectural shift from Motorola 68k processors to the PowerPC. While technically impressive, it was an expensive and exhausting transition for developers and users alike. But Spindler made one big mistake: in a move that Steve Jobs would later describe as “selling the soul of the company,” he licensed the Macintosh operating system to third-party manufacturers (cloners). The goal was to increase market share, but instead it cannibalized Apple’s own high-margin hardware sales. Spindler resigned as CEO of Apple on January 31, 1996. His departure followed a period of severe financial struggle for the company, including a massive quarterly loss and a failed attempt to sell Apple to Sun Microsystems.
The Amelio Era: 500 Days of Crisis
When Gil Amelio took over in 1996, Apple was hemorrhaging cash. The company was suffering from massive quarterly losses and a bloated product line that lacked any clear direction. Apple’s greatest asset, the Mac OS, was aging rapidly. It lacked modern features like protected memory and multitasking. Amelio realized that Apple could not build a new operating system in-house fast enough (the internal “Copland” project had failed).
Amelio began looking for an external operating system to buy. It is an irony of computer history that this search brought back the man who would save the struggling company. Apple’s 1997 buyout of NeXT returned Jobs to the company he had co-founded, and he served as its CEO from 1997 until 2011. Jobs not only saved Apple but revolutionized the world with the iMac, the iPod, the iPhone, and the iPad.
Steve Jobs and John Sculley: Not on good terms
In a June 2010 interview, Sculley credited Jobs for everything Apple has accomplished and still lamented the way things turned out. “I haven’t spoken to Steve in 20-odd years,” Sculley told The Daily Beast. “Even though he still doesn’t speak to me, and I expect he never will, I have tremendous admiration for him.”
Sculley said in the interview he accepts responsibility for his role but also believes that Apple’s board should have understood that Jobs needed to be in charge. “My sense is that it probably would never have broken down between Steve and me if we had figured out different roles,” Sculley said. “Maybe he should have been the CEO and I should have been the president. It should have been worked out ahead of time, and that’s one of those things you look to a really good board to do.”
Jonathan Paul “Jony” Ive is a legendary designer who was responsible for the design of many of Apple’s most iconic products, including the iMac, iPod, iPhone, and iPad. Jony Ive has played a critical role in the company’s product design strategy, helping to establish its reputation for sleek and innovative design.
Jony Ive was born on 27 February 1967 in Chingford, a town in east London in England. His father, Michael John Ive, was a silversmith, and his mother, Pamela Mary Ive, was a psychotherapist. Jony Ive attended Chingford Foundation School just outside London, later to be the alma mater of David Beckham. While in school, Ive was diagnosed with dyslexia, but it never seriously affected his education.
Jony Ive at his High School
Ive was curious about the inner workings of things throughout his childhood and was fascinated by how objects were put together. He would carefully dismantle radios and cassette recorders, exploring how they were assembled and how the pieces fit, though when he tried to put the equipment back together, he didn’t always succeed.
In a 2003 interview conducted at London’s Design Museum he said, “I remember always being interested in made objects. The fact they had been designed was not obvious or even interesting to me initially. As a kid, I remember taking apart whatever I could get my hands on. Later, this developed into more of an interest in how they were made, how they worked, their form and material.”
By the age of thirteen or fourteen he was pretty certain that he wanted to draw and make stuff. “I knew that I wanted to design but I had no idea what I’d design as I was interested in everything: cars, products, furniture, jewellery, boats. After visiting a few design consultancies I eventually decided that product design would be a pretty good foundation as it seemed the most general.”
He studied art and design at school and went on to Newcastle Polytechnic. “I figured out some basic stuff — that form and colour defines your perception of the nature of an object, whether or not it is intended to. I learnt the fundamentals of how you make things and I started to understand the historical and cultural context of an object’s design. I wish my drawing skills had improved, but while that bothered me then, it doesn’t now.”
It was during his college years that Ive developed his signature design style, rooted in the German Bauhaus tradition. This philosophy embraced a minimalist approach: designers should design only what is needed. These were the same principles followed by former Braun designer Dieter Rams, and similarities are visible in the products both men later created.
How Jony Ive joined Apple
After graduating in 1989, Ive joined the London-based design startup Tangerine, bringing bathroom manufacturer Ideal Standard along as a major client. But Ive’s designs for sinks were never turned into a product because they would have been too expensive to manufacture. In the fall of 1991, then-Apple design chief Bob Brunner appeared at Tangerine and commissioned four design studies.
Under Ive’s guidance, Tangerine’s Juggernaut project included the never-built “Macintosh Folio” tablet computer, which was to be operated with a stylus and was still five times as thick as the first iPad would be 18 years later. Ive recalls his experience there: “I was pretty naïve. I hadn’t been out of college for long, but I learnt lots by designing a range of different objects: from hair combs and ceramics, to power tools and televisions. Importantly, I worked out what I was good at and what I was bad at. It became pretty clear what I wanted to do. I was really only interested in design. I was neither interested, nor good at, building a business.”
It was not inevitable in Ive’s career that he would one day shape the design of a major computer company like Apple. Quite the opposite. The young Brit initially had problems finding his way in the world of personal computers. “I went through college having a real problem with computers,” Ive recalls in the interview with the Design Museum. “I was convinced that I was technically inept, which was frustrating as I wanted to use computers to help me with various aspects of my design.” Right at the end of his time at college, Ive discovered the Mac. “I remember being astounded at just how much better it was than anything else I had tried to use. I was struck by the care taken with the whole user experience. I had a sense of connection via the object with the designers.”
Ive started to learn more about the company, how it had been founded, its values and its structure. The more he learnt about this cheeky, almost rebellious company the more it appealed to him, as it unapologetically pointed to an alternative in a complacent and creatively bankrupt industry. “Apple stood for something and had a reason for being that wasn’t just about making money.”
In the early 1990s, Ive was living in London again and working with a number of clients in Japan, the US and Europe at Tangerine. Apple did a search to find a new design consultant and decided to work with him. Ive: “I still remember Apple describing this fantastic opportunity and being so nervous that I would mess it all up. While I had never thought that I could work successfully as part of a corporation — always assuming that I would work independently — at the end of a big program of work for Apple, I decided to accept a full-time position there and to move to California.” In September 1992, at age twenty-five, Ive accepted a full-time position at Apple, and his first assignment was to redesign the Newton MessagePad.
At the time, Apple was being run by John Sculley; Steve Jobs had been forced out seven years earlier. The desktop publishing revolution was putting Macs in businesses all over the world, and Apple had just celebrated its first quarter earning two billion dollars in revenue. With all this cash, Apple was expanding its product lines, and Sculley was investing heavily in R&D to speed up development of new products like the Apple Newton.
Failure and success with the Apple Newton
The first version of the Apple Newton was a failure, but Apple was hoping to change that with its second iteration. Ive worked tirelessly on the project and involved himself in every last detail. He even traveled to Taiwan to fix manufacturing problems. But no amount of effort from Ive was enough to save the Newton. Apple had made marketing and engineering mistakes that plagued the Newton until it was finally discontinued.
The Apple Newton proved to be a business failure and strategic setback, but Jonathan Ive’s Newton was a design success. It earned him four of the top awards in the industry and the honor of being featured in the permanent collection of the San Francisco Museum of Modern Art.
However, Apple’s leadership not only had a poor sense of direction when it came to developing new product categories, but also had to contend with major difficulties in the core business. In 1995, the crisis became obvious: Windows 95 was released, and cheap PCs began to fly off the shelves, undercutting the Mac. In the first quarter of 1996, Apple reported a $69 million loss and laid off 1,300 employees.
This caused Apple’s focus to shift from developing high-quality, well-designed Macs to pushing out the cheapest machines it could possibly make. It was truly a period of no innovation at Apple, and it destroyed Jonathan Ive’s morale. He said: “All they wanted from us designers was a model of what something was supposed to look like on the outside, and then engineers would make it as cheap as possible. I was about to quit.”
Rediscovered by Steve Jobs and Jon Rubinstein
But before Jony Ive could resign, Jon Rubinstein, his new boss, talked him out of it. Rubinstein gave him a raise and told him that eventually the company would turn around and they would have the opportunity to make history. With Jobs’s return to Apple in 1997, Rubinstein was proved exactly right. Jobs brought focus not only to the company but also to Ive’s design group. In the interview with the Design Museum, Ive recalls: “When I joined Apple the company was in decline. It seemed to have lost what had once been a very clear sense of identity and purpose. Apple had started trying to compete to an agenda set by an industry that had never shared its goals. While as a designer I was certainly closer to where the decisions were being made, I was only marginally more effective or influential than I had been as a consultant. This only changed when Steve Jobs returned to the company. By re-establishing the core values he had established at the beginning, Apple again pursued a direction which was clear and different from any other company. Design and innovation formed an important part of this new direction.”
Jobs refocused the design team and got them working together on a new project called the Mac NC, which would later become the iMac. The team only had nine months to get it from design to production. To meet this deadline, Ive implemented a radical, integrated design process that transformed the way Apple developed its products. The workflow was so successful that it became permanent, and it’s essentially the same system the design group uses today. So the iMac was released nine months later and ended up being the best-selling Mac in Apple’s history up to that point.
Ive and his team became famous for their fanatical care beyond the obvious: an obsessive attention to details that are often overlooked, like cables and power adapters. Ive recalls: “Take the iMac — our attempts to make it less exclusive and more accessible occurred at a number of different levels. A detail example is the handle. While its primary function is obviously associated with making the product easy to move, a compelling part of its function is the immediate connection it makes with the user by unambiguously referencing the hand. That reference represents, at some level, an understanding beyond the iMac’s core function. Seeing an object with a handle, you instantly understand aspects of its physical nature — I can touch it, move it, it’s not too precious.”
Jonathan Ive had finally found a company that gave him the freedom to practice his craft effectively without limitations. In fact, part of Jobs’s reorganization of Apple included giving the design team precedence over every other group, including engineering. And because Ive was head of design, he held a tremendous amount of operational power at Apple, second only to Steve Jobs. “He’s not just a designer,” Jobs told his biographer, Walter Isaacson. “He has more operational power than anyone at Apple, except me.”
Starting in 2002, Jobs and Ive set about turning the vision of an Apple phone into reality. After a few detours, and with the help of a team that had originally been assigned to develop a tablet computer, the iPhone was born and presented to an astonished public in January 2007.
Inspired by Dieter Rams and Braun Design
The iPhone also meant a tribute by Ive to Dieter Rams, the legendary chief designer of Braun. Users should understand products intuitively, without an instruction manual, was Rams’ motto. The minimalism and simplicity of the iPhone proved such a resounding success that competitors like Samsung quickly and shamelessly copied the concept.
Jony Ive and Braun Design
In one respect, however, Ive was unfaithful to his German model: for Rams, the maxim that form must follow function still applies today. Ive, on the other hand, gave the engineers such strict guidelines that certain functions fell by the wayside. For example, the headphone jack in the iPhone 7 was sacrificed to save space and make the case a bit slimmer. The same reasoning was used to justify the MacBook’s lack of multiple input and output ports, forcing users to buy cumbersome adapters to read photos from an SD card, for example. Critics also blame Ive’s drive to produce ever-thinner devices for the fiasco of the butterfly keyboards in recent MacBooks.
Sir Jonathan Ive at The Goodwood Festival of Speed Cartier Style et Luxe party Photo: Marcus Dawes, Licence: CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=19277958
Alongside Steve Jobs, Jony Ive always appeared to be the gentle type who strives for balance. But this impression is misleading: Ive repeatedly engaged in heated controversies with other Apple managers about the technical and financial effort required to implement his design ideas. When designing the first Mac mini, for example, Ive and his team made the case just two millimeters too narrow for a conventional 3.5-inch hard drive, so a much more expensive 2.5-inch drive, usually reserved for laptops, had to be used. Under Steve Jobs’ umbrella, Jony Ive and design took precedence over cost considerations.
Ive clashed primarily with his old mentor Jon Rubinstein, who was actually his supervisor. But when in doubt, Steve Jobs always decided against Rubinstein and other pragmatists in Cupertino. When the Apple boss also promoted his protégé Ive to senior vice president in 2005, it was time for Rubinstein to leave Apple. His successor, Tony Fadell, was also constantly at odds with Ive. When Fadell left Apple in October 2008 with a golden handshake, the controversy was still kept under wraps. It wasn’t until the publication of Leander Kahney’s Ive biography that the conflict became known in full detail.
The most prominent loser in a power struggle with Ive in 2013 was Scott Forstall, who three years earlier had been considered a possible successor to the ailing Steve Jobs. Ive disliked the software design (“skeuomorphism”) preferred by Forstall: up to iOS 6, the iPhone operating system used the shapes of familiar objects that had no actual function, such as the virtual leather cover on the calendar application. Ive enforced “flat design,” a much more abstract design language, in iOS 7. After this defeat, Forstall left the company, and Ive also took over responsibility for “human interface” design. However, some at Apple in Cupertino still mourn Forstall’s departure.
During Steve Jobs’ lifetime, Jony Ive was clearly in second place in Apple’s internal hierarchy. But when the company’s co-founder died of cancer in October 2011, it was clear that Tim Cook would be his successor, not Ive. Jony Ive stood for the soul of the company but had little interest in the business figures.
After Steve Jobs’ death, Ive sought in a moving eulogy to convince the assembled staff that Apple would not lose sight of its great vision even without its charismatic leader.
Jony Ive’s tribute to Steve Jobs
But the death from cancer of his friend apparently hit Ive much harder than most Apple employees. The Brit missed the almost daily exchange of views with Jobs over lunch and the regular visits by the company boss to the otherwise almost hermetically sealed design studio.
At the same time, public and shareholder pressure was growing at the time for Apple to launch a new smash hit after the iPhone (2007) and iPad (2010). Some stock market analysts doubted whether Apple would even be able to innovate after Jobs’ death.
At this stage, Ive pushed to build a smartwatch. For the first model of the Apple Watch in 2015, Ive pursued his idea of making the smartwatch primarily a fashion accessory. The watch was sold in fashion stores and, for a jet-setting audience, was also offered in an exorbitantly expensive high-end variant with an 18-carat gold case.
Jony Ive presents the first Apple Watch (2014)
But it quickly became apparent that most buyers did not follow Ive’s lead. Many of the gold models went unsold. Users weren’t looking for a luxury watch but for a practical gadget that would let them track their fitness activities and see notifications without constantly pulling their iPhone out of their pocket. After this setback, Ive asked Apple CEO Tim Cook to be relieved of his day-to-day management duties, the Wall Street Journal reported.
Cook, however, was determined to keep Ive on board at all costs during this critical phase, even though he, in contrast to his predecessor Steve Jobs, showed up much less often at Apple’s design studios. He appointed Ive “Chief Design Officer,” responsible not only for the hardware and packaging of the devices but also for software design as well as the design of the Apple Stores and the new company campus, Apple Park.
Paradoxically, after his promotion to “Chief Design Officer” in May 2015, Ive had hardly any influence on the design of concrete Apple products. Only the Apple Watch was an exception. Ive was now primarily concerned with the design of the company’s new Apple Park site. For two years, he was more likely to be seen in rubber boots on the construction site than in his team’s design studio. It wasn’t until the end of 2017 that the Apple Park chapter was largely closed for Ive.
Tim Cook and Jony Ive at WWDC 2019
Afterwards, the members of the design team had hoped that Ive would be available more often in Cupertino for detailed decisions. But these expectations were not fulfilled. Ive now frequently worked in his personal design studio in San Francisco near his home, saving himself the annoying stop-and-go trips on the congested highway to Cupertino. Communication with the design team suffered as a result.
Apple announced on 27 June 2019 that Ive would depart the company. The decisive factor for Ive’s exit was probably his pronounced desire to no longer take responsibility for things he could only influence to a limited extent, such as the development of iPhone sales or the company’s share price. In a February 2015 Ive portrait in The New Yorker, the designer is quoted as saying he is “deeply, deeply tired” and “always anxious.” And Laurene Powell Jobs, Steve Jobs’ widow, chimes in with this assessment: “Jony is an artist with an artist’s temperament, and he’d be the first to tell you artists aren’t supposed to be responsible for this kind of thing.”
Apple stated that Ive would start an independent firm named LoveFrom, along with fellow Apple industrial designer Marc Newson, that would work with Apple as its primary client. LoveFrom is known to keep a low profile, and does not disclose information about its employees. LoveFrom unveiled its minimalistic official website in October 2021.
The Final Severing of Ties and LoveFrom’s Diversification
In July 2022, the decades-long professional relationship between Jony Ive and Apple officially concluded when both parties agreed not to renew their $100 million consulting contract. This formal split allowed Ive’s creative collective, LoveFrom, to shed its restrictive “non-compete” clauses and expand into diverse luxury and industrial sectors. Since then, LoveFrom has established itself as a multidisciplinary powerhouse, designing the coronation emblem for King Charles III, a modular high-fashion collection for Moncler, and an interior overhaul for Ferrari’s first electric vehicle. These projects emphasize Ive’s move toward “gentle, human-centric design” across varied physical mediums, ranging from custom typefaces for historic San Francisco bookstores to high-end audio hardware like the Linn Sondek LP12.
The OpenAI Venture and the “Post-Smartphone” Frontier
The most significant chapter of Ive’s post-Apple career began in 2023 through a quiet collaboration with OpenAI CEO Sam Altman, which culminated in the May 2025 announcement that OpenAI had acquired Ive’s hardware startup, io, for approximately $6.5 billion. Now leading a dedicated hardware division within OpenAI while maintaining LoveFrom’s independence, Ive is developing a “screen-less” AI companion designed to move users beyond the smartphone era. Internally code-named “Sweetpea,” the device—rumored to be a voice-centric, multimodal wearable—is built on the philosophy of “ambient computing,” where technology recedes into the background. As of early 2026, working prototypes are in active testing, with a mainstream consumer launch targeted for late 2026, potentially marking the most significant hardware revolution since the original iPhone.
Susan Kare is an artist and designer and pioneer of pixel art; she created many of the graphical interface elements for the original Apple Macintosh in the 1980s as a key member of the Mac software design team, and continued to work as Creative Director at NeXT for Steve Jobs.
She was born on February 5, 1954, in Ithaca, New York. Her father was a professor at the University of Pennsylvania and director of the Monell Chemical Senses Center, a research facility for the senses of taste and smell. Her mother taught her counted-thread embroidery while she immersed herself in drawings, paintings, and crafts. She graduated from Harriton High School (Rosemont, Pennsylvania) in 1971. In her high school years, she met Andy Hertzfeld, who would later become one of the key software engineers at Apple in the development of the Macintosh.
She graduated summa cum laude with a B.A. in Art from Mount Holyoke College, a private liberal arts women’s college in South Hadley, Massachusetts, in 1975, with an undergraduate honors thesis on sculpture. She received an M.A. and a Ph.D. in fine arts from New York University in 1978, with a doctoral dissertation on “the use of caricature in selected sculptures of Honoré Daumier and Claes Oldenburg”. Her goal was “to be either a fine artist or teacher”.
Susan Kare’s early career focused on fine art. For several summers during high school she interned at the Franklin Institute for designer Harry Loucks, who introduced her to typography and graphic design while she did phototypesetting with “strips of type for labels in a dark room on a PhotoTypositor”. Because she did not attend an art school, she built her experience and portfolio by taking many pro-bono graphics jobs in college, such as posters, brochures, holiday cards, and invitations. After her Ph.D., she moved to San Francisco to work at the Fine Arts Museums of San Francisco (FAMSF) as a sculptor and occasional curator.
In 1982, Kare was welding a life-sized razorback hog sculpture commissioned by an Arkansas museum when she received a phone call from high school friend Andy Hertzfeld. In exchange for an Apple II computer, he asked her to hand-draw a few icons and font elements to inspire the upcoming Macintosh computer. She had no experience in computer graphics and “didn’t know the first thing about designing a typeface” or pixel art, so she drew heavily upon her fine-art experience in mosaics, needlepoint, and pointillism.
Hertzfeld suggested that she get a US$2.50 grid notebook of the smallest graph paper she could find at the University Art store in Palo Alto and mock up several 32 × 32 pixel representations of his software commands and applications. These included an icon of scissors for the “cut” command, a finger for “paste”, and a paintbrush for MacPaint.
Urged to actually join the team for a fixed-length part-time job, she interviewed “totally green” but undaunted, bringing a variety of typography books from the Palo Alto public library to show her interest alongside her well-prepared notebook. “The interview lasted five minutes”, Kare recalled in 2014. “It was, when can you start? And I found myself the next week in the Mac Software Group.” She was officially hired in January 1983 with Badge #3978. Her business cards read “HI Macintosh Artist”.
Kare was working on-site in Cupertino. “I definitely learned on the job”, she said in an interview conducted by Alex Pang on 8 September 2000. “As when I went to Macintosh, there wasn’t really an icon editor, but there was a way to turn pixels on and off. I did some work on paper, but obviously, it was much better to see it on the screen, so there was a rudimentary icon editor. First, they showed me how I could take the art and figure out the hex equivalent so it could be keyboarded in. Then Andy made a much better icon editor that automatically generated the hex under the icons. That was how I did the first ones. I think I did the fonts that way, going letter by letter before we had a font editor.”
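The workflow Kare describes, drawing icons on paper and then figuring out “the hex equivalent so it could be keyboarded in”, can be sketched in a few lines of code. The following Python snippet is not Apple’s actual tooling, just a hypothetical illustration of how each row of pixels packs into the hex value a programmer would type in; an 8 × 8 grid is used here for brevity, whereas the real Mac icons were 32 × 32.

```python
# Illustrative sketch only (not Apple's actual tools): converting a
# hand-drawn pixel grid into the hex values a programmer could key in.

# An 8x8 "icon" drawn as text: '#' = pixel on, '.' = pixel off.
ICON = [
    "........",
    ".######.",
    ".#....#.",
    ".#.##.#.",
    ".#.##.#.",
    ".#....#.",
    ".######.",
    "........",
]

def row_to_hex(row: str) -> str:
    """Pack one row of pixels into an integer, MSB = leftmost pixel."""
    value = 0
    for ch in row:
        value = (value << 1) | (ch == "#")
    return f"0x{value:02X}"

hex_rows = [row_to_hex(r) for r in ICON]
print(hex_rows)  # ['0x00', '0x7E', '0x42', '0x5A', '0x5A', '0x42', '0x7E', '0x00']
```

The icon editor Hertzfeld later wrote automated exactly this step, generating the hex underneath the pixels as they were toggled on and off.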
Susan Kare was looking at a screen like this:
“You can see some of the little rounded rects. They have words like trash and disk drive and printer. So those were the things that I needed to make icons for. And on the right, you see the kind of verbs on those buttons. ‘Do it’ became ‘Okay’ because ‘Do it’ is maybe more clear, but people read it as ‘Dolt’ (a stupid person).”
Among her first Macintosh designs were the Elefont, Overbrook, Merion, Ardmore, and Rosemont typefaces, which, after objections from Steve Jobs, were renamed after world cities such as Chicago, London, and New York. Susan Kare then took on the “icons” used to operate the computer, drawing on art-historical antecedents such as mosaics.
“The first thing I did was a typeface called Chicago, seven pixels wide and nine pixels high, that I tried to make bold and without jagged edges. The typeface Chicago ended up in the title bars. And it ended up having a bit of another life in the first iPod.”
Kare: “So, when I got there, the goals were explained to me: it (the Macintosh) was a computer for people who were not computer literate, so your mom could use it. Not too politically correct. And it should be discoverable like arcade games, you shouldn’t have to use the manual to be able to do it, and it should be friendly. So, I started out making these icons in 32 by 32. This was an icon I designed. They just said there should be something on the screen, 32 by 32 dots, that sits there while the computer boots. So I reached back to my junior high favorite symbol to do that (the smiley).”
Kare was also asked to design an icon for a system crash. “I designed this image as a bomb because I was told it would never be seen by anyone. So I thought I could be a little irreverent. But unfortunately, that was not the case. The programmers truly thought at the time that it would be deeply hidden.”
Icon for system crash
At the EG conference in May 2014, Kare shared an amusing anecdote about the bomb icon: “Right after the Mac had shipped, we were in our software area and a call came in, fielded through Apple, and it was a woman who was using MacWrite. It had crashed and she was afraid her computer was going to blow up. So I felt kind of bad.”
There weren’t any tools on the Mac to make icons. “So I did it on paper. And then Andy Hertzfeld wrote this icon editor that allowed you to make icons. I mean, it seemed so fantastic. What you could do is turn one pixel off or on.”
Macintosh Icon Editor (1983) by Andy Hertzfeld
“You could draw a line or circle or anything, and then you could see it magnified and edit it at the size that you’d see it on the screen. And it generated the code you needed for the programmers to get it into the software.”
The smiling Mac, the bomb with the fuse, the clock, the floppy disk, and the wastebasket are legendary.
She found the looped square for the Apple command key in a symbol dictionary; it originated in ancient Scandinavia and later served as a marker for landmarks.
Borgholm Castle
She created a plethora of “dingbats” for her typefaces, including a cute animal symbol that combines features of a dog and a cow. As Clarus the Dogcow, it now has its own fan base.
Susan Kare. Clarus, the Dogcow
Sometimes for fun, Susan Kare did pictures of her colleagues in icon format. Her digital portrait of Steve Jobs became quite famous.
Portrait of Steve Jobs by Susan Kare in icon format.
Susan Kare worked only three years at Apple. But this experience put her on the leading edge of a whole new field of graphic design. Working with only a grid of pixels, she began to master a peculiar sort of minimal pointillism. She spent her days turning tiny dots on and off to craft instantly understandable visual metaphors for computer commands.
Initially, her job was to shape individual letters and numbers to bring a semblance of print’s elegance to the grainy domain of computer screens. But Kare’s most memorable legacy is the playful quality of some of her icons. She’s quick to point out that Xerox PARC had already created a garbage can for disposing of files, but Kare’s can is so viewer-friendly that one half-expects Oscar the Grouch to pop out.
Susan Kare: Life after Apple
In 1986, Kare followed Steve Jobs in leaving Apple to launch NeXT, Inc. as its Creative Director and 10th employee. She introduced Jobs to her design hero Paul Rand and hired him to design NeXT’s logo and brand identity, admiring his table-pounding exactitude and confidence. She created and re-created slideshows to Jobs’s exacting last-minute requirements.
She realized that she wanted “to be back doing bitmaps” so she left NeXT to become an independent designer with a client base including graphical computing giants Microsoft, IBM, Sony Pictures, Motorola, General Magic, and Intel. Her projects for Microsoft include the card deck for Windows 3.0’s solitaire game, which taught early computer users to use a mouse to drag and drop objects on a screen.
Windows Solitaire
For Windows 3.0, she designed a “baroque” wallpaper, numerous icons, and design elements, using isometric 3D and 16 dithered colors. Many of her icons, such as those for Notepad and various Control Panels, remained essentially unchanged by Microsoft until Windows XP.
Magic Cap OS
For IBM, she produced pinstriped isometric bitmap icons and design elements for OS/2. For General Magic, she made Magic Cap’s “impish” cartoon of dad’s office desktop. For Eazel, she rejoined Andy Hertzfeld and many of the former Macintosh team and contributed iconography to the Nautilus file manager, which the company permanently donated to the public for free use.
Between 2006 and 2010, Susan Kare produced hundreds of 64 × 64 pixel icons for the virtual gifts feature of Facebook, for which Facebook charged $1 each. Initially, profits from gift sales were donated to the Susan G. Komen for the Cure foundation until Valentine’s Day 2007. The most popular gift icon, titled “Big Kiss”, is featured in some versions of Mac OS X as a user account picture.
Susan Kare: Big Kiss
In 2007, she designed the identity, icons, and website for Chumby Industries, Inc., as well as the interface for its Internet-enabled alarm clock.
Since 2008, The Museum of Modern Art (MoMA) store in New York City has carried stationery and notebooks featuring her designs.
Susan Kare at MoMA: Graphic icon sketch (1982–1983)
In 2015 MoMA acquired her notebooks of sketches that had led to the early Mac GUI.
Susan Kare at MoMA: Apple Macintosh OS icon sketchbook (1982). MoMA – Susan Kare, “Mac OS Icon sketchbook.” Bound sketchbook, ink and felt-tipped pen on paper. Gift of the designer.
“This is an excellent example of the often ‘invisible’ design that goes into the graphic user interfaces that we use on a day-to-day basis, an intangible piece of design that we all carry in our pockets. Kare’s humble icons were all based on a 16×16 pixel grid; a grid that has expanded ad infinitum into the screens of our desktops, our laptops, our cell phones.”
MoMA Curatorial Assistant Evangelos Kotsioris
In 2015, Kare was hired by Pinterest as a product design lead as her first corporate employment in three decades. Working with design manager Bob Baxley, the former design manager of the original Apple Online Store, she compared the diverse and design-driven corporate cultures of Pinterest and early 1980s Apple. In February 2021, Kare became Design Architect at Niantic Labs. As of 2022, she concurrently heads a digital design practice in San Francisco and sells limited-edition, signed fine-art prints.
Michael Spindler was the CEO of Apple Computer from 1993 to 1996. During his tenure at the company, Spindler oversaw a number of significant changes at Apple, including the introduction of new products and a shift in the company’s focus towards enterprise customers.
Born in Berlin in 1942, he lived through all the transformations of the computer world, from the birth of mainframes to the advent of personal computers, and even the development of the first mobile devices. Hired by Mike Markkula in 1980, he climbed all the ranks of Apple’s organization chart, until he succeeded John Sculley as CEO.
Michael Spindler began his career at Siemens in 1966 after graduating from the prestigious Rheinische Fachhochschule in Cologne as an electrical engineer. Tired of working on such technical aspects as tape controllers for mainframes, he wanted to get closer to the customers and joined Schlumberger’s British telemetry subsidiary.
He returned to Munich in 1970 to take on the role of salesman at DEC, which was then competing with IBM in the minicomputer market. The story goes that he earned the nickname “Diesel” at the time because he was big and strong, grunted more than he spoke, and worked tirelessly from dawn to dusk. But this is just a story, probably invented in the 1980s, and repeated ad nauseam despite the denials of the person concerned.
After seven years at DEC, he knew how to manage teams and set up long-term strategies, but he also understood that the microcomputer would dethrone the minicomputer. Spindler left Munich for Brussels, where he became head of European marketing for Intel. There he met Mike Markkula, who had been advising Steve Jobs and Steve Wozniak for a long time and was about to take over Apple himself.
It was 1980, and Spindler took over as head of marketing for Apple’s small European office. Legend has it that he went without a salary for a good six months while the company opened a European account. Apple Europe moved to Paris the following year, and Spindler distinguished himself with original and aggressive campaigns that got him noticed in Cupertino.
Michael Spindler with Steve Jobs
Mike Markkula wanted to return to his golden retirement, financed by the stock options he had received at Fairchild and Intel, but did not intend to hand over the reins to the young and capricious Steve Jobs. Apple recruited John Sculley, the CEO of Pepsi, whom Jobs finally convinced with a line that went down in history.
A talented strategist who had put Pepsi back on the map, Sculley appreciated Spindler’s talent, and even more so what he perceived as a lack of ambition. In 1983, Sculley promoted Spindler to executive vice president of marketing, reporting to Del Yocam, who had just taken over as head of the Apple II group. Spindler then showed the best, but also the worst, of himself.
The best first: with one heart in Cupertino and the other in Louveciennes, to paraphrase his famous aphorism, he encouraged the internationalization of the company. Local subsidiaries operated as independent companies that could create their own campaigns and even design their own products, like KanjiTalk, which made the Mac popular in Japan.
Then the worst: beset by panic attacks, Spindler struggled to express himself clearly, when he didn’t run away from his responsibilities altogether. The anecdote is famous: newly appointed director of Apple USA, Allan Loren wanted to introduce himself to Spindler but couldn’t find him at his desk. A few minutes later, Spindler came out of his office and shook hands with Loren. Had he been hiding in his closet?
The complexities of Spindler’s career then became entangled with Apple’s procrastination. The contenders to succeed John Sculley were disqualified one after the other: Del Yocam because he was an authoritarian and brittle COO, Jean-Louis Gassée because he had spent tens of millions of dollars on technological dead ends, and Allan Loren because he was unable to turn Apple USA around.
Spindler was appointed head of Apple Europe. He implemented his “global-local” strategy, harmonizing the offer and the prices while letting each country adapt its message, and tripled the turnover in two years. His fortunes were strengthened by the disgrace of his competitors: he became COO in 1990, pushed Jean-Louis Gassée out of the company, and finally reached the highest level of responsibility when Sculley was fired in 1993 over strategic errors.
One of Spindler’s major merits was his effort to expand Apple’s product line and increase its market share. Under his leadership, Apple introduced several new products, including the Macintosh LC, Macintosh Quadra, and Macintosh Performa. These products were designed to appeal to a wider range of customers and helped broaden Apple’s customer base.
Another of Spindler’s merits was his attention to the enterprise market. He recognized that the enterprise market was an important area of growth for Apple, and he focused on expanding Apple’s presence there. He attempted to build stronger relationships with business customers and to develop new software and services that would appeal to enterprise users.
However, Spindler also faced a number of challenges during his tenure as CEO. One major weakness was his lack of focus on the consumer market, for which he was criticized. Competition there was fierce, with Apple facing companies such as Dell, Compaq, and IBM in the personal computer market.
Additionally, during his tenure, the company faced financial struggles, causing profits to decline. This led to a series of cost-cutting measures and layoffs at the company, which further damaged morale among employees and customers.
Ultimately, Spindler was replaced as CEO by Gil Amelio in 1996, who was tasked with turning around the company’s fortunes. While Spindler’s tenure as CEO was marked by both successes and challenges, his efforts helped to set the stage for Apple’s resurgence in the years that followed.
Father of three children, Michael Spindler lived between San Francisco and Paris with his wife. He died on September 5, 2016 after a short illness.
Software developer Andy Hertzfeld is one of the most important heroes of Macintosh development, but he has rarely been in the spotlight. He was the technical lead for the Macintosh system software and was the second programmer to join the project, after Bud Tribble. Hertzfeld was responsible for the overall architecture of the system and wrote a substantial portion of the system code himself, while helping the other programmers to integrate their parts.
He was born on April 6, 1953, in Philadelphia, Pennsylvania, and grew up in the eastern United States.
He had his first exposure to computers in high school, although the school itself had no computer at all. “Back in the late ’60s and ’70s, high schools couldn’t afford computers, but there was a teleprinter connected to a GE timesharing computer about 10 miles away. I was lucky enough to use it in 11th grade. And I took to it like a fish to water,” Andy said in an interview.
Back then, you couldn’t use the computer in real time; you had to write your program in advance, print it out on a punch tape, and then read it with a special reader to transmit the program to the computer. “It was a little cumbersome, but I loved it anyway.”
After high school, which he attended with Susan Kare, Andy Hertzfeld moved to California and studied at UC Berkeley. But after buying an Apple II, he found computing much more interesting than college. “I started spending all my time on it and dropped out of college to go to work for Apple in August of ’79.”
He was hired by Apple Computer as a systems programmer in 1979 and developed the Apple Silentype printer firmware and wrote the firmware for the Sup’R’Terminal, the first 80-column card for the Apple II. In the early 1980s, he invited his high school friend, artist Susan Kare, to join Apple in order to help design what would become standard Macintosh icons.
Apple Silentype thermal printer. Photo: StromBer – CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=14972849
The story of how Andy Hertzfeld was recruited for the Macintosh project is legendary: “Steve came by my cubicle. This was on a Thursday afternoon, late on a Thursday afternoon,” Hertzfeld recalls in the Land of Giants podcast, talking with Peter Kafka. “I said, ‘Okay, I’ll start Monday. Just give me half a day to document the work I’ve been doing so someone else can pick it up.’ And he goes: ‘No, the Macintosh is the future of Apple. You’re going to start on it now.’ And he grabbed my Apple II off my desk and started walking away with it. What could I do but follow my computer?”
Steve Wozniak and Andy Hertzfeld (right) at an Apple Computer Users Group meeting (1985). Photo: Tony Wills – CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=18512032
That’s how Andy Hertzfeld joined the Macintosh team. There he wrote much of the original system software for the Mac, including the User Interface Toolbox, Window Manager, Menu Manager, and Control Manager. Steve Jobs constantly gave the impression that a much bigger goal was at stake: “We weren’t just designing a computer. We were saving mankind. We were enhancing the future of humanity. And so at some of that you roll your eyes a little bit, but enthusiasm is contagious.”
With the legendary “1984” commercial, Jobs then also told the whole world that he was going to unveil something world-changing. But there was still one problem: the world-changing device wasn’t finished. He needed Hertzfeld and a small team working around the clock, driven by the idea that they were creating something great.
“It was a monumental effort to finish the Macintosh system software. All of the software engineers, about a dozen of us, were up like 48 hours with no sleep”, Andy Hertzfeld recalls. “We barely succeeded. We were exhausted, lying around on the floor the next day in a happy haze. Hey, we did it. We’re finally done. When Steve walks into the software area saying, ‘Get up off the floor. You’re not done. I want the Macintosh to be the first computer to announce itself.’”
So Jobs demanded that Hertzfeld and his exhausted team teach the Mac to speak. It was a technical push that drove the team to the limits of its endurance right up to the last minute. And when Jobs took the stage at Apple’s 1984 shareholders’ meeting, he pulled out of its bag a Mac that Hertzfeld wasn’t really convinced would work.
Jobs had gambled high and won. It worked. The Macintosh could talk, more or less at least. And it looked almost cuddly, as if it were smiling. And it was sealed so you couldn’t open it. Unlike other computers, which were designed so you could poke around and modify the guts of the machine, the Mac was designed for people who didn’t know or care about the difference between RAM and ROM or other computer terms. Turn it on and go, says Apple expert Peter Kafka.
Andy Hertzfeld: “The Mac had this great intro. People saw it as the revolution that it was. We started selling them to the colleges by tens of thousands. But by the fall of 1984, the sales started falling off. By Christmas 1984, the sales were very disappointing. They were maybe a 10th of what Apple had predicted.”
Andy Hertzfeld left Apple in April 1984, fairly soon after the Mac was introduced. When asked why he turned his back on Apple, Hertzfeld replied on “NerdTV”:
“Bob Cringley: Why did you choose to leave Apple? Andy: I had a bad manager, a manager who wanted me to salute to him. I didn’t salute crisply enough. Bob: Literally salute? Andy: That’s metaphor. I call the story in my book “to big for my britches.” He took me for a walk to give me a verbal review. For the period of time I’d written most of the Macintosh system software he gave me a bad review because I was insubordinate to him. That disillusioned me. That took place in February 1983. I would have left then except I was too committed to the Mac. I had to stay to see it through. But as soon as it January 1984 left. I was still able to do system software. Did Switcher, the first multitasking environment for the Macintosh just as an outsider – I was able to sell it back to Apple. I wasn’t going to work there because I loved the spirit of the Mac group but this guy came in – his name is Bob Belleville – the bad manager who made me quit, and wasn’t able to compromise enough to stay there. Bob: Why didn’t Steve protect you? Andy: Steve had promised to. When Bud Tribble, the former manager, had left, Burrell and I both almost quit. We were afraid we’d get a bad manager and Steve promised to protect me but at the time we had already essentially developed the Mac. The technical work was done and what Steve needed at that point were managers to take over the rest of Apple. The Macintosh having shipped, his next agenda was to turn the rest of Apple into the Mac group. He had perceived the rest of Apple wasn’t as creative or motivated as the Mac team, and what you need to take over the company are managers, not innovators or technical people. Bob: So at that moment he needed Bob Belleville more than he needed you? Andy: That was what I thought. A little later on I think he thought he needed me more, but during that time, yeah. And of course it all kind of backfired on him. He (Steve) got kicked out of Apple just shortly afterwards… much o the ill fate of Apple. 
It almost killed Apple.”
He helped his friend Burrell Smith, who had designed the Mac’s digital hardware, start a company called Radius in 1986 that made peripherals for the Mac. “I did a lot of stuff as a third-party developer, sold system software back to Apple. In 1990, along with Bill Atkinson, who was sort of my mentor on the Mac project, we started a company called General Magic that made some of the first handheld computers.”
Magic Cap Introduction Demo 1/6/94 with Andy Hertzfeld
Andy Hertzfeld: “Apple was our benefactor at starting General Magic, but about a year later decided they would rather BE General Magic and tried to make us blink out of existence… which we eventually did, but it took a few years. I left General Magic in 1996 to become an Internet hobbyist – got a T-1 line to my house. At one point I had all four food banks of the Bay Area hosted from this house here.”
But it was much too early for a real retirement. Andy got bitten by the free software bug in February of 1998 around the time of the Mozilla announcement. “I was despairing of the structural problems in the software industry and suddenly, after reading Eric Raymond’s book The Cathedral and the Bazaar, I realized that free software could be the path to an open and fair software industry.”
Hertzfeld decided to work for change in the traditional software industry. In August 1999, he founded a company called Eazel that would make free software easier to use. Eazel failed to secure its second round of funding, however, and he was forced to close the company in May 2001.
He then began working with Mitch Kapor, the developer of the legendary spreadsheet program “Lotus 1-2-3,” and helped him launch the Open Source Applications Foundation. Together they developed an innovative personal information manager called Chandler. Hertzfeld then launched a project to preserve for posterity the memories of the makers of the first Apple Macintosh.
To do this, Hertzfeld even developed his own software to publish these memories on the Web: the Folklore Project. “I established a website called folklore.org devoted to what I call “collective historical storytelling” — allowing a group of people to cooperate in telling their shared stories. I published on the web about 60 anecdotes about the development of the Mac in time for the Mac’s 20th birthday in 2004.” This then became the book “Revolution in the Valley: The Insanely Great Story of How the Mac Was Made.”
Hertzfeld spent the last years of his active professional career at Google. From August 2005 to July 2013 he worked there, primarily responsible for the Google+ user interface.
Hertzfeld is now mainly retired and describes himself as a retired hacker, but he still sometimes appears as an investor, most recently in the start-up Spatial.
Sources:
Interview by Christoph Dernbach with Andy Hertzfeld on August 25, 2011 at WWDC 2011.
When you talk about the founders of Apple, you first think of Steve Jobs and Steve Wozniak. Jobs, the far-sighted business strategist – and “Woz”, the brilliant inventor. In fact, the Apple Computer Company was founded by three men: “Mr. Stephen G. Wozniak, Mr. Steven P. Jobs and Mr. Ronald G. Wayne”, as stated in the company’s founding contract of April 1, 1976.
Three men founded Apple Computer on April 1st, 1976
Before co-founding Apple Computer, Ron Wayne worked at Atari, where he met Steve Jobs, who was involved in new game development there. Through Jobs, Wayne became aware of Steve Wozniak, a brilliant hardware designer and engineer who had created the first usable personal computer, the machine that would go down in technological history as the Apple I.
But it soon became clear that the two Steves were pursuing very different goals: Steve Wozniak wanted to impress his pals at the Homebrew Computer Club with his brilliant engineering. Woz had no serious interest in making a big business out of it. Steve Jobs, on the other hand, quickly recognized the business potential: a computer like the Apple I could break the dominance of mainframes from companies like IBM and turn the personal computer into a mass product.
Jobs, however, could not reconcile his ambitions with Woz’s. He therefore turned to Ron Wayne and, against this background, offered him a minority share in the founding of Apple Computer on April 1, 1976. Wayne received a 10 percent stake in the company, while Jobs and Wozniak each held 45 percent. Wayne was to tip the scales whenever the two main players were at loggerheads. “We both trusted him so much that he would resolve any conflicts we had,” said Steve Jobs.
The first major order came from the Byte Shop
But Wayne got cold feet: “I was practically the adult in the room. I was in my 40s, those kids were around 20. Then about a week and a half later, when I had time to think, I did what most people thought was absolutely crazy. I had my name taken off the contract.”
Wayne was particularly worried about the financial obligations associated with the first major order for Apple computers. Jobs was signing supply contracts for the Apple I with early customers like “The Byte Shop” and was buying the components on credit.
“I had very good reasons to remove my name from the contract,” Wayne said in a 2016 TV interview, looking back. “The main reason was that we had started a company. And if we failed, Jobs and Woz would not have had two nickels in their pockets. Who would be left holding the bag? Well, I had a house and a car, and a bank account, too. I was the one within reach. So I was financially vulnerable. But I also wanted to be my own man. I also realized I was in the shadow of two giants. I was never going to get my own project. I didn’t like the idea of spending the next 20 years behind piles of paper in an office. That was another reason to get out.”
Ron Wayne talks to Jules Pochy and Jérôme Schmidt (2016)
His co-founder Steve Wozniak could not understand this step. “I don’t know why Ron gave up. He sold his shares and got out. He may have had good reasons. He got out and he was happy with it. He couldn’t see the big picture. But you couldn’t see the big picture at the time. There was only the Apple I.”
Nor could Wayne count on Steve Jobs’ understanding. Jobs described the process as follows: “He decided that he really wanted a VCR or something. So he sold us back 10 percent for eight hundred dollars.”
But there were also reservations in the opposite direction: “I didn’t know whether Jobs was the kind of personality I wanted to work for either,” Wayne said in the ARTE interview (2016). “He was just the way he was. He knew exactly what he wanted to do. And it was better not to get in his way. You would end up with a footprint on your forehead. Jobs had a very aggressive character. And if you had to choose between Steve Jobs and an ice cube to warm yourself up, you’d snuggle up to the ice cube. But it was the only way he could achieve what he had achieved with Apple. Apple was Steve Jobs.”
So Wayne has no regrets about getting out of Apple so early: “Those kids were wild hotshots; it was like holding a tiger by the tail. If I’d stayed with Apple, I’d have worn myself out in a hurry. I’d just be the richest man in the cemetery.”
One thing in life Wayne does regret, though. In the mid-nineties, he saw an ad in a magazine for a company that dealt with autographs and signatures. He remembered the old Apple contract that was lying around in a closet collecting dust. Finally, he sold the historical document for 500 dollars to the autograph dealer.
Years later, Wayne saw on television how his old contract came under the hammer for almost 1.6 million dollars at the auction house Sotheby’s.
“I’m sorry for this incident. But what can I say? It’s the story of my life, right? A day late and a dollar short in my pocket.”
Ron Wayne left his mark on Apple’s history even after he departed as a shareholder. The first trace disappeared quickly, because Steve Jobs did not like the young company’s first logo, which Wayne had designed. “In this logo, which they had me design, I captured Wozniak’s bizarreness. The logo with Newton in this Gothic frame, with the ribbon and the inscription ‘Apple Computer Company,’ was of course a 19th-century design, not a 20th-century design. I knew that already.”
The second mark was deeper: Jobs asked Wayne to design a case for the Apple II. Although his design was never used, Apple adopted its basic design principle.
Design for a housing for the Apple II by Ron Wayne.
To this day, Wayne is amused by the thought that he may have influenced all of Apple’s later designs. “A typical computer today consists of a tower with a vertically mounted circuit board. The keyboard and screen are separate. In my design for the Apple II, I had a horizontal layout with a horizontally mounted board. The keyboard was integrated into the housing. The monitor stood on top, as one unit. They used this design for all future models, the Macintosh and Lisa, and so on. All were built this way. This form was unique among modern computers.”
The two main Apple founders – Steve Jobs and Steve Wozniak – both came from humble backgrounds and had little money to their names. To afford the parts for the first Apple I units in 1976, they almost literally sold the shirts off their backs. Jobs invested the proceeds from the sale of his VW bus ($1,500). “Woz” parted with his beloved Hewlett-Packard HP-65 programmable calculator and deposited 250 dollars in the company’s treasury.
Ronald Wayne (Photo courtesy of Owen Linzmayer)
Ronald Gerald Wayne, the “third founder” of Apple Computer, was with the company for only a short time. He drew the first Apple logo and wrote the Apple I manual. While at Apple, he also drafted the partnership agreement. Wayne had worked with Jobs at Atari before co-founding Apple Computer on April 1, 1976. He was given a 10% stake in Apple, but relinquished his stock for 800 dollars only two weeks later, because legally all members of a partnership are personally liable for any debts incurred by any of the other partners.
After Apple’s IPO, Wayne’s stake could have been worth as much as US$ 1.5 billion. He claimed that he didn’t regret selling the stock as he had made “the best decision available at that time.” According to CNET, as of 1997 Wayne was working as an engineer for a defense contractor in Salinas, California.
The foundations for Apple’s commercial success were laid in 1977 by venture capitalist Arthur Rock and by ex-Intel manager Mike Markkula, who invested 92,000 dollars in Apple and secured a bank loan of 250,000 dollars. Markkula was lured out of retirement by Steve Jobs, who had been referred to him by Regis McKenna and venture capitalist Don Valentine.
Valentine—who after meeting the young, unkempt Jobs asked McKenna, “Why did you send me this renegade from the human race?”—was not interested in funding Apple, but mentioned Jobs’ new company to Markkula. Jobs visited him and convinced Markkula of the market for the Apple II and personal computers in general. Later Valentine asked Markkula if he could also invest in Apple.
Mike Markkula at the Apple offices April 1, 1977
In 1977, Markkula brought his business expertise along with US$250,000 ($80,000 as an equity investment in the company and $170,000 as a loan) and became employee number 3. The investment would pay off for Markkula. Before Apple went public in 1980, he owned a third of the company.
Markkula also brought in Apple’s first CEO, Michael Scott, then took the job himself from 1981 to 1983. Markkula served as chairman from 1985 until 1997, when a new board was formed after Jobs returned to the company. Wozniak, who virtually single-handedly created the first two Apple computers, credits Markkula for the success of Apple more than himself. “Steve and I get a lot of credit, but Mike Markkula was probably more responsible for our early success, and you never hear about him,” Wozniak told Failure Magazine in July 2000.
With the initial public offering on December 12, 1980, Jobs and Wozniak became multimillionaires, as Apple Computer was now valued at 1.8 billion dollars. Jobs held 7.5 million shares (worth 217 million dollars); “Woz” held four million shares (116 million dollars). Markkula’s seven million shares were worth 203 million.
“I was worth over a million dollars when I was twenty-three, and over ten million dollars when I was twenty-four, and over a hundred million dollars when I was twenty-five,” Jobs said in a 1996 interview with Robert X. Cringely (“Triumph of the Nerds”). “And it wasn’t that important, because I never did it for the money.”
Read next page: Steve Jobs: It’s not about the money
In his most political speech yet, Apple chief executive Tim Cook demanded a tough new US data protection law.
Referring to the misuse of “deeply personal” data, he said it was being “weaponised against us with military efficiency”.
“We shouldn’t sugar-coat the consequences,” he added. “This is surveillance.”
The strongly-worded speech presented a striking defence of user privacy rights from a tech firm’s chief executive.
Mr Cook also praised the EU’s new data protection regulation, the General Data Protection Regulation (GDPR).
Here is a transcript of his speech:
Good morning.
It is an honor to be here with you today in this grand hall…a room that represents what is possible when people of different backgrounds, histories, and philosophies come together to build something bigger than themselves.
I am deeply grateful to our hosts. I want to recognize Ventsislav Karadjov for his service and leadership. And it’s a true privilege to be introduced by his co-host, a statesman I admire greatly, Giovanni Buttarelli.
Now Italy has produced more than its share of great leaders and public servants. Machiavelli taught us how leaders can get away with evil deeds…And Dante showed us what happens when they get caught.
Giovanni has done something very different. Through his values, his dedication, his thoughtful work, Giovanni, his predecessor Peter Hustinx—and all of you—have set an example for the world. We are deeply grateful.
We need you to keep making progress—now more than ever. Because these are transformative times. Around the world, from Copenhagen to Chennai to Cupertino, new technologies are driving breakthroughs in humanity’s greatest common projects. From preventing and fighting disease…To curbing the effects of climate change…To ensuring every person has access to information and economic opportunity.
At the same time, we see vividly—painfully—how technology can harm rather than help. Platforms and algorithms that promised to improve our lives can actually magnify our worst human tendencies. Rogue actors and even governments have taken advantage of user trust to deepen divisions, incite violence, and even undermine our shared sense of what is true and what is false.
This crisis is real. It is not imagined, or exaggerated, or “crazy.” And those of us who believe in technology’s potential for good must not shrink from this moment.
Now, more than ever — as leaders of governments, as decision-makers in business, and as citizens — we must ask ourselves a fundamental question: What kind of world do we want to live in?
I’m here today because we hope to work with you as partners in answering this question.
At Apple, we are optimistic about technology’s awesome potential for good. But we know that it won’t happen on its own. Every day, we work to infuse the devices we make with the humanity that makes us. As I’ve said before, “Technology is capable of doing great things. But it doesn’t want to do great things. It doesn’t want anything. That part takes all of us.”
That’s why I believe that our missions are so closely aligned. As Giovanni puts it, “We must act to ensure that technology is designed and developed to serve humankind, and not the other way around.”
We at Apple believe that privacy is a fundamental human right. But we also recognize that not everyone sees things as we do. In a way, the desire to put profits over privacy is nothing new.
As far back as 1890, future Supreme Court Justice Louis Brandeis published an article in the Harvard Law Review, making the case for a “Right to Privacy” in the United States.
He warned: “Gossip is no longer the resource of the idle and of the vicious, but has become a trade.”
Today that trade has exploded into a data industrial complex. Our own information, from the everyday to the deeply personal, is being weaponized against us with military efficiency.
Every day, billions of dollars change hands, and countless decisions are made, on the basis of our likes and dislikes, our friends and families, Our relationships and conversations…Our wishes and fears…Our hopes and dreams.
These scraps of data…each one harmless enough on its own…are carefully assembled, synthesized, traded, and sold.
Taken to its extreme, this process creates an enduring digital profile and lets companies know you better than you may know yourself. Your profile is then run through algorithms that can serve up increasingly extreme content, pounding our harmless preferences into hardened convictions. If green is your favorite color, you may find yourself reading a lot of articles—or watching a lot of videos—about the insidious threat from people who like orange.
In the news, almost every day, we bear witness to the harmful, even deadly, effects of these narrowed worldviews.
We shouldn’t sugarcoat the consequences. This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them.
This should make us very uncomfortable. It should unsettle us. And it illustrates the importance of our shared work and the challenges still ahead of us.
Fortunately, this year, you’ve shown the world that good policy and political will can come together to protect the rights of everyone. We should celebrate the transformative work of the European institutions tasked with the successful implementation of the GDPR. We also celebrate the new steps taken, not only here in Europe, but around the world. In Singapore, Japan, Brazil, New Zealand, and many more nations, regulators are asking tough questions and crafting effective reforms.
It is time for the rest of the world—including my home country—to follow your lead.
We at Apple are in full support of a comprehensive federal privacy law in the United States. There, and everywhere, it should be rooted in four essential rights:

First, the right to have personal data minimized. Companies should challenge themselves to de-identify customer data—or not to collect it in the first place.

Second, the right to knowledge. Users should always know what data is being collected and what it is being collected for. This is the only way to empower users to decide what collection is legitimate and what isn’t. Anything less is a sham.

Third, the right to access. Companies should recognize that data belongs to users, and we should all make it easy for users to get a copy of…correct…and delete their personal data.

And fourth, the right to security. Security is foundational to trust and all other privacy rights.
Now, there are those who would prefer I hadn’t said all of that. Some oppose any form of privacy legislation. Others will endorse reform in public, and then resist and undermine it behind closed doors.
They may say to you, ‘our companies will never achieve technology’s true potential if they are constrained with privacy regulation.’ But this notion isn’t just wrong, it is destructive.
Technology’s potential is, and always must be, rooted in the faith people have in it…In the optimism and creativity that it stirs in the hearts of individuals…In its promise and capacity to make the world a better place.
It’s time to face facts. We will never achieve technology’s true potential without the full faith and confidence of the people who use it.
At Apple, respect for privacy—and a healthy suspicion of authority—have always been in our bloodstream. Our first computers were built by misfits, tinkerers, and rebels—not in a laboratory or a board room, but in a suburban garage. We introduced the Macintosh with a famous TV ad channeling George Orwell’s 1984—a warning of what can happen when technology becomes a tool of power and loses touch with humanity.
And way back in 2010, Steve Jobs said in no uncertain terms: “Privacy means people know what they’re signing up for, in plain language, and repeatedly.”
It’s worth remembering the foresight and courage it took to make that statement. When we designed this device we knew it could put more personal data in your pocket than most of us keep in our homes. And there was enormous pressure on Steve and Apple to bend our values and to freely share this information. But we refused to compromise. In fact, we’ve only deepened our commitment in the decade since.
From hardware breakthroughs…that encrypt fingerprints and faces securely—and only—on your device…To simple and powerful notifications that make clear to every user precisely what they’re sharing and when they are sharing it.
We aren’t absolutists, and we don’t claim to have all the answers. Instead, we always try to return to that simple question: What kind of world do we want to live in.
At every stage of the creative process, then and now, we engage in an open, honest, and robust ethical debate about the products we make and the impact they will have. That’s just a part of our culture.
We don’t do it because we have to, we do it because we ought to. The values behind our products are as important to us as any feature.
We understand that the dangers are real—from cyber-criminals to rogue nation states. We’re not willing to leave our users to fend for themselves. And, we’ve shown, we’ll defend those principles when challenged.
Those values…that commitment to thoughtful debate and transparency…they’re only going to get more important. As progress speeds up, these things should continue to ground us and connect us, first and foremost, to the people we serve.
Artificial Intelligence is one area I think a lot about. Clearly, it’s on the minds of many of my peers as well.
At its core, this technology promises to learn from people individually to benefit us all. Yet advancing AI by collecting huge personal profiles is laziness, not efficiency. For Artificial Intelligence to be truly smart, it must respect human values, including privacy.
If we get this wrong, the dangers are profound.
We can achieve both great Artificial Intelligence and great privacy standards. It’s not only a possibility, it is a responsibility.
In the pursuit of artificial intelligence, we should not sacrifice the humanity, creativity, and ingenuity that define our human intelligence.
And at Apple, we never will.
In the mid-19th Century, the great American writer Henry David Thoreau found himself so fed up with the pace and change of Industrial society that he moved to a cabin in the woods by Walden Pond.
Call it the first digital cleanse.
Yet even there, where he hoped to find a bit of peace, he could hear a distant clatter and whistle of a steam engine passing by. “We do not ride on the railroad,” he said. “It rides upon us.”
Those of us who are fortunate enough to work in technology have an enormous responsibility.
It is not to please every grumpy Thoreau out there. That’s an unreasonable standard, and we’ll never meet it.
We are responsible, however, for recognizing that the devices we make and the platforms we build have real…lasting…even permanent effects on the individuals and communities who use them.
We must never stop asking ourselves…What kind of world do we want to live in?
The answer to that question must not be an afterthought, it should be our primary concern.
We at Apple can—and do—provide the very best to our users while treating their most personal data like the precious cargo that it is. And if we can do it, then everyone can do it.
Fortunately, we have your example before us.
Thank you for your work…For your commitment to the possibility of human-centered technology…And for your firm belief that our best days are still ahead of us.
Steve Jobs and John Sculley on the Cover of Business Week (Nov. 1984)
Rhiannon Williams, writing for The Telegraph, had the chance to interview John Sculley, the man who was instrumental in ousting Steve Jobs from Apple back in 1985.
When asked if he ever feels frustrated at how Jobs is presented, or misrepresented, in popular culture, Sculley pauses. “Misrepresented in what way?” he asks, tersely. People tend to draw on the more tyrannical aspects of his personality, I venture.
“I don’t think that’s fair. I think…” He pauses again. “People exaggerate, it’s simple to summarise and exaggerate. I found Steve, remember – at the time we were friends, we were incredibly close friends, and… he was someone who even then, showed compassion, and caring about people. “Didn’t mean he couldn’t be tough in a meeting and make decisions, and sometimes they seemed, y’know, overly harsh. But the reality was, the Steve Jobs I knew was still a very decent person, with very decent values. So I think he was misrepresented in popular culture.”
And:
The pair worked in harmony on Apple’s 1984 Ridley Scott-directed Super Bowl television advert, but cracks began to appear when Sculley disagreed with Jobs’ plans to drop the price of the Macintosh and redirect a large proportion of the marketing budget from the Apple II to the Mac, in the wake of the poorly received Macintosh Office network, which later evolved into desktop publishing.
“I said ‘Steve, the only cash for the company is coming from the Apple II, and we can’t do that,’” Sculley recalls sadly.
The working relationship between the two descended into a desperate struggle for power. The increasingly erratic Jobs tried to lead an unsuccessful rebellion against Sculley in May 1985, with the goal of replacing him with Jean-Louis Gassée, then Apple’s director of European operations. Gassée informed Sculley of the coup, and Sculley confronted Jobs at an executive committee meeting, demanding that those present choose which of the two men they thought best to run the company. They backed Sculley, and Jobs fled the room.