computer history museum membership: The Charisma Machine Morgan G. Ames, 2019-11-19 A fascinating examination of technological utopianism and its complicated consequences. In The Charisma Machine, Morgan Ames chronicles the life and legacy of the One Laptop per Child project and explains why—despite its failures—the same utopian visions that inspired OLPC still motivate other projects trying to use technology to “disrupt” education and development. Announced in 2005 by MIT Media Lab cofounder Nicholas Negroponte, One Laptop per Child promised to transform the lives of children across the Global South with a small, sturdy, and cheap laptop computer, powered by a hand crank. In reality, the project fell short in many ways—starting with the hand crank, which never materialized. Yet the project remained charismatic to many who were captivated by its claims of access to educational opportunities previously out of reach. Behind its promises, OLPC, like many technology projects that make similarly grand claims, had a fundamentally flawed vision of who the computer was made for and what role technology should play in learning. Drawing on fifty years of history and a seven-month study of a model OLPC project in Paraguay, Ames reveals that the laptops were not only frustrating to use, easy to break, and hard to repair, they were designed for “technically precocious boys”—idealized younger versions of the developers themselves—rather than the children who were actually using them. The Charisma Machine offers a cautionary tale about the allure of technology hype and the problems that result when utopian dreams drive technology development. |
computer history museum membership: Software Rights Gerardo Con Díaz, 2019-10-22 A new perspective on United States software development, seen through the patent battles that shaped our technological landscape. This first comprehensive history of software patenting explores how patent law made software development the powerful industry that it is today. Historian Gerardo Con Díaz reveals how patent law has transformed the ways computing firms make, own, and profit from software. He shows that securing patent protection for computer programs has been a central concern among computer developers since the 1950s and traces how patents and copyrights became inseparable from software development in the Internet age. Software patents, he argues, facilitated the emergence of software as a product and a technology, enabled firms to challenge each other’s place in the computing industry, and expanded the range of creations for which American intellectual property law provides protection. Powerful market forces, aggressive litigation strategies, and new cultures of computing usage and development transformed software into one of the most controversial technologies ever to encounter the American patent system. |
computer history museum membership: Biomedical Computing Joseph A. November, 2012-06-01 Winner of the Computer History Museum Prize of the Special Interest Group: Computers, Information, and Society Imagine biology and medicine today without computers. What would laboratory work be like if electronic databases and statistical software did not exist? Would disciplines like genomics even be feasible if we lacked the means to manage and manipulate huge volumes of digital data? How would patients fare in a world absent CT scans, programmable pacemakers, and computerized medical records? Today, computers are a critical component of almost all research in biology and medicine. Yet, just fifty years ago, the study of life was by far the least digitized field of science, its living subject matter thought too complex and dynamic to be meaningfully analyzed by logic-driven computers. In this long-overdue study, historian Joseph November explores the early attempts, in the 1950s and 1960s, to computerize biomedical research in the United States. Computers and biomedical research are now so intimately connected that it is difficult to imagine when such critical work was offline. Biomedical Computing transports readers back to such a time and investigates how computers first appeared in the research lab and doctor's office. November examines the conditions that made possible the computerization of biology—including strong technological, institutional, and political support from the National Institutes of Health—and shows not only how digital technology transformed the life sciences but also how the intersection of the two led to important developments in computer architecture and software design. The history of this phenomenon has been only vaguely understood. November's thoroughly researched and lively study makes clear for readers the motives behind computerizing the study of life and how that technology profoundly affects biomedical research today. |
computer history museum membership: AI 2041 Kai-Fu Lee, Chen Qiufan, 2024-03-05 How will AI change our world within twenty years? A pioneering technologist and acclaimed writer team up for a “dazzling” (The New York Times) look at the future that “brims with intriguing insights” (Financial Times). This edition includes a new foreword by Kai-Fu Lee. A BEST BOOK OF THE YEAR: The Wall Street Journal, The Washington Post, Financial Times Long before the advent of ChatGPT, Kai-Fu Lee and Chen Qiufan understood the enormous potential of artificial intelligence to transform our daily lives. But even as the world wakes up to the power of AI, many of us still fail to grasp the big picture. Chatbots and large language models are only the beginning. In this “inspired collaboration” (The Wall Street Journal), Lee and Chen join forces to imagine our world in 2041 and how it will be shaped by AI. In ten gripping, globe-spanning short stories and accompanying commentary, their book introduces readers to an array of eye-opening settings and characters grappling with the new abundance and potential harms of AI technologies like deep learning, mixed reality, robotics, artificial general intelligence, and autonomous weapons. |
computer history museum membership: Programmed Inequality Mar Hicks, 2018-02-23 This “sobering tale of the real consequences of gender bias” explores how Britain lost its early dominance in computing by systematically discriminating against its most qualified workers: women (Harvard Magazine) In 1944, Britain led the world in electronic computing. By 1974, the British computer industry was all but extinct. What happened in the intervening thirty years holds lessons for all postindustrial superpowers. As Britain struggled to use technology to retain its global power, the nation’s inability to manage its technical labor force hobbled its transition into the information age. In Programmed Inequality, Mar Hicks explores the story of labor feminization and gendered technocracy that undercut British efforts to computerize. That failure sprang from the government’s systematic neglect of its largest trained technical workforce simply because they were women. Women were a hidden engine of growth in high technology from World War II to the 1960s. As computing experienced a gender flip, becoming male-identified in the 1960s and 1970s, labor problems grew into structural ones and gender discrimination caused the nation’s largest computer user—the civil service and sprawling public sector—to make decisions that were disastrous for the British computer industry and the nation as a whole. Drawing on recently opened government files, personal interviews, and the archives of major British computer companies, Programmed Inequality takes aim at the fiction of technological meritocracy. Hicks explains why, even today, possessing technical skill is not enough to ensure that women will rise to the top in science and technology fields. Programmed Inequality shows how the disappearance of women from the field had grave macroeconomic consequences for Britain, and why the United States risks repeating those errors in the twenty-first century. |
computer history museum membership: Computer Structures C. Gordon Bell, Allen Newell, 1971 |
computer history museum membership: ENIAC in Action Thomas Haigh, Peter Mark Priestley, Crispin Rope, 2016-02-05 This work explores the conception, design, construction, use, and afterlife of ENIAC, the first general purpose digital electronic computer. |
computer history museum membership: The Intel Trinity Michael S. Malone, 2014-07-15 Based on unprecedented access to the corporation’s archives, The Intel Trinity is the first full history of Intel Corporation—the essential company of the digital age—told through the lives of the three most important figures in the company’s history: Robert Noyce, Gordon Moore, and Andy Grove. Often hailed as the “most important company in the world,” Intel remains, more than four decades after its inception, a defining company of the global digital economy. The legendary inventor of the microprocessor, the single most important product in the modern world, Intel today builds the tiny “engines” that power almost every intelligent electronic device on the planet. But the true story of Intel is the human story of the trio of geniuses behind it. Michael S. Malone reveals how each brought different things to Intel, and at different times. Noyce, the most respected high tech figure of his generation, brought credibility (and money) to the company’s founding; Moore made Intel the world’s technological leader; and Grove relentlessly drove the company to ever-higher levels of success and competitiveness. Without any one of these figures, Intel would never have achieved its historic success; with them, Intel made possible the personal computer, Internet, telecommunications, and the personal electronics revolutions. The Intel Trinity is not just the story of Intel’s legendary past; it also offers an analysis of the formidable challenges that lie ahead as the company struggles to maintain its dominance, its culture, and its legacy. With eight pages of black-and-white photos. |
computer history museum membership: Ada Lovelace Christopher Hollings, Ursula Martin, Adrian Clifford Rice, 2018 Ada, Countess of Lovelace and daughter of Romantic poet Lord Byron, is sometimes referred to as the world's first computer programmer. But how did a young woman in the nineteenth century without a formal education become a pioneer of computer science? Drawing on previously unpublished archival material, including a remarkable correspondence course with eminent mathematician Augustus De Morgan, this book explores Ada Lovelace's development from her precocious childhood into a gifted, perceptive and knowledgeable mathematician who, alongside Mary Somerville, Michael Faraday and Charles Dickens, became part of Victorian London's social and scientific elite. Featuring images of the 'first programme' together with mathematical models and contemporary illustrations, the authors show how, despite her relatively short life and with astonishing prescience, Ada Lovelace explored key mathematical questions to understand the principles behind modern computing. |
computer history museum membership: Facebook Steven Levy, 2020-02-25 One of the Best Technology Books of 2020—Financial Times “Levy’s all-access Facebook reflects the reputational swan dive of its subject. . . . The result is evenhanded and devastating.”—San Francisco Chronicle “[Levy’s] evenhanded conclusions are still damning.”—Reason “[He] doesn’t shy from asking the tough questions.”—The Washington Post “Reminds you the HBO show Silicon Valley did not have to reach far for its satire.”—NPR.org The definitive history, packed with untold stories, of one of America’s most controversial and powerful companies: Facebook As a college sophomore, Mark Zuckerberg created a simple website to serve as a campus social network. Today, Facebook is nearly unrecognizable from its first, modest iteration. In light of recent controversies surrounding election-influencing “fake news” accounts, the handling of its users’ personal data, and growing discontent with the actions of its founder and CEO—who has enormous power over what the world sees and says—never has a company been more central to the national conversation. Millions of words have been written about Facebook, but no one has told the complete story, documenting its ascendancy and missteps. There is no denying the power and omnipresence of Facebook in American daily life, or the imperative of this book to document the unchecked power and shocking techniques of the company, from growing at all costs to outmaneuvering its biggest rivals to acquire WhatsApp and Instagram, to developing a platform so addictive even some of its own are now beginning to realize its dangers. Based on hundreds of interviews from inside and outside Facebook, Levy’s sweeping narrative of incredible entrepreneurial success and failure digs deep into the whole story of the company that has changed the world and reaped the consequences. |
computer history museum membership: A History of Personal Workstations Adele Goldberg, 1988 This distinctive book presents a history of an increasingly important class of computers, personal workstations. It is a history seen from the unique perspective of the people who pioneered their development. |
computer history museum membership: Leading Matters John L. Hennessy, 2018-09-04 In Leading Matters, current Chairman of Alphabet (Google's parent company), former President of Stanford University, and Godfather of Silicon Valley, John L. Hennessy shares the core elements of leadership that helped him become a successful tech entrepreneur, esteemed academic, and venerated administrator. Hennessy's approach to leadership is laser-focused on the journey rather than the destination. Each chapter in Leading Matters looks at valuable elements that have shaped Hennessy's career in practice and philosophy. He discusses the pivotal role that humility, authenticity and trust, service, empathy, courage, collaboration, innovation, intellectual curiosity, storytelling, and legacy have all played in his prolific, interdisciplinary career. Hennessy takes these elements and applies them to instructive stories, such as his encounters with other Silicon Valley leaders including Jim Clark, founder of Netscape; Condoleezza Rice, former U.S. Secretary of State and Stanford provost; John Arrillaga, one of the most successful Silicon Valley commercial real estate developers; and Phil Knight, founder of Nike and philanthropist with whom Hennessy cofounded Knight-Hennessy Scholars at Stanford University. Across government, education, commerce, and non-profits, the need for effective leadership could not be more pressing. This book is essential reading for those tasked with leading any complex enterprise in the academic, not-for-profit, or for-profit sector. |
computer history museum membership: Fearless Genius Doug Menuez, 2014-06-03 An award-winning documentary photographer delivers a stunning visual history of the Silicon Valley technology boom, in which he was witness to key moments in the careers of Steve Jobs and more than seventy other leading innovators as they created today’s digital world. An eye-opening chronicle of the Silicon Valley technology boom, capturing key moments in the careers of Steve Jobs and more than seventy other leading innovators as they created today’s digital world In the spring of 1985, a technological revolution was under way in Silicon Valley, and documentary photographer Doug Menuez was there in search of a story—something big. At the same time, Steve Jobs was being forced out of his beloved Apple and starting over with a new company, NeXT Computer. His goal was to build a supercomputer with the power to transform education. Menuez had found his story: he proposed to photograph Jobs and his extraordinary team as they built this new computer, from conception to product launch. In an amazing act of trust, Jobs granted Menuez unlimited access to the company, and, for the next three years, Menuez was able to get on film the spirit and substance of innovation through the day-to-day actions of the world’s top technology guru. From there, the project expanded to include the most trailblazing companies in Silicon Valley, all of which granted Menuez the same complete access that Jobs had. Menuez photographed behind the scenes with John Warnock at Adobe, John Sculley at Apple, Bill Gates at Microsoft, John Doerr at Kleiner Perkins, Bill Joy at Sun Microsystems, Gordon Moore and Andy Grove at Intel, Marc Andreessen at Netscape, and more than seventy other leading companies and innovators. It would be fifteen years before Menuez stopped taking pictures, just as the dotcom bubble burst. An extraordinary era was coming to its close. 
With his singular behind-the-scenes access to these notoriously insular companies, Menuez was present for moments of heartbreaking failure and unexpected success, moments that made history, and moments that revealed the everyday lives of the individuals who made it happen. This period of rapid, radical change would affect almost every aspect of our culture and our lives in ways both large and small and would also create more jobs and wealth than any other time in human history. And Doug Menuez was there, a witness to a revolution. In more than a hundred photographs and accompanying commentary, Fearless Genius captures the human face of innovation and shows what it takes to transform powerful ideas into reality. |
computer history museum membership: The Innovation Delusion Lee Vinsel, Andrew L. Russell, 2020-09-08 “Innovation” is the hottest buzzword in business. But what if our obsession with finding the next big thing has distracted us from the work that matters most? “The most important book I’ve read in a long time . . . It explains so much about what is wrong with our technology, our economy, and the world, and gives a simple recipe for how to fix it: Focus on understanding what it takes for your products and services to last.”—Tim O’Reilly, founder of O’Reilly Media It’s hard to avoid innovation these days. Nearly every product gets marketed as being disruptive, whether it’s genuinely a new invention or just a new toothbrush. But in this manifesto on the state of American work, historians of technology Lee Vinsel and Andrew L. Russell argue that our way of thinking about and pursuing innovation has made us poorer, less safe, and—ironically—less innovative. Drawing on years of original research and reporting, The Innovation Delusion shows how the ideology of change for its own sake has proved a disaster. Corporations have spent millions hiring chief innovation officers while their core businesses tank. Computer science programs have drilled their students on programming and design, even though the overwhelming majority of jobs are in IT and maintenance. In countless cities, suburban sprawl has left local governments with loads of deferred repairs that they can’t afford to fix. And sometimes innovation even kills—like in 2018 when a Miami bridge hailed for its innovative design collapsed onto a highway and killed six people. In this provocative, deeply researched book, Vinsel and Russell tell the story of how we devalued the work that underpins modern life—and, in doing so, wrecked our economy and public infrastructure while lining the pockets of consultants who combine the ego of Silicon Valley with the worst of Wall Street’s greed. 
The authors offer a compelling plan for how we can shift our focus away from the pursuit of growth at all costs, and back toward neglected activities like maintenance, care, and upkeep. For anyone concerned by the crumbling state of our roads and bridges or the direction our economy is headed, The Innovation Delusion is a deeply necessary reevaluation of a trend we can still disrupt. |
computer history museum membership: Digital Resilience Ray Rothrock, 2018-04-19 In the Digital Age of the twenty-first century, the question is not if you will be targeted, but when. Are you prepared? If not, where does one begin? For an enterprise to be fully prepared for the imminent attack, it must actively monitor networks, take proactive steps to understand and contain attacks, enable continued operation during an incident, and have a full recovery plan already in place. Cybersecurity expert Ray Rothrock has provided for businesses large and small a must-have resource that highlights: the tactics used by today’s hackers, vulnerabilities lurking in networks, and strategies not just for surviving attacks, but thriving while under assault. Businesses and individuals will understand better the threats they face, be able to identify and address weaknesses, and respond to exploits swiftly and effectively. From data theft to downed servers, from malware to human error, cyber events can be triggered anytime from anywhere around the globe. Digital Resilience provides the resilience-building strategies your business needs to prevail, no matter what strikes. |
computer history museum membership: The Autonomous Revolution William H. Davidow, Michael S. Malone, 2020-02-18 The coauthors of the seminal book The Virtual Corporation describe how the rise of artificial intelligence and virtual environments are ushering in an epic cultural transformation—and how we can thrive in this new era. We are at the dawn of the Autonomous Revolution, a turning point in human history as decisive as the Agricultural and Industrial Revolutions. More and more, AI-based machines are replacing human beings, and online environments are gathering our data and using it to manipulate us. This loss of human autonomy amounts to nothing less than a societal phase change, a fundamental paradigm shift. The same institutions will remain—schools, banks, churches, and corporations—but they will radically change form, obey new rules, and use new tools. William H. Davidow and Michael S. Malone go deeply into the enormous implications of these developments. They show why increases in productivity no longer translate into increases in the GDP and how zero cost, one-to-many communications have been turned into tools for cybercrime and propaganda. Many of the book's recommendations—such as using taxes to control irresponsible internet behavior and enabling people to put their data into what are essentially virtual personal information “safety deposit boxes”—are bold and visionary, but we must figure out how we will deal with these emerging challenges now, before the Autonomous Revolution overcomes us. |
computer history museum membership: The Big Score Michael Shawn Malone, 1985 |
computer history museum membership: The Man Behind the Microchip Leslie Berlin, 2006-11-13 This is the life of a giant of the high-tech industry - co-founder of Fairchild Semiconductor and Intel - and inventor of the integrated circuit, which is used in every modern computer, microwave, telephone and car. |
computer history museum membership: The Age of Edison Ernest Freeberg, 2014-01-28 A sweeping history of the electric light revolution and the birth of modern America The late nineteenth century was a period of explosive technological creativity, but more than any other invention, Thomas Edison’s incandescent light bulb marked the arrival of modernity, transforming its inventor into a mythic figure and avatar of an era. In The Age of Edison, award-winning author and historian Ernest Freeberg weaves a narrative that reaches from Coney Island and Broadway to the tiniest towns of rural America, tracing the progress of electric light through the reactions of everyone who saw it and capturing the wonder Edison’s invention inspired. It is a quintessentially American story of ingenuity, ambition, and possibility in which the greater forces of progress and change are made by one of our most humble and ubiquitous objects. |
computer history museum membership: Core Memory, 2018-08 An unprecedented combination of computer history and striking images, Core Memory reveals modern technology's evolution through the world's most renowned computer collection, the Computer History Museum in Silicon Valley. Vivid photos capture these historically important machines, including the ENIAC, the Crays 1–3, and the Apple I and II, while authoritative text profiles each, telling the stories of their innovations and peculiarities. |
computer history museum membership: A History of Modern Computing, second edition Paul E. Ceruzzi, 2003-04-08 From the first digital computer to the dot-com crash—a story of individuals, institutions, and the forces that led to a series of dramatic transformations. This engaging history covers modern computing from the development of the first electronic digital computer through the dot-com crash. The author concentrates on five key moments of transition: the transformation of the computer in the late 1940s from a specialized scientific instrument to a commercial product; the emergence of small systems in the late 1960s; the beginning of personal computing in the 1970s; the spread of networking after 1985; and, in a chapter written for this edition, the period 1995-2001. The new material focuses on the Microsoft antitrust suit, the rise and fall of the dot-coms, and the advent of open source software, particularly Linux. Within the chronological narrative, the book traces several overlapping threads: the evolution of the computer's internal design; the effect of economic trends and the Cold War; the long-term role of IBM as a player and as a target for upstart entrepreneurs; the growth of software from a hidden element to a major character in the story of computing; and the recurring issue of the place of information and computing in a democratic society. The focus is on the United States (though Europe and Japan enter the story at crucial points), on computing per se rather than on applications such as artificial intelligence, and on systems that were sold commercially and installed in quantities. |
computer history museum membership: Quantum Mechanics (A Ladybird Expert Book) Jim Al-Khalili, 2017-01-26 What is quantum mechanics? Learn from the experts in the ALL-NEW LADYBIRD EXPERT SERIES A clear, simple and entertaining introduction to the weird, mind-bending world of the very, very small. Written by physicist and broadcaster Professor Jim Al-Khalili, Quantum Mechanics explores all the key players, breakthroughs, controversies and unanswered questions of the quantum world. You'll discover: - How the sun shines - Why light is both a wave and a particle - The certainty of the Uncertainty Principle - Schrödinger's Cat - Einstein's spooky action - How to build a quantum computer - Why quantum mechanics drives even its experts completely crazy 'Jim Al-Khalili has done an admirable job of condensing the ideas of quantum physics from Max Planck to the possibilities of quantum computers into brisk, straightforward English' THE TIMES Learn about other topics in the Ladybird Experts series including The Big Bang, Gravity, Climate Change and Evolution. Written by the leading lights and most outstanding communicators in their fields, the Ladybird Expert books provide clear, accessible and authoritative introductions to subjects drawn from science, history and culture. For an adult readership, the Ladybird Expert series is produced in the same iconic small format pioneered by the original Ladybirds. Each beautifully illustrated book features the first new illustrations produced in the original Ladybird style for nearly forty years. |
computer history museum membership: Minitel Julien Mailland, Kevin Driscoll, 2017-06-23 The first scholarly book in English on Minitel, the pioneering French computer network, offers a history of a technical system and a cultural phenomenon. A decade before the Internet became a medium for the masses in the United States, tens of millions of users in France had access to a network for e-mail, e-commerce, chat, research, game playing, blogging, and even an early form of online porn. In 1983, the French government rolled out Minitel, a computer network that achieved widespread adoption in just a few years as the government distributed free terminals to every French telephone subscriber. With this volume, Julien Mailland and Kevin Driscoll offer the first scholarly book in English on Minitel, examining it as both a technical system and a cultural phenomenon. Mailland and Driscoll argue that Minitel was a technical marvel, a commercial success, and an ambitious social experiment. Other early networks may have introduced protocols and software standards that continue to be used today, but Minitel foretold the social effects of widespread telecomputing. They examine the unique balance of forces that enabled the growth of Minitel: public and private, open and closed, centralized and decentralized. Mailland and Driscoll describe Minitel's key technological components, novel online services, and thriving virtual communities. Despite the seemingly tight grip of the state, however, a lively Minitel culture emerged, characterized by spontaneity, imagination, and creativity. After three decades of continuous service, Minitel was shut down in 2012, but the history of Minitel should continue to inform our thinking about Internet policy, today and into the future. |
computer history museum membership: A to Z of Computer Scientists, Updated Edition Harry Henderson, 2020-01-01 Praise for the previous edition: Entries are written with enough clarity and simplicity to appeal to general audiences. The additional readings that end each profile give excellent pointers for more detailed information...Recommended.—Choice This well-written collection of biographies of the most important contributors to the computer world...is a valuable resource for those interested in the men and women who were instrumental in making the world we live in today. This is a recommended purchase for reference collections.—American Reference Books Annual ...this one is recommended for high-school, public, and undergraduate libraries.—Booklist The significant role that the computer plays in the business world, schools, and homes speaks to the impact it has on our daily lives. While many people are familiar with the Internet, online shopping, and basic computer technology, the scientists who pioneered this digital age are generally less well-known. A to Z of Computer Scientists, Updated Edition features 136 computer pioneers and shows the ways in which these individuals developed their ideas, overcame technical and institutional challenges, collaborated with colleagues, and created products or institutions of lasting importance. The cutting-edge, contemporary entries explore a diverse group of inventors, scientists, entrepreneurs, and visionaries in the computer science field. People covered include: Grace Hopper (1906–1992) Dennis Ritchie (1941–2011) Brian Kernighan (1942–present) Howard Rheingold (1947–present) Bjarne Stroustrup (1950–present) Esther Dyson (1951–present) Silvio Micali (1954–present) Jeff Bezos (1964–present) Pierre Omidyar (1967–present) Jerry Yang (1968–present) |
computer history museum membership: The Man Who Knew Sebastian Mallaby, 2017-12-05 WINNER OF THE 2016 FT & McKINSEY BUSINESS BOOK OF THE YEAR AWARD, this is the biography of one of the titans of financial history over the last fifty years. Born in 1926, Alan Greenspan was raised in Manhattan by a single mother and immigrant grandparents during the Great Depression but by quiet force of intellect, rose to become a global financial 'maestro'. Appointed by Ronald Reagan to Chairman of the Federal Reserve, a post he held for eighteen years, he presided over an unprecedented period of stability and low inflation, was revered by economists, adored by investors and consulted by leaders from Beijing to Frankfurt. Both data-hound and eligible society bachelor, Greenspan was a man of contradictions. His great success was to prove the very idea he, an advocate of the Gold standard, doubted: that the discretionary judgements of a money-printing central bank could stabilise an economy. He resigned in 2006, having overseen tumultuous changes in the world's most powerful economy. Yet when the great crash happened only two years later many blamed him, even though he had warned early on of irrational exuberance in the market place. Sebastian Mallaby brilliantly shows the subtlety and complexity of Alan Greenspan's legacy. Full of beautifully rendered high-octane political infighting, hard hitting dialogue and stories, The Man Who Knew is superbly researched, enormously gripping and the story of the making of modern finance. |
computer history museum membership: Makers of the Microchip Christophe Lecuyer, David C. Brock, 2010-09-03 The first years of the company that developed the microchip and created the model for a successful Silicon Valley start-up. In the first three and a half years of its existence, Fairchild Semiconductor developed, produced, and marketed the device that would become the fundamental building block of the digital world: the microchip. Founded in 1957 by eight former employees of the Shockley Semiconductor Laboratory, Fairchild created the model for a successful Silicon Valley start-up: intense activity with a common goal, close collaboration, and a quick path to the market (Fairchild's first device hit the market just ten months after the company's founding). Fairchild Semiconductor was one of the first companies financed by venture capital, and its success inspired the establishment of venture capital firms in the San Francisco Bay area. These firms would finance the explosive growth of Silicon Valley over the next several decades. This history of the early years of Fairchild Semiconductor examines the technological, business, and social dynamics behind its innovative products. The centerpiece of the book is a collection of documents, reproduced in facsimile, including the company's first prospectus; ideas, sketches, and plans for the company's products; and a notebook kept by cofounder Jay Last that records problems, schedules, and tasks discussed at weekly meetings. A historical overview, interpretive essays, and an introduction to semiconductor technology in the period accompany these primary documents. |
computer history museum membership: Masterminds of Programming Federico Biancuzzi, Chromatic, 2009-03-21 Masterminds of Programming features exclusive interviews with the creators of several historic and highly influential programming languages. In this unique collection, you'll learn about the processes that led to specific design decisions, including the goals they had in mind, the trade-offs they had to make, and how their experiences have left an impact on programming today. Masterminds of Programming includes individual interviews with: Adin D. Falkoff: APL Thomas E. Kurtz: BASIC Charles H. Moore: FORTH Robin Milner: ML Donald D. Chamberlin: SQL Alfred Aho, Peter Weinberger, and Brian Kernighan: AWK Charles Geschke and John Warnock: PostScript Bjarne Stroustrup: C++ Bertrand Meyer: Eiffel Brad Cox and Tom Love: Objective-C Larry Wall: Perl Simon Peyton Jones, Paul Hudak, Philip Wadler, and John Hughes: Haskell Guido van Rossum: Python Luiz Henrique de Figueiredo and Roberto Ierusalimschy: Lua James Gosling: Java Grady Booch, Ivar Jacobson, and James Rumbaugh: UML Anders Hejlsberg: Delphi inventor and lead developer of C# If you're interested in the people whose vision and hard work helped shape the computer industry, you'll find Masterminds of Programming fascinating. |
computer history museum membership: Gay, Catholic, and American Greg Bourke, 2021-09-01 Catholic Greg Bourke's profoundly moving memoir about growing up gay and overcoming discrimination in the battle for same-sex marriage in the US. In this compelling and deeply affecting memoir, Greg Bourke recounts growing up in Louisville, Kentucky, and living as a gay Catholic. The book describes Bourke's early struggles for acceptance as an out gay man living in the South during the 1980s and '90s, his unplanned transformation into an outspoken gay rights activist after being dismissed as a troop leader from the Boy Scouts of America in 2012, and his historic role as one of the named plaintiffs in the landmark United States Supreme Court decision Obergefell v. Hodges, which legalized same-sex marriage nationwide in 2015. After being ousted by the Boy Scouts of America (BSA), former Scoutmaster Bourke became a leader in the movement to amend antigay BSA membership policies. The Archdiocese of Louisville, because of its vigorous opposition to marriage equality, blocked Bourke's return to leadership despite his impeccable long-term record as a distinguished Boy Scout leader. But while making their home in Louisville, Bourke and his husband, Michael De Leon, have been active members at Our Lady of Lourdes Catholic Church for more than three decades, and their family includes two adopted children who attended Lourdes school and were brought up in the faith. Over many years and challenges, this couple has managed to navigate the choppy waters of being openly gay while integrating into the fabric of their parish life community. Bourke is unapologetically Catholic, and his faith provides the framework for this inspiring story of how the Bourke De Leon family struggled to overcome antigay discrimination by both the BSA and the Catholic Church and fought to legalize same-sex marriage across the country. 
Gay, Catholic, and American is an illuminating account that anyone, no matter their ideological orientation, can read for insight. It will appeal to those interested in civil rights, Catholic social justice, and LGBTQ inclusion. |
computer history museum membership: A People’s History of Computing in the United States Joy Lisi Rankin, 2018-10-08 Silicon Valley gets all the credit for digital creativity, but this account of the pre-PC world, when computing meant more than using mature consumer technology, challenges that triumphalism. The invention of the personal computer liberated users from corporate mainframes and brought computing into homes. But throughout the 1960s and 1970s a diverse group of teachers and students working together on academic computing systems conducted many of the activities we now recognize as personal and social computing. Their networks were centered in New Hampshire, Minnesota, and Illinois, but they connected far-flung users. Joy Rankin draws on detailed records to explore how users exchanged messages, programmed music and poems, fostered communities, and developed computer games like The Oregon Trail. These unsung pioneers helped shape our digital world, just as much as the inventors, garage hobbyists, and eccentric billionaires of Palo Alto. By imagining computing as an interactive commons, the early denizens of the digital realm seeded today’s debate about whether the internet should be a public utility and laid the groundwork for the concept of net neutrality. Rankin offers a radical precedent for a more democratic digital culture, and new models for the next generation of activists, educators, coders, and makers. |
computer history museum membership: The World of Digital Typesetting John W. Seybold, 1984 |
computer history museum membership: Circuits, Packets, and Protocols James L. Pelkey, Andrew L. Russell, Loring G. Robbins, 2022-04-19 As recently as 1968, computer scientists were uncertain how best to interconnect even two computers. The notion that within a few decades the challenge would be how to interconnect millions of computers around the globe was too far-fetched to contemplate. Yet, by 1988, that is precisely what was happening. The products and devices developed in the intervening years—such as modems, multiplexers, local area networks, and routers—became the linchpins of the global digital society. How did such revolutionary innovation occur? This book tells the story of the entrepreneurs who were able to harness and join two factors: the energy of computer science researchers supported by governments and universities, and the tremendous commercial demand for internetworking computers. The centerpiece of this history comes from unpublished interviews from the late 1980s with over 80 computing industry pioneers, including Paul Baran, J.C.R. Licklider, Vint Cerf, Robert Kahn, Larry Roberts, and Robert Metcalfe. These individuals give us unique insights into the creation of multi-billion-dollar markets for computer-communications equipment, and they reveal how entrepreneurs struggled with failure, uncertainty, and the limits of knowledge. |
computer history museum membership: The Friendly Orange Glow Brian Dear, 2018-10-02 At a time when Steve Jobs was only a teenager and Mark Zuckerberg wasn’t even born, a group of visionary engineers and designers—some of them only high school students—in the late 1960s and 1970s created a computer system called PLATO, which was light-years ahead in experimenting with how people would learn, engage, communicate, and play through connected computers. Not only did PLATO engineers make significant hardware breakthroughs with plasma displays and touch screens but PLATO programmers also came up with a long list of software innovations: chat rooms, instant messaging, message boards, screen savers, multiplayer games, online newspapers, interactive fiction, and emoticons. Together, the PLATO community pioneered what we now collectively engage in as cyberculture. They were among the first to identify and also realize the potential and scope of the social interconnectivity of computers, well before the creation of the internet. PLATO was the foundational model for every online community that was to follow in its footsteps. The Friendly Orange Glow is the first history to recount in fascinating detail the remarkable accomplishments and inspiring personal stories of the PLATO community. The addictive nature of PLATO both ruined many a college career and launched pathbreaking multimillion-dollar software products. Its development, impact, and eventual disappearance provides an instructive case study of technological innovation and disruption, project management, and missed opportunities. Above all, The Friendly Orange Glow at last reveals new perspectives on the origins of social computing and our internet-infatuated world. |
computer history museum membership: Communities of Computing Thomas J. Misa, 2016-11-10 Communities of Computing is the first book-length history of the Association for Computing Machinery (ACM), founded in 1947 and with a membership today of 100,000 worldwide. It profiles ACM's notable SIGs, active chapters, and individual members, setting ACM's history into a rich social and political context. The book's 12 core chapters are organized into three thematic sections. Defining the Discipline examines the 1960s and 1970s, when the field of computer science was taking form at the National Science Foundation, Stanford University, and through ACM's notable efforts in education and curriculum standards. Broadening the Profession looks outward into the wider society as ACM engaged with social and political issues - and as members struggled with balancing a focus on scientific issues and awareness of the wider world. Chapters examine the social turbulence surrounding the Vietnam War, debates about the women's movement, efforts for computing and community education, and international issues including professionalization and the Cold War. Expanding Research Frontiers profiles three areas of research activity where ACM members and ACM itself shaped notable advances in computing, including computer graphics, computer security, and hypertext. Featuring insightful profiles of notable ACM leaders, such as Edmund Berkeley, George Forsythe, Jean Sammet, Peter Denning, and Kelly Gotlieb, and honest assessments of controversial episodes, the volume deals with compelling and complex issues involving ACM and computing. It is not a narrow organizational history of ACM committees and SIGs, although much information about them is given. All chapters are original works of research. Many chapters draw on archival records of ACM's headquarters, ACM SIGs, and ACM leaders. 
This volume makes a permanent contribution to documenting the history of ACM and understanding its central role in the history of computing. |
computer history museum membership: Digital Retro Gordon Laing, 2004-09-21 This book tells the story of the classic home computers that paved the way for the PCs we use today, from 1975's pioneering MITS Altair to the latest swivel-screen designs of the iMac and the Tablet PC. |
computer history museum membership: The Computing Universe Anthony J. G. Hey, Gyuri Pápay, 2015 This exciting and accessible book takes us on a journey from the early days of computers to the cutting-edge research of the present day that will shape computing in the coming decades. It introduces a fascinating cast of dreamers and inventors who brought these great technological developments into every corner of the modern world, and will open up the universe of computing to anyone who has ever wondered where his or her smartphone came from. |
computer history museum membership: Marketing High Technology William H. Davidow, 1986-06-02 Marketing is civilized warfare. And as high-tech products become increasingly standardized—practically identical, from the customer's point of view—it is marketing that spells life or death for new devices or entire firms. In a book that is as fascinating as it is pragmatic, William H. Davidow, a legend in Silicon Valley, where he was described as the driving force behind the microprocessor explosion, tells how to fight the marketing battle in the intensely competitive world of high-tech companies—and win. Blunt, pithy, and knowledgeable, Davidow draws on his successful marketing experience at Intel Corporation to create a complete program for marketing victory. He drives home the basics, such as how to go head-on against the competition; how to plan products, not devices; how to give products a soul; and how to engineer promotions, market internationally, motivate salespeople, and rally distributors. Above all, he demonstrates the critical importance of servicing and supporting customers. Total customer satisfaction, Davidow makes clear, must be every high-tech marketer's ultimate goal. The only comprehensive marketing strategy book by an insider, Marketing High Technology looks behind the scenes at industry-shaking clashes involving Apple and IBM, VisiCorp and Lotus, Texas Instruments and National Semiconductor. He recounts his own involvement in Crush, Intel's innovative marketing offensive against Motorola, to demonstrate, step-by-step, how it became an industry prototype for a winning high-tech campaign. Davidow clearly spells out sixteen principles which increase the effectiveness of marketing programs. From examples as diverse as a Rolling Stones concert and a microprocessor chip, he defines a true product. He analyzes and explains in new ways the strategic importance of distribution as it relates to market sector, pricing, and the pitfalls it entails. 
He challenges some traditional marketing theory and provides unique and important insights developed from over twenty years in the high-tech field. From an all-encompassing philosophy that great marketing is a crusade requiring total commitment, to a careful study of the cost of attacking a competitor, this book is an essential tool for survival in today's high-risk, fast-changing, and very lucrative high-tech arena. |
computer history museum membership: Giants of Computing Gerard O’Regan, 2013-08-19 It has been upon the shoulders of giants that the modern world has been forged. This accessible compendium presents an insight into the great minds responsible for the technology which has transformed our lives. Each pioneer is introduced with a brief biography, followed by a concise account of their key contributions to their discipline. The selection covers a broad spread of historical and contemporary figures from theoreticians to entrepreneurs, highlighting the richness of the field of computing. Suitable for the general reader, this concise and easy-to-read reference will be of interest to anyone curious about the inspiring men and women who have shaped the field of computer science. |
computer history museum membership: Careers in Focus Ferguson, 2008 Profiles computer-related careers, including computer science professors, computer and video game designers, computer network administrators, hardware engineers, software designers, webmasters, and more. |
computer history museum membership: Coding Literacy Annette Vee, 2017-07-28 How the theoretical tools of literacy help us understand programming in its historical, social and conceptual contexts. The message from educators, the tech community, and even politicians is clear: everyone should learn to code. To emphasize the universality and importance of computer programming, promoters of coding for everyone often invoke the concept of “literacy,” drawing parallels between reading and writing code and reading and writing text. In this book, Annette Vee examines the coding-as-literacy analogy and argues that it can be an apt rhetorical frame. The theoretical tools of literacy help us understand programming beyond a technical level, and in its historical, social, and conceptual contexts. Viewing programming from the perspective of literacy and literacy from the perspective of programming, she argues, shifts our understandings of both. Computer programming becomes part of an array of communication skills important in everyday life, and literacy, augmented by programming, becomes more capacious. Vee examines the ways that programming is linked with literacy in coding literacy campaigns, considering the ideologies that accompany this coupling, and she looks at how both writing and programming encode and distribute information. She explores historical parallels between writing and programming, using the evolution of mass textual literacy to shed light on the trajectory of code from military and government infrastructure to large-scale businesses to personal use. Writing and coding were institutionalized, domesticated, and then established as a basis for literacy. Just as societies demonstrated a “literate mentality” regardless of the literate status of individuals, Vee argues, a “computational mentality” is now emerging even though coding is still a specialized skill. |
computer history museum membership: We Are What We Sell Danielle Sarver Coombs, Bob Batchelor, 2014-01-15 For the last 150 years, advertising has created a consumer culture in the United States, shaping every facet of American life—from what we eat and drink to the clothes we wear and the cars we drive. In the United States, advertising has carved out an essential place in American culture, and advertising messages undoubtedly play a significant role in determining how people interpret the world around them. This three-volume set examines the myriad ways that advertising has influenced many aspects of 20th-century American society, such as popular culture, politics, and the economy. Advertising not only played a critical role in selling goods to an eager public, but it also served to establish the now world-renowned consumer culture of our country and fuel the notion of the American dream. The collection spotlights the most important advertising campaigns, brands, and companies in American history, from the late 1800s to modern day. Each fact-driven essay provides insight and in-depth analysis that general readers will find fascinating as well as historical details and contextual nuance students and researchers will greatly appreciate. These volumes demonstrate why advertising is absolutely necessary, not only for companies behind the messaging, but also in defining what it means to be an American. |