Sam Champagne


Wiki Entry 1: Ada Lovelace

Ada Lovelace’s contribution to computing technology should not go unnoticed. Working with Charles Babbage on the Analytical Engine, she not only contributed her mathematical skills to the project but also kept accurate records and diagrams that made it possible for others to understand the intricate and complex machine. As Howard Rheingold writes in Tools for Thought, “Lady Lovelace's published notes are still understandable today and are particularly meaningful to programmers, who can see how truly far ahead of their contemporaries were the Analytical Engineers” (Rheingold). This is most certainly true: in her record keeping she created a complete account of this remarkable undertaking, one that marked the beginning of many great discoveries to come.


She was born in 1815 as Augusta Ada Byron; her father was the promiscuous poet Lord Byron ("Ada Lovelace"). Astute in mathematics from her upbringing, Ada was a most perfect match for the tireless, perpetually revolving mind of Charles Babbage. She saw the Analytical Engine, and its predecessor the Difference Engine, with the same eyes as Babbage did: she understood that these inventions were different from anything already available and that they possessed great potential. “Ada was one of the few to recognize that the Difference Engine was altogether a different sort of device than the mechanical calculators of the past. Whereas previous devices were analog (performing calculation by means of measurement), Babbage's was digital (performing calculation by means of counting). More importantly, Babbage's design combined arithmetic and logical functions” (Rheingold).


Ada’s astuteness in mathematics came from being tutored “by De Morgan, the foremost logician of his time.” She, in considering the Analytical Engine, “had ideas of her own about the possibilities of what one might do with such devices. Of Ada's gift for this new type of partially mathematical, partially logical exercise, Babbage himself noted: ‘She seems to understand it better than I do, and is far, far better at explaining it'” (Rheingold).


In 1843, she translated Luigi Menabrea’s French-language article about the Analytical Engine into English. She not only provided a complete translation but also added extensive notes of her own. As mentioned before, her mathematical skills enabled her to help construct the “software” of the engine, but her greatest contribution to Babbage’s Engine, and to the development of computing overall, is her account of the Analytical Engine in this series of notes. The notes “included the first published description of a stepwise sequence of operations for solving certain mathematical problems and Ada is often referred to as ‘the first programmer’” ("Ada Lovelace"). In her notes she did not merely make observations; she explained the mathematical function of the machine and the punched-card system it used. In a sense, she documented the software of the Analytical Engine and served as its programmer.


In 1852, at the age of only thirty-six, Ada Lovelace died of cancer. Babbage continued to work on his machine, “but without Ada's advice, support, and sometimes stern guidance, he was not able to complete his long-dreamed-of Analytical Engine” (Rheingold). Ada’s contribution to the development of computing was quite an accomplishment for a woman in the 19th century. Not only did Babbage lose a partner with her passing, but he also lost the sole foundation and drive behind his Analytical Engine.


Works Cited:

"Ada Lovelace." Computer History Museum. N.p., 2008. Web. 14 May 2012. <http://www.computerhistory.org/babbage/adalovelace/>.

Rheingold, Howard. Tools for Thought. 2000. 25-44. Web. 12 May 2012. <http://www.rheingold.com/texts/tft/2.html>.


Wiki Entry 2: Spacewar!

Spacewar, the first video game introduced into the computing world, is one of the most profound inventions, as it opened up a new generation of entertainment. It is also worth pointing out that the market for video games, whether handheld, computer, or console gaming, is growing, and doing so rapidly. All of this can be attributed to the efforts of a young Steve Russell.

Within a few years of Spacewar's creation, by the "mid-sixties, when computer time was still very expensive, Spacewar could be found on nearly every research computer in the country" (Bellis). Its creator, Steve Russell, was a young programmer attending MIT. Inspired by science fiction novels, he dreamt up the design for the game Spacewar, which, unbeknownst to him, would be the start of a large and popular industry.

The game was created in 1962, and development actually "took the team about 200 man-hours to write the first version of Spacewar. Steve Russell wrote Spacewar on a PDP-1, an early DEC (Digital Equipment Corporation) interactive [mini-computer] which used a cathode-ray tube type display and keyboard input" (Bellis). The PDP-1 system "was the first to allow multiple users to share the computer simultaneously. This was perfect for playing Spacewar, which was a two-player game involving warring spaceships firing photon torpedoes. Each player could maneuver a spaceship and score by firing missiles at his opponent while avoiding the gravitational pull of the sun" (Bellis). The multiplayer aspect made the game extremely popular. What would be the fun of video games, or Spacewar for that matter, if you could not contend with your friends and shoot them out of the sky?

The game started with ships maneuvering around, shooting photon torpedoes at each other. But, as with every cool invention, cooler ideas soon surfaced, such as gravity and hyperspace. Hyperspace allowed a player to speed out of a dangerous, and most likely fatal, situation and reappear somewhere random on the map, if he was lucky (Brand). Beyond pure entertainment, the game proved to the world the capabilities of computing technology at the time. Spacewar "was frequently used in demonstrations to potential customers as a way of showing the power of this unusually-small computer. For example, Spacewar! required over 100,000 calculations per second to compute ship motion, gravity, user control inputs and the relative position of stars and the sun and the computer’s CPU sent over 20,000 points per second to the Type 30 display while running the game" (Spacewar!).
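The ship-motion and gravity computation described above can be illustrated with a short sketch. This is a modern, hypothetical Python rendering of the idea only, not the original PDP-1 code; the constants and function name are invented for illustration.

```python
import math

# Hypothetical constants, invented for illustration (not the PDP-1 originals).
G = 1000.0   # gravitational strength of the sun at the center of the screen
DT = 0.02    # simulation timestep, in seconds

def step_ship(x, y, vx, vy):
    """Advance one ship by one frame under the sun's inverse-square pull.

    The sun sits at the origin (0, 0); the acceleration points toward it
    and falls off with the square of the distance, as in the original game.
    """
    r = math.hypot(x, y)    # distance from the sun
    ax = -G * x / r**3      # inverse-square acceleration, x component
    ay = -G * y / r**3      # inverse-square acceleration, y component
    vx += ax * DT
    vy += ay * DT
    return x + vx * DT, y + vy * DT, vx, vy
```

Run once per frame for each ship, this is the kind of per-frame arithmetic that added up to the 100,000 calculations per second quoted above.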

The game was something new and expensive, but as time went on the availability and diversity of gaming grew. The market has grown to the point that nearly everyone has instant access to some form of gaming. Video games have become a great source of entertainment, and over time that entertainment has reached beyond computer programmers and adolescents. Video games are now made for children and adults alike, but we owe our thanks to the original: Spacewar.

Works Cited

Brand, Stewart. "Spacewar." Rolling Stone, 1972. Web. 29 May 2012. <http://www.wheels.org/spacewar/stone/rolling_stone.html>.

Bellis, Mary. "Biography of Steve Russell." About.com. N.p., n.d. Web. 29 May 2012. <http://inventors.about.com/od/sstartinventions/a/Spacewar_3.htm>.

"Spacewar!." Computer History Museum. N.p., n.d. Web. 29 May 2012. <http://pdp-1.computerhistory.org/pdp-1/?f=theme&s=4&ss=3>.


Wiki Entry 3: Virtual Reality

Virtual reality is a computing element that has not yet been fully realized. Years ago it could be found in shopping-mall arcades, but it still has not reached the accessibility and enjoyment of the gaming consoles found in our homes. We hear the phrase a lot, mostly in the gaming realm of computing, but it is also referred to as a “virtual environment”. The two terms mean the same thing, and they simply involve “using computer technology to create a simulated, three-dimensional world that a user can manipulate and explore while feeling as if he were in that world. Scientists, theorists and engineers have designed dozens of devices and applications to achieve this goal” (Strickland). As the phrase suggests, the world into which the user is thrown should appear extremely life-like, and the device should capture body and eye movement as if the user were actually moving about the virtual environment he is placed in.

In this virtual reality, the “user experiences immersion, or the feeling of being inside and a part of that world. He is also able to interact with his environment in meaningful ways. The combination of a sense of immersion and interactivity is called telepresence” (Strickland). The key, then, to a successful virtual reality program is telepresence so life-like that the user could confuse reality with virtual reality. Furthermore, many people associate virtual reality only with sound and visuals, but that covers just two senses rather than all five. A true virtual environment would also engage touch, smell, and taste. There should likewise be a visual angle to everything in a room. Where many video games let you see items from only a limited number of viewing angles, a virtual environment would let you see an item as if it were actually in front of you, tangible and three-dimensional. As Strickland puts it: “if the virtual environment consists of a single pedestal in the middle of a room, a user should be able to view the pedestal from any angle and the point of view should shift according to where the user is looking” (Strickland). On the technical side, any lag between the user turning his head and the picture before him following that movement takes away from the reality of the program. This is called “latency”. “When a user detects latency, it causes him to become aware of being in an artificial environment and destroys the sense of immersion” (Strickland).
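The latency idea above can be sketched as a simple check. This is a hedged illustration, not code from any real VR system; the 20 ms budget is an assumed round figure for the point at which users begin to notice lag, and the function name is invented.

```python
# Hypothetical "motion-to-photon" budget: the assumed delay, in
# milliseconds, beyond which a user starts to notice the lag.
LATENCY_BUDGET_MS = 20.0

def immersion_broken(head_moved_at_ms, frame_displayed_at_ms):
    """Return True if the display lagged the head movement enough to notice."""
    latency = frame_displayed_at_ms - head_moved_at_ms
    return latency > LATENCY_BUDGET_MS
```

Under this assumption, a frame arriving 35 ms after a head movement would break immersion, while one arriving 10 ms after would not.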

There have been elements of virtual reality introduced to the world, but nothing that can totally immerse the user in confusion between what is real and what is virtually real. With the Wii and the Xbox Kinect, we are slowly moving toward a system that ties together the elements of a virtual environment. Eventually a system will be made that is affordable, accessible, and awe-inspiring. But the simple question remains: can you really take what is real and imitate it one hundred percent? It seems that in order to do so, our conscious knowledge that the environment is fake would have to be put to sleep.

Work Cited

Strickland, Jonathan. "How Virtual Reality Works." How Stuff Works. N.p., n.d. Web. 22 June 2012. <http://electronics.howstuffworks.com/gadgets/other-gadgets/virtual-reality7.htm>.


Wiki Article: Bill Gates: His Success and Contribution to Society

Bill Gates is known around the world for his huge contribution to our modern daily computing and lifestyle. The name is so well known that one does not need to know computers to know who Bill Gates is, and it might help that he is one of the richest men alive. But all of this, his money and his renowned name, is due to his diligence, hard work, and motivation to do something with the personal computer. That something is Microsoft, found almost everywhere under the ubiquitous names Word, Excel, PowerPoint, Outlook, Access, and Windows. The influence of Microsoft is found in word processing software, internet browsing, and even video games. All of this can be associated with an entrepreneur who took advantage of the beginning of the computer era and grabbed hold of his dreams to pursue what would become one of the largest corporations known today.

Bill Gates was born on October 28, 1955 in Seattle, Washington. Gates got an early start with computers: “he had an early interest in software and began programming computers at the [very early] age of thirteen” (Bellis). As a teenager he learned to program using “a commercial time-sharing system on which his high school rented time. He and his close friend, Paul Allen, two years his senior, discovered a mutual passion for programming. They also shared a strong entrepreneurial flair from the very beginning: When Gates was only sixteen, long before the personal-computer revolution, the two organized a small firm for the computer analysis of traffic data, which they named Traf-O-Data” (Campbell p. 215). In 1973, Gates entered Harvard University to pursue a career in law but would soon learn that his heart was with computers. It was at Harvard that he laid the foundations of his notable career.

Gates was a computer nerd at the right time in history. Personal computers were just becoming popular among adolescents and computer hobbyists; if one was to be interested in computers, this was the time to be so. Furthermore, “[the] personal computer has been of such sweeping global importance that no book on the history of the computer could properly ignore it” (Campbell p. 207). For the modern generation, thinking of computers as huge, intricate machines that filled an entire room is like looking back into history. Young people today are not used to living without these modern devices all around us. We find them in every home and every office, in the form of laptops, iPods, Kindles, and cell phones. Computer technology has had an impact on the world in more ways than can be counted, and personalizing the technology to be available and affordable for the common man is a profound advancement of our modern time. Like any science, computing was open to extensive research and advancement, which led to the industry we know so well today, opened the door to competition and the mass production of personal computers, and gave way to an intellectual leap in human history.

Though the personal computer became popular and domestic only relatively recently, the intellectual leap from the monster-sized computer to the personal computer required both a change in technology and software and a futuristic perception of the innumerable possibilities that personal computers would open the door to; and we have entrepreneurs to thank for that. “In retrospect, the most important of the early software entrepreneurs was Bill Gates… [although] his ultimate financial success has been almost without parallel; his background was quite typical of a 1970s software nerd.” Gates, while attending Harvard for law, continued to program computers. Most days he slept in; he lacked social skills and was “oblivious to the wider world and the need to gain qualifications and build a career” (Campbell p. 215). But that did not matter, because Gates was doing what he loved most: working with computers. This software-nerd characterization of Gates “contains an essential truth; nor was it a new phenomenon—the programmer-by-night has existed since the 1950’s. Indeed, programming the first personal computers had many similarities to programming a 1950s mainframe: There were no advanced software tools, and programs had to be hand-crafted in the machine’s own binary codes so that every byte of the tiny memory could be used to its best advantage” (Campbell p. 215).

At the start, slow changes in technology, such as the arrival of the microprocessor, attracted computer hobbyists to buy computer parts and build and program machines themselves. As the technology steadily improved, computing as a hobby grew too, attracting more people with their curiosity and wonder at the possibilities posed by computer technology. Even personal computers were originally built for the few who deeply understood them, as they were so intricate and complex. Although the common individual without any computer experience might be able to play a game on one, little beyond that was easy to comprehend until advances in software made it possible for anyone to sit down and actually enjoy the machine. The attraction and attention that the little personal computer received in just a few years made it popular.

This is what Gates took advantage of. With the new changes in computing technology, personal computers required (and still require) software to be written for them before they can be of any use. While Gates was still attending Harvard, Micro Instrumentation and Telemetry Systems (MITS) released its Altair 8800 microcomputer, which became a turning point in the lives of both Bill Gates and Paul Allen. It was the first microprocessor-based computer, and because of its extremely low price it is sometimes considered the first personal computer. Seeing the chance to develop software for the Altair, Gates and Allen proposed writing a BASIC system for the Altair 8800 to MITS. “The Altair 8800 was unprecedented and in no sense a ‘rational’ product—it would appeal only to an electronics hobbyist of the most dedicated kind, and even that was not guaranteed”; hence the reason for developers to come along with software to make the microcomputer more attractive. Furthermore, “the limitations of the Altair 8800 created the opportunity for small-time entrepreneurs to develop ‘add-on’ boards so that the extra memory, conventional teletypes, and audiocassette recorders…could be added to the basic machine. Almost all of these start-up companies consisted of two or three people—mostly computer hobbyists hoping to turn their pastime to profit” (Campbell p. 214). This, as one can guess, was the start of Microsoft.

Within the same year that MITS released the Altair 8800, Allen and Gates delivered their version of the BASIC system to the company. The two had formed a partnership (Microsoft), and Allen even became the director of software for MITS. “The Altair 8800, and the add-on boards and software that were soon available for it, transformed hobby electronics in a way not seen since the heyday of radio. In the spring of 1975, for example, the ‘Homebrew Computer Club’ was established in Menlo Park…besides acting as a swap shop for computer components and programming tips, it also provided a forum for the computer-hobbyist and computer liberation cultures to meld” (Campbell p. 216).

Computer liberation was a phenomenon mostly associated with the same crowd involved in the hippie movement. Its adherents wanted computing technology to be available to them no matter what means it took to make that happen, much like the software and copyright piracy of today. It “sprang from a general malaise in the under-thirty crowd in the post-Beatles, post-Vietnam War period of the early 1970s. There was still a strong anti-establishment culture that expressed itself through the phenomena of college dropouts and campus riots, communal living, hippie culture, and alternative lifestyles sometimes associated with drugs. Such a movement for liberation would typically want to wrest communications technologies from vested corporate interests” (Campbell p. 212). The idea was to bring computing technology to the ordinary individual and out of the large bureaucratic and corporate offices. For this to be possible, computers would need to become affordable and software would have to be easy to use. Money and intricate programming separated the elite computer guru from the common man, something computer liberators wanted to change. As mentioned above, however, computer liberators would pursue any route to make this happen, even if it meant creating a black market of stolen software.

Gates addressed the issue of computer liberation when MITS held a world-wide conference. The Altair 8800 microcomputer was growing very popular; in just the first few months of 1975, after its release, over $1 million worth of orders came into MITS for the product. At the conference Gates, as co-developer of Altair BASIC, delivered a speech. He “launched a personal diatribe against hobbyists who pirated software. This was a dramatic position: He was advocating a shift in culture from the friendly sharing of free software among hobbyists to that of an embryonic branch of the packaged-software industry.” In other words, Gates was trying to protect his and Allen’s interests: the development of software through their firm, Microsoft. If pirating were to continue in the software market, it would drive revenues down. Furthermore, many firms were already entering the market, as many individuals saw the same opportunity that Gates and Allen had. More firms meant greater competition, and Gates had to protect what he had established. He “encountered immense hostility—his speech was, after all, the very antithesis of computer liberation. But his position was eventually accepted by producers and consumers, and over the next two years it was instrumental in transforming the personal computer from a utopian ideal to an economic artifact” (Campbell p. 216).

And this is exactly what happened. The personal computer became a huge hit, and its popularity grew as more and more companies entered the market with their own products and software. Companies like Apple and IBM began to produce personal computers. The computer hobbyists were making money, and the computer liberators were seeing their dreams come true. “While it had taken the mainframe a decade to be transformed from laboratory instrument to business machine, the personal computer was transformed in just two years” (Campbell p. 217). One reason for the quick establishment of the personal computer on the market was that people wanted it, and when society makes the demand, firms will supply. But it was also much easier to get the personal computer rolling in the market than the mainframe, because “most of the subsystems required to create a personal computer already existed: keyboards, screens, disk drives, and printers. It was just a matter of putting the pieces together…Within months of its initial launch at the beginning of 1975, the Altair 8800 had itself been eclipsed by dozens of new models produced by firms such as Applied Computer Technology, IMSAI, North Star, Cromemco, and Vector” (Campbell p. 217).

The personal computer became more personal when individuals could not only understand it better but also use word processors and play video games on it, turning the personal computer into more of a domestic item than a strictly business one. Word processing software came along a little later, because the early personal computers could display only forty uppercase letters on the screen at once. It is also important to note that “computer games are often overlooked in discussions of the personal-computer software industry, but they played an important role in its early development. Programming computer games created a corps of young programmers who were very sensitive to what we now call human-computer interaction. The most successful games were ones that need no manuals and gave instant feedback. The most successful business software had similar, user-friendly characteristics” (Campbell pp. 222 and 224). The concept was the same: make an item user-friendly and people will buy it. The concept also carried into schools and higher education, as software could teach math, language, and other subjects, and to this day technology in the classroom is a growing need.

Gates was in a good spot to begin with, but his ship came in when IBM approached Microsoft to build software for its personal computer line. “Although IBM was the world’s largest software developer, paradoxically it did not have the skills to develop software for personal computers. Its bureaucratic software development procedures were slow and methodical, and geared to large software artifacts; the company lacked the critical skills needed to develop the ‘quick-and-dirty’ software needed for personal computers” (Campbell p. 226). In 1980, IBM approached Gates and Allen with the request that Microsoft build an operating system for IBM’s personal computer. “Gates obtained a suitable piece of software from a local software firm…for $30,000 cash and improved it. Eventually, the operating system, known as MS-DOS would be bundled with almost every IBM personal computer and compatible machine, earning Microsoft a royalty of between $10 and $50 on every copy sold.” The IBM contract launched Microsoft further into business success than either Gates or Allen could have imagined. “Over the next decade, buoyed by the revenues from its operating system for the IBM personal computer, Microsoft became the quintessential business success story of the late twentieth century, and Gates became a billionaire at the age of thirty-one. Hence, for all of Gates’s self-confidence and remarkable business acumen, he owed almost everything to being in the right place at the right time” (Campbell pp. 226-227).

Microsoft has made a huge impact on society. The development of the personal computer is of course credited for society’s technology leap, for it is the more visible and tangible item placed before us, but without the development of software the personal computer would be absolutely useless. Through the years, Microsoft released MS-DOS, Windows 1.0-3.0, Windows 95-98, Windows 2000, Windows XP and Vista, and most recently Windows 7. The Windows operating systems have become something desirable, not just an item pre-loaded on the personal computer and shipped to your house; people update their old software as soon as new versions are released. In fact, “by the fall of 2010, Windows 7 is selling seven copies a second—the fastest-selling operating system in history” (“A History of Windows”). Microsoft has also had its impact in the realm of video games, both in the Xbox line and in personal computing. Furthermore, the line of Microsoft products is ubiquitous on almost every computer, and we rely on them in our households, the workplace, and the educational system. We do our homework, write financial reports, email our friends and colleagues, browse the web, entertain ourselves with music and video, and chat with loved ones across the globe, all on Microsoft products, and that is only to name a few of the successes that have come of the corporation’s efforts. It may seem that Gates was at the right place at the right time, but it could also be said that IBM found the best and most knowledgeable software man at a precise moment in computer history.

Works Cited

"A History of Windows: Highlights from the first 25 years." Windows. Windows. Web. 22 June 2012. <http://windows.microsoft.com/en-us/windows/history>.

Bellis, Mary. "Bill Gates-Biography and History." About.com. Web. 22 June 2012. <http://inventors.about.com/od/gstartinventors/a/Bill_Gates.htm>.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Harper Collins, 1996. Web. 20 June 2012. <http://history.msu.edu/hst250/files/2009/04/computer-a-history-of-the-information-machine-ch10.pdf>.