Jordan Leven

From Hst250

Wiki Entry One

While almost everyone uses computers in the modern era, few understand how a computer works and stores data. The digital nature of the computer creates a façade that hides its mechanical inner workings and the relationship between data processors and electrons. The mystery surrounding binary data processors began with the creation of a cathode ray tube (CRT) storage device that paved the road to modern computing.

The Williams Tube is widely regarded as a primeval storage device and was one of the first technological advances that allowed for digital computation. Invented by Freddie Williams and Tom Kilburn at the University of Manchester in England, it stored binary information as patterns of electric charge that more elaborate computing algorithms would later build upon. It deviated from mechanical instruments of computing by manipulating electric charges, and it would later inspire the creation of dynamic random access memory (DRAM). This type of RAM is intrinsic to advanced computers because it allows for the temporary storage of data. As opposed to ROM (read-only memory), RAM holds interim data that has not been permanently written to a hard disk or SSD.

The execution of this “tube” was not without fault. With age, it grew unreliable and deteriorated under the constant bombardment of electrons. The tube relied on negative charges “drawn” onto an area of its face. A dot or dash (representing a zero or one) would be fired onto the tube, creating a “potential well” around the area of the dot or dash that would hold a negative charge for a few milliseconds. Before the charge completely disappeared or was erased by another marking, a metal plate would read the dot or dash by registering a change in the voltage perceived on the plate itself. After reading the negative charge, the beam redrew the spot, creating a new “potential well” and refreshing the data before it could fade.
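The store-and-refresh cycle described above can be sketched as a toy simulation. Everything here (the class name, the decay timing, the units) is invented for illustration and is not taken from the original design; the point is only the pattern of a fading charge that must be read and rewritten before it disappears:

```python
DECAY_STEPS = 3  # the well's charge lasts a few "milliseconds" (arbitrary units)

class WilliamsTubeBit:
    """One spot on the tube face: a dot (0) or dash (1) held as a fading charge."""

    def __init__(self, value):
        self.value = value              # 0 or 1
        self.charge_left = DECAY_STEPS  # how long until the well fades away

    def tick(self):
        """One time step passes: the potential well's charge fades a little."""
        if self.charge_left > 0:
            self.charge_left -= 1

    def read_and_rewrite(self):
        """The pickup plate senses the charge, then the beam redraws the spot."""
        if self.charge_left == 0:
            raise RuntimeError("charge decayed before refresh -- data lost")
        self.charge_left = DECAY_STEPS  # rewriting restores the well
        return self.value

bit = WilliamsTubeBit(1)
bit.tick()
bit.tick()
print(bit.read_and_rewrite())  # refreshed in time, so the value survives
```

The same read-then-rewrite discipline survives in modern DRAM, where every cell must likewise be refreshed on a fixed schedule before its charge leaks away.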

Of course, this early technological advance was short-lived and relatively infantile. The tube originally stored only between 512 and 1024 bits of data, more than a thousand times less than anything close to current standards. In 1947, the tube successfully held 2048 bits of data for a few hours. The tube was nevertheless used on several occasions, including in the IAS machine (in Princeton, New Jersey), a computer with a forty-bit word that could store up to 1024 words. It was also used in the IBM 701 and 702 computers until Moore’s Law succeeded in labeling the tube as obsolete. In 1955, the tube was outdone by a new (and cheaper) random access memory called “magnetic core store.” Due to this new technology, the tube was interred in the technological graveyard.
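The capacities above are easy to put in perspective with a little arithmetic, using the 2048-bit figure from the text and a decimal gigabyte:

```python
# Rough arithmetic on the capacities mentioned above (2048 bits from the text).
tube_bits = 2048                       # the improved tube of 1947
tube_bytes = tube_bits // 8            # eight bits per byte
gigabyte = 10**9                       # one modern (decimal) gigabyte, in bytes
tubes_per_gb = gigabyte // tube_bytes

print(tube_bytes)     # 256 -- the whole tube held a quarter of a kilobyte
print(tubes_per_gb)   # 3906250 -- nearly four million tubes per gigabyte
```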

The legend of the Williams Tube remains active today; as late as 1998 the term was still used in the American lexicon to describe storage devices built on the design of the Williams and Kilburn patents.

The Williams Tube. <> Web. 29 May 2010.

Ibid.

Williams Tube. <> Web. 29 May 2010.

Ibid.

The Williams Tube. <> Web. 29 May 2010.

Wiki Entry Two - Apple II

Before Apple was the slick and hip company introducing revolutionary products like the iPhone or iPad, it abided by the same clunky technological standards as other companies. The Apple II was Apple's first consumer-grade computer, improving on the Apple I with a professional-looking monitor and a plastic body, and it set a foundation for its eventual successor, the Macintosh. The Apple II is widely regarded as the first “buyable” computer; its hard plastic case stood in juxtaposition to the wood frame of the original Apple I. The Apple II, however, is significant not only for its technological advances: it also set a precedent for the rising technology company to create newer, more sophisticated technologies while inadvertently making older ones obsolete.

Introduced in 1977 at the West Coast Computer Faire, the Apple II propelled the computer age into a new realm, spawning new learning areas called “computer labs” that were designed to be hyper-functioning rooms of dynamic learning using teaching software (Cavanaugh). Apple quickly realized that reaching children at an early age would establish brand loyalty, and for that reason it set out for the schools. The education-institution supplier Bell & Howell reached a deal with Apple to rebrand the computers under the Bell & Howell name and began to supply schools with Apple IIs painted jet black, which subsequently earned the nickname “Darth Vader” (History of Apple, 2010). The standard Apple II shipped with four kilobytes of RAM, which by today’s standards is a mere sliver of memory (Rothman, 2009). The 1.023 MHz processor, combined with sound and a sixteen-color display, made the Apple II a must-have for businessmen, schools and wealthy families (Rothman, 2009). The read-only memory (ROM) carried a built-in BASIC interpreter, and the machine ran the DOS 3.1 and DOS 3.2 operating systems from floppy disks.

The Apple II was later succeeded by a series of updates and revisions. The Apple II Plus, Apple IIe, Apple IIc, and Apple IIgs, along with the Apple Lisa, were all descendants of the original machine (Apple Museum). These later revisions incorporated new technological standards and eventually hosted classic video games like “The Oregon Trail” (The History of Apple, 2010).

The Macintosh, introduced in 1984, began to cannibalize sales of the Apple II as the company rebranded itself, contending that the Macintosh was the most user-friendly computer available. The Apple IIgs sold until the end of 1992, while the entire Apple II series was put to an end in 1995 with the discontinuation of the Apple IIe card (The History of Apple, 2010).

However, the Apple II isn’t entirely in the technological graveyard. It quickly became a collector’s item and, in some cases, remains a functioning computer in computer labs and schools across the country. The “cult of Mac” that Apple Inc. has created over the past thirty years has primed its followers to crave new products while savoring the old ones. For that reason, the Apple II is destined to be remembered forever as the original leap for Apple Inc.

Works Cited

The Apple Museum “Apple II”<> Web. 2010, June 13.

Cavanaugh, Chris. “The Apple II.”<> Web. 2010, June 13.

The History of Apple. “The Apple II.”<> Web. 2010, June 13.

Rothman, Wilson. “Apple II: The World Catches On.”<> Web. 2010, June 13.

Long, Tony. “June 5, 1977: From a Little Apple a Mighty Industry Grows.”<> Web. 2010, June 13.

Wiki Entry Three - Cloud Computing

In recent years, cloud computing has become more prevalent than ever. The internet, referred to as the “cloud,” is the network of all computers that allows for nearly unlimited acquisition of information and manipulation of data. With the arrival of the internet, the ability to store information on a network, as opposed to locally on individual computers, is a sensible advance that will inevitably change the way we work on computers.

Companies like Yahoo, Google, Apple, Amazon and Microsoft run native applications almost exclusively on the web to create a cohesion that is superior to running applications locally. In this way, a customer in California is able to buy an item in the same way a customer in Maine does. Similarly, business enterprises are able to share a global interface for computing by networking computers across the world to the same servers. By doing so, these companies easily achieve the centralization of their information. However, cloud computing still provokes a level of distrust among the consumers who use it.

In a forum discussing the trust consumers are required to wager in cloud computing, National Public Radio’s Morning Edition introduced the subject by stating, “Microsoft began selling a new version of its office software, and notably, it allows users to store documents on the Web rather than on their personal computers...With the move, Microsoft is going further into or onto - whichever you choose - the cloud. That's cloud computing, a trend that is rapidly transforming the way we work in the office and at home” (Cloud Computing Relies On Consumer Trust). This means that a Microsoft Office user is able to upload a document to Microsoft’s servers and have multiple editors work on the same document simultaneously. Apple extended this service to its consumers with a site that allowed iWork users to publish their documents onto the web and likewise permitted editing by multiple authors. This quintessential example of document cloud computing is second only to the newest idea: cloud computing of operating systems. Google’s upcoming Chrome operating system is “designed to allow computers to boot up to the Web within seconds...Users of devices running Chrome will have to perform all their computing online or 'in the cloud,' without downloading traditional software applications...Devices running Chrome will receive continuous software updates, providing added security, and most user data will reside on Google's servers” (Stone). This would change the localization of all computers electing to use Chrome, and it would effectively simplify the examination and troubleshooting of software glitches within Chrome. However, it would also limit a computer’s ability to remain offline, since even booting up would require an internet connection.
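The shared-document idea described above can be sketched with a deliberately tiny in-memory “server.” All the names here (`CloudServer`, `upload`, and so on) are hypothetical and illustrate only the concept: the document lives in one central place, so edits from different people land on the same copy rather than on diverging local files. This is not the API of Microsoft's, Apple's, or Google's actual services:

```python
class CloudServer:
    """A toy central store: documents live here, not on any one machine."""

    def __init__(self):
        self._docs = {}  # document name -> current text

    def upload(self, name, text):
        self._docs[name] = text

    def edit(self, name, transform):
        """Apply an edit to the single shared copy of the document."""
        self._docs[name] = transform(self._docs[name])

    def download(self, name):
        return self._docs[name]

server = CloudServer()
server.upload("report.txt", "Draft one.")
# Two "editors" in different places modify the same shared copy.
server.edit("report.txt", lambda t: t + " Edited in California.")
server.edit("report.txt", lambda t: t + " Edited in Maine.")
print(server.download("report.txt"))
# Draft one. Edited in California. Edited in Maine.
```

Real services add authentication, conflict resolution for truly simultaneous edits, and replication across data centers, but the centralization shown here is the core of the trade-off the article describes: convenience in exchange for trusting the server's operator.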

The concept of cloud computing will likely take time to become omnipresent. George Rudoy, director of global practice technology and information services at Shearman & Sterling, says, “In some European nations, for example, strict privacy laws mean that data cannot be taken outside the country. ‘In the cloud, your data could be on 1,000 servers all over the world’” (Cohen). International laws conflicting with the concept of cloud computing would put a strain on international relations and therefore hinder its ability to become the standard technology.

Although cloud computing is becoming a rather common technology in the form of email and document editing, it will likely not become mainstream for operating systems in the immediate future. Until issues of security, law and ubiquitous internet connectivity are put to rest, cloud computing will likely be limited to relatively light functions until our current technological era can be looked back on as the status quo ante.

Works Cited

"'Cloud Computing' Relies On Consumer Trust.(11:00-12:00 PM)(Microsoft)(Broadcast transcript)(Audio file)." Morning Edition. National Public Radio, 2010. NA. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries.<>. Web. 2010, June 25.

Cohen, Alan. "Business as usual? At first glance, there appears to be little new in the litigation software arena. But change lurks beneath the surface."  American Lawyer. 32. 2 (Feb 2010): 48(3). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. <>. Web. 2010, June 25.

Markoff, John. "U.S. scientists given access to cloud computing.(National Desk)."  The New York Times. (Feb 5, 2010): A17(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. <>. Web. 2010, June 25.

Stone, Brad. "Test Flights Into the Google Cloud.(Money and Business/Financial Desk)(PING)."  The New York Times. (May 9, 2010): 3(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. <>. Web. 2010, June 25.

Zittrain, Jonathan. "Lost in the cloud.(operating system)(Editorial Desk)(OP-ED CONTRIBUTOR)."  The New York Times. 158. 54742 (July 20, 2009): A19(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. <>. Web. 2010, June 25.

Wiki Article One


Cellular data has become omnipresent in modern cellular standards. Carriers compete with each other, each contending that its speed or coverage is superior to other mobile providers’. The use of cellular data has grown exponentially alongside the phones that provide the user interface between the mobile platform and the carrier network. What is in store for the future of cellular data and its role in future computing and the mobile marketplace?


In the modern era of technology, nothing has hit the market like mobile platforms. Over sixty-five percent of America is “plugged in” to the mobile phone obsession (Dell). With the arrival of Apple’s iPhone and HTC’s Droid, the use of cellular data has come to define the smartphone. With the introduction of the new iPhone 4 and its video-chatting capability, the future of cellular data networks has become murky, begging the question of how AT&T will charge for the use of this new feature. It is likely that cellular data will replace minutes altogether, and perhaps replace localized wireless networks with widespread networks spanning states at a time.

Beginnings of cellular networks

The evolution of the cellular network began with 1G infrastructure in the mid 1980s (Hitchings). As Hitchings states, “The advent of pre-pay tariffs in the mid-1990s and subsidised [sic] prices brought mobile phones to the mass market, especially the youth sector, who made them fashion accessories with the advent of personalized [sic] ringtones and fascias.” As the demand for mobile phones grew, so did our expectations. The introduction of data service in the mid 1990s allowed for access to the internet. However, this data was limited by the user interface of the phone and the relatively high price of access at the time. Now, a carrier’s data network has become an important facet of the features it offers. The data network accounts for a considerable percentage of a carrier’s revenue, and the potential for growth is incontrovertible. According to John Markoff, “The data revenue for American cellular carriers grew at an annual rate of more than seventy percent in the first half of [2007]. In the third quarter, Verizon Wireless, Cingular [AT&T] and Sprint each crossed $1 billion a quarter in data revenue for the first time. They were ranked fourth, fifth and seventh in the world, in data revenue, for the first nine months of the year” (Markoff). The profitability of wireless data networks gives these companies an incentive to develop high-powered networks to meet the demand for consumption of information on the go. This profit-driven expansion has led to multiple exclusive networks that provide customers with access to the internet as long as they have a signal from a nearby cell tower.

Network speeds

With a newfound desire for omnipresent access to the internet, data networks began to expand. A new problem emerged: network speed was too slow. Even today, the speeds a user experiences on Wi-Fi are very fast compared to the speeds over a typical wireless data network. In fact, an app (a native mobile application) emerged to solve this problem: running on Android, Nokia and Windows Mobile platforms, WeFi would attempt to attach to a free and open Wi-Fi network whenever possible in order to provide a faster internet experience (Furchgott).

As the demand for cellular data increased, wireless companies were forced to increase their bandwidth. This resulted in Sprint’s introduction of America’s first 4G-compatible network. Nancy Friedrich states in the article “Wireless Demands Focus Designers On Integration,” “As 3G wireless networks are completed and a transition is being made to 4G systems, the use of distributed architectures and active antenna systems is driving the need for smaller and more efficient transceiver and implementations” (Friedrich). In this way, the future points to faster and more efficient wireless cellular connections to the internet. An upcoming generation of cellular data units will likely break into the terahertz range. At Technology Review, author Kate Greene writes that researchers at the University of Utah “...have found a way to control terahertz radiation...laying the foundation for a new breed of wireless devices that can take advantage of the previously untapped frequencies. Although still years from commercialization, routers...could eventually pack more data onto airwaves, speeding up wireless Internet links a thousand times,” also citing the fact that “[m]ost wireless gadgets use radiation in the microwave frequency; Wi-Fi, for instance, operates at 2.4 gigahertz” (Greene). These speeds could possibly result in a nationwide frequency that provides quick, nearly universal access to the internet. This opens up the option of cloud computing.
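A little arithmetic gives a rough sense of what these generational speed differences mean in practice. The rates below are assumed typical values chosen for illustration (they are not measurements, and only the general ordering of the generations is taken from the text):

```python
# Back-of-the-envelope download-time comparison across network generations.
# All speed figures are illustrative assumptions, not measured values.
speeds_mbps = {
    "EDGE (2G)": 0.2,
    "3G": 2.0,
    "4G": 4.5,
    "Wi-Fi": 20.0,
}

file_mb = 5.0  # a 5-megabyte file, e.g. one song
for network, mbps in speeds_mbps.items():
    seconds = file_mb * 8 / mbps  # megabytes -> megabits, then divide by rate
    print(f"{network}: {seconds:.0f} s")
```

At these assumed rates the same file takes minutes over EDGE but only a couple of seconds over Wi-Fi, which is exactly the gap that apps like WeFi, mentioned above, tried to exploit.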

The utilization of technology

The best example of a piece of technology that utilizes the power of cellular data networks is the iPhone. The iPhone has two major features that allow multiple applications to function: internet access, from either a Wi-Fi hotspot or AT&T’s EDGE or 3G network, and a mobile phone. The iPhone seamlessly synchronizes information such as calendar events and contacts with a computer and allows instant access to almost unlimited information. Without connectivity to the internet, the iPhone wouldn’t be nearly as functional. In fact, the iPhone’s allure may inevitably lead to its demise. A National Public Radio piece recently quoted associate editor Matt Buchanan as stating that “[t]he all-you-can-eat buffet can only last so long because, you know, when you're talking about millions of phones downloading gigabytes of data a month, that's a huge infrastructure problem. And AT&T has just been hit the hardest by it because they've had the phone that's been most attractive for that kind of thing.” This argument applies to other mobile platforms as well, like Android devices and RIM’s BlackBerry. What began as a dip into mobile communications has resulted in a need for data connectivity and a quick change of standards. Phones are no longer rated solely on their ability to make calls; they are now held to a standard of ease of use and an ability to interact and deliver information quickly and efficiently.

Battery performance with high capacity networks

The last hindrance to mobile data is the toll that data processing takes on the battery. Edward C. Baig of USA Today says, “Sprint says peak download speeds can exceed 10 Megabits per second (Mbps), with an average of 3 to 6 Mbps. Mine topped out at 4.3 Mbps but...several readings were in the one to three Mbps range. [Soon after], I received my first low-battery warning and had to search for a power outlet. Indeed, as with many cellphones, you'll have to manage battery use carefully” (Baig). The data network cannot be utilized to its full potential unless batteries can accommodate this large consumption of data for prolonged periods. The future of carrier data networks remains largely uncertain; however, the future of cloud computing, which forgoes a computer’s local storage of system software and documents in favor of accessing the information remotely via an internet connection, depends on a fast and omnipresent network whose bandwidth is comparable to that of Wi-Fi. This probable evolution will lead to the dismissal of the carrier minutes associated with mobile phones, replaced simply with data usage. New apps like Skype already allow for mobile calling without using minutes. This system circumvents the carrier’s voice network by combining Voice over Internet Protocol (VoIP) with cellular data. Carriers like AT&T and Verizon have expressed interest in changing the measurement system, which currently charges per minute, to something else.


Just as cellular networks have changed the progress of humankind, the evolution of carrier data networks will likely expand into a currently unforeseeable realm. It is sensible to believe that the global demand and consumption of data will increase, spawning new generations of wireless frequencies that will replace the need for localized wireless networks.

Works Cited

"AT&T Institutes Wireless Data Cap.(11:00-12:00 PM)(Broadcast transcript)(Audio file)." Morning Edition. National Public Radio, 2010. NA. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Baig, Edward C. "Sprint HTC Evo 4G puts pedal to metal, but its battery poops out fast.(MONEY)(Product/service evaluation)."  USA Today. (May 27, 2010): 03B. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 30 June 2010 <>.

Dell, Kristina. "The Spy in Your Pocket.(Technology)(security of cellular telephone data)." Time. 167. 13 (March 27, 2006): 45. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Furchgott, Roy. "App of the Week: Find and Connect to Wi-Fi Hot Spots.(Personal Tech)(Brief article)." The New York Times. 158. 54703 (June 11, 2009): B8(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Greene, Kate. "The Ultrafast Future of Wireless." Technology Review. Web.

Hitchings, Charlotte. "Future of 3G is written in mobile phone history." New Media Age (2001): 32. Expanded Academic ASAP. Web. 29 June 2010.

Markoff, John. "A Personal Computer To Carry In a Pocket.(Business/Financial Desk)." The New York Times. (Jan 8, 2007): C1(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Hansell, Saul. "Mobile Hot Spots In Lieu of Phones.(Business/Financial Desk)(BITS)." The New York Times. 158. 54763 (August 10, 2009): B6(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Techbits. "The History of the Mobile Phone." Web.

"Wireless Firms Prepare New Data Services for Cellular Phones." Dallas Morning News (Dallas, TX). (May 21, 2002): NA. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 28 June 2010 <>.

Wiki Article Two


The advent of Moore’s Law raises questions about the turnaround time from new technology to obsolete technology and about the eventual pursuit of progress. As a result, technology has become a more competitive industry, one marked by the cannibalization of technologies: devouring each other as soon as a past technology is declared outdated.


Humankind has existed for over two hundred thousand years. What began as a primeval species has evolved into the dominant genus of our modern world. Evidently there is a particular facet of humankind that has enabled this impressive utilization of knowledge to push the species forward. That facet is progress. Leo Marx is quoted on this subject in the book Technology and the Future: “Does improved technology mean progress? Yes, it certainly could mean just that. But only if we are willing and able to answer the next question: progress towards what?” (Teich). Gordon Moore, originator of Moore’s Law, believed that improved technology was indicative of progress leading to a formidable technological future. What he didn’t explain was how the technology industry would inevitably become cannibalistic by constantly reevaluating what sits on the pedestal of the cutting edge. The constant demotion of the newest technology to obsolete technology is the result of an ever-changing technology world.

History of Moore’s Law

In 1965, the first computers based on integrated circuits were beginning to appear. Gordon Moore, a wunderkind in the semiconductor business, was asked what he thought of the silicon chip and its position in the future. Decades later, the eventual cofounder of Intel has become lionized in the technology community. His law, Moore’s Law, predicts that “...the number of transistors on a chip will double every two years” (Lyons). As Daniel Lyons, author of “Moore's Law Doesn't Matter,” states, “Back in 1965, Intel cofounder Gordon Moore predicted that the semiconductor industry could double the number of transistors on a chip every 12 months (he later amended it to 24 months) for about the same cost” (Lyons). Moore’s Law has since been applied not only to microprocessors but also to hard disk and flash memory capacity, and in these regards Moore was not far off.
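Moore's prediction, as quoted above, can be written as a one-line doubling formula. The 1971 starting year and the 2,000-transistor starting count below are hypothetical round numbers chosen for illustration, not figures from the text:

```python
def transistors(start_count, start_year, year, doubling_years=2):
    """Project a transistor count forward under Moore's doubling rule."""
    periods = (year - start_year) / doubling_years
    return start_count * 2 ** periods

# Ten doublings in twenty years: a thousandfold increase.
print(transistors(2000, 1971, 1991))  # 2048000.0
```

The striking property is that the growth compounds: twenty years of the rule multiplies the count by 2^10 = 1024, and forty years by more than a million.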

In 2007, Apple introduced the iPhone. Offered in four- and eight-gigabyte capacities, it was considered a pinnacle of technological advancement. In 2008, Apple revamped its already popular iPhone and introduced the iPhone 3G. Apart from the physical redesign, the iPhone was now offered in eight and sixteen gigabytes: double the previous capacities. In 2009, Apple created the iPhone 3GS, available in sixteen and thirty-two gigabytes; again, exactly double the previous standards. While the iPhone 4, introduced in 2010, didn’t follow the path of doubling storage capacity, it did double the random access memory (RAM) specification. Over this four-year period, one product continually cannibalized itself.
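The progression just described is exactly a doubling sequence, which a few lines can verify (the top-model capacities are the figures from the text):

```python
# Top-model iPhone storage by year, in gigabytes (figures from the text).
capacities = {2007: 8, 2008: 16, 2009: 32}

years = sorted(capacities)
for prev, curr in zip(years, years[1:]):
    # Each year's flagship holds exactly twice what the previous one did.
    assert capacities[curr] == 2 * capacities[prev]

print("each year's top capacity doubles the previous year's")
```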

The reason for Moore’s Law

Moore’s law doesn’t specifically advocate for the cannibalization of past technologies. Rather, it predicts the average turnaround time for outdating technology through the doubling of transistors on a silicon chip. Lyons elaborates on the issue, explaining that “[t]he trouble isn't capacity; it's speed...You can't make them faster, or they overheat and start to melt. To solve that problem, the industry began making chips that do several tasks at once, instead of doing a single thing faster and faster. These days we're seeing dual-core and quad-core chips--in essence, processors with two or four tiny computer engines on a single chip. Within a decade we will likely see chips with 100 cores…” (Lyons). In this way, the perspective from which we view the law changes: not the computation power of a single core increasing, but rather the number of cores per chip.

The future of Moore’s Law

An argument against Moore’s law is simply statistical. Since Moore’s law requires an exponential increase in computing power year after year, the speed of a computer would theoretically approach infinity. The computing power of a chip approaching infinity is rather unlikely, and in many ways suggests a migration from the silicon chip to something else. The relative inefficiency of semiconductors, their overheating and energy consumption, suggests that Moore’s law would be reset when a new type of processor succeeding silicon is introduced. Moore himself speaks of the cannibalistic and competitive nature of technology in an interview with the New York Times’ Matt Richtel. He states that consumers’ frustration with the continual reclassification of new products as obsolete is “...a real problem. The industry has established the fact that if you wait a year you can get something that delivers higher performance and lower cost. But by doing that you never take advantage of what's here. I think it's a byproduct that is built into a technology that's changing this fast. The solution is not to make progress and I don't think that's a good answer” (Richtel). The flat, blunt truth is that technology is a game of double-dutch: the buyer is continually waiting for the right moment to jump in.

The future progression and applicability of Moore’s law is defined by the philosophy of progress and what technology is attempting to accomplish. Dori Jones Yang, author of “Leaving Moore's Law in the Dust,” says, “[t]he storage industry was growing at about the same rate as Moore's Law from 1990 to 1998... Now hard-drive capacity is growing at around 130 percent a year, doubling every nine months--twice as fast as Moore's Law.” Perhaps Moore’s Law has already outdated itself, failing to meet expectations for future technologies. Yang also contends that “[w]ithin two years, graphics will be so realistic that developers claim you won't be able to distinguish between looking at a computer display and looking out a window… Computer scientists are working on a much harder problem--how to create images of human faces that look and act like real faces” (Yang). By using these semiconductors to create a more realistic fabrication of life and reality, technology begins to transcend into the world of virtual reality and approach the perfection of artificial reality, thereby creating a seemingly “magical” world.
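Yang's comparison of doubling periods can be made concrete by converting each period into an annual growth factor (a small sketch; only the nine-month and twenty-four-month figures come from the text):

```python
def growth_per_year(doubling_months):
    """Annual multiplication factor implied by a given doubling period."""
    return 2 ** (12 / doubling_months)

chips = growth_per_year(24)   # Moore's Law: double every two years
drives = growth_per_year(9)   # hard drives circa 2000: double every nine months

print(f"chips: {chips:.2f}x per year, drives: {drives:.2f}x per year")
```

A nine-month doubling period compounds to roughly a 2.5x annual growth factor, versus about 1.4x for the classic two-year period, which is why storage so visibly outran processors in that era.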


So what drives the technology industry to continually label its own past products obsolete? Some argue it is the constant pursuit of progress. In the article “Of Moore and Magic,” author Mary Ellen Bates recalls “Arthur C. Clarke's observation in 1973 that ‘Any sufficiently advanced technology is indistinguishable from magic.’ When I caught myself holding my hands under a faucet, wondering why it wasn't turning on automatically, I realized that what we now think of as commonplace looked like a 23rd-century invention in the Star Trek of the 1960s” (Bates). Moore’s law emulates the notion that when the consumer is unable to understand how a product works, it becomes, in a sense, magical. Moore’s law has, in part, successfully predicted the doubling of microprocessor density that has produced a huge progression in computing speeds and permits devices to process data at lightning pace. These devices then, inadvertently, cannibalize with each succession. The products, competing with each other for survival, ensure the best product by constantly challenging the abilities of their predecessors. As so eloquently put by Marc Aronson, author of “More of Everything”: “These new technologies, in turn, significantly alter how we name, date, and understand the very stages of history that we teach our students. And since Moore's Law mandates that better machines and more sophisticated tests will continue to be created, the chronology that anchors the history we teach has gone from resembling a time line whose dates are as firm as a calendar's to a blur that refuses to stand still.”

Works Cited

Aronson, Marc. (2009). More of Everything. SLJ 55 no6 Je.

Bates, Mary. (2009). Of Moore and Magic. EContent 32 no9.

Excell, Jon. (2008). Moore to come? Engineer 293 S. 15-28 2008

Richtel, Matt. "Technology intensifies the law of change.(Intel Corp. co-founder Gordon E. Moore discusses the implications of high-speed technology)(Interview)."  The New York Times. (May 27, 2001): BU4(L). Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 1 July 2010 <>.

Teich, Albert. (2008). Technology and the Future. Wadsworth Pub Co.

Yang, Dori Jones. "Leaving Moore's Law in the dust.(advances in computer technology)."  U.S. News & World Report. 129. 2 (July 10, 2000): 37. Opposing Viewpoints Resource Center. Gale. Michigan State University Libraries. 1 July 2010 <>.