Demyra Hover

Wiki Entry #1: The Z3 Computer

The Maker

Konrad Zuse, known in Germany as the "father of the computer," was born on June 22, 1910, in Berlin. In 1941 Zuse completed the Z3, regarded as the first fully functional program-controlled digital computer. His interest in computing began early, while he was a civil engineering student. He is also recognized for producing the Z1 and Z2, earlier machines he revised before arriving at the Z3, and he later built the Z4 after the Z3 was destroyed in the war. Zuse was motivated to build the Z3 to show that it was possible to create a dependable working machine for complex arithmetic calculations. He was a German engineer whose work was later funded by the Nazi German government. Zuse formed a company called Zuse Apparatebau to construct his machines, but they were all destroyed during World War II. Zuse did not rely on any outside help to develop his work; he worked alone.

The Z3

Zuse began work on the Z3 in 1938, and the machine was ready to function in 1941. The Z3 used 2,300 relays and performed floating-point binary arithmetic with a 22-bit word length. It consisted of a binary memory unit (capable of storing 64 floating-point numbers), a binary floating-point processor, a control unit, and I/O devices (Rojas 2000). Although the Z3 was later shown to be Turing-complete, its programming language had no conditional branching. Zuse used many recycled supplies to create the machine; for example, because paper was in short supply, he stored his data and programs on old movie film. Programs were entered on punched film, and the machine converted decimal input from the film to binary. The Z3's floating-point arithmetic made it easier to calculate very large and very small numbers than contemporary fixed-point machines, and different punched films allowed it to carry out different sets of calculations. The Z3 was a machine capable of repeatedly executing a single loop of arithmetical operations that act on numbers stored in memory (Rojas 1998). Unlike the earlier work of Charles Babbage, which used gears, Konrad Zuse used telephone switching relays to carry out his arithmetic.
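
To make the 22-bit word length concrete, the sketch below decodes a Z3-style floating-point word in C. It assumes the commonly cited layout of one sign bit, a 7-bit two's-complement exponent, and a 14-bit mantissa with an implicit leading 1; the real Z3's encoding differed in details (for example its handling of zero), so treat this as an illustration rather than an emulation.

    #include <stdio.h>
    #include <stdint.h>
    #include <math.h>

    /* Illustrative decoder for a Z3-style 22-bit floating-point word.
     * Assumed layout (hypothetical, for illustration only):
     *   bit 21      : sign
     *   bits 20..14 : 7-bit two's-complement exponent
     *   bits 13..0  : 14-bit mantissa with an implicit leading 1 (value 1.m)
     */
    static double decode_z3_word(uint32_t word)
    {
        int sign      = (word >> 21) & 0x1;
        int exponent  = (word >> 14) & 0x7F;   /* 7-bit field */
        uint32_t mant = word & 0x3FFF;         /* 14-bit field */

        if (exponent & 0x40)                   /* sign-extend two's complement */
            exponent -= 0x80;

        /* value is (1 + m/2^14) scaled by 2^exponent */
        double value = (1.0 + mant / 16384.0) * pow(2.0, exponent);
        return sign ? -value : value;
    }

    int main(void)
    {
        /* sign = 0, exponent = 2, mantissa = 0.5 -> 1.5 * 4 = 6.0 */
        uint32_t word = (0u << 21) | (2u << 14) | (1u << 13);
        printf("decoded value: %f\n", decode_z3_word(word));
        return 0;
    }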

Why Is the Topic Important?

Although Zuse's work went unrecognized in his own time while others' work was acknowledged, today he is appreciated for making the first operating digital computer. His contributions were a breakthrough for computing and mark the start of the digital age. The Z3, an electromechanical computer, used the binary system to process data and numbers, the same system still used in most computers today. The making of the Z3 paved the way for the computers that followed and draws attention to how important Zuse's innovations are to the digital age. The design ideas Zuse discovered and put into the Z3 were important steps in the development of computers. Today almost everyone uses computers, whether at work, home, or school, so thanks are due to Konrad Zuse and his Z3; his work should be acknowledged whenever we use these devices. Despite the lack of references and acknowledgment he received, Konrad Zuse made a huge impact on computers and the beginning of how they were invented, starting with his independent idea of creating the Z3.

Works Cited:

Rojas, Raul. "How to make Zuses Z3 a Universal Computer." IEEE Annals of the History of Computing 20.3 (1998): 51-4. ABI/INFORM Complete.

Rojas, Raul. "Simulating Konrad Zuses Computers." Dr.Dobbs Journal 25.9 (2000): 64-9. ABI/INFORM Complete

Wiki Entry #2: Roberta Williams

Roberta Williams, also known as the "Queen of the Graphic Adventure," was born on February 16, 1953. She was raised in southern California, where her father was a horticulturist working as an agricultural inspector and her mother was a housewife and a talented oil painter. Roberta has one younger brother and is married to her high school sweetheart, Ken Williams. She and Ken have two sons, D.J. and Chris Williams, who are a few years apart. Roberta Williams is an American video game designer and a co-founder of Sierra On-Line (later known as Sierra Entertainment), and she is most famous for her pioneering work in graphical adventure games (Wikipedia 2012). In 1999 she decided to retire from working on computer games and focus her time on writing a historical novel and traveling.

On May 5, 1980, Roberta released what is known as the first graphical adventure game, Mystery House, for the Apple II, with the help of her husband Ken. It was the first game produced by their company On-Line Systems, which eventually developed into Sierra On-Line. Mystery House opened a six-part adventure series entitled Hi-Res Adventures. Even though it lacked animation, color, and sound, it was still the first adventure game ever to display graphics. The game begins with the player locked inside a vacant Victorian mansion. The player must search the mansion for jewels, and soon realizes that a murderer is killing the other characters one by one; the purpose of the game is to find the murderer or become the next victim. The player types in phrases to explore the adventure, so although the game used text, it was not limited to purely textual descriptions; it also used graphics. Roberta was fond of text-based adventure games, but she thought it would be more entertaining to play with images, and that idea started her career in computer game design.

Her next adventure game, King's Quest, was created in 1984 at IBM's request. It was the first animated 3D adventure game, and it brought recognition to her talent and made Roberta Williams famous. It too was produced by Sierra and is credited with the high status Sierra received. The world of King's Quest is filled with characters and places from fairy tales. The game is about a hero saving the kingdom of Daventry and becoming its king, and to do that the player must venture through other lands. Millions of copies were sold, and Roberta believed this was because the game was similar to a cartoon for children while giving adults the experience of their childhood again.

Last but not least, Phantasmagoria was one of Roberta's most famous creations. Released in 1995, it also has a sequel. It was the first game to use a live performer as an on-screen avatar. The game was produced by Sierra On-Line for the PC and was an interactive adventure game. Phantasmagoria took months to film, and its script ran far longer than a typical movie script. After the game was released it received mixed reviews because of its graphic content, including violence and a rape scene. The story involves Adrienne and her husband Donald, who buy a mansion once owned by a magician who was demonized and murdered his wives. Adrienne begins to have nightmares, accidentally unleashes the demon while exploring the mansion, and watches it possess her husband. "With a $4 million development budget and 2 years of development time, Phantasmagoria is Sierra's most sophisticated product ever" (Buxton 1995). Roberta described this game as her favorite because it was challenging to create and she enjoyed the time she put into making it.

With the creation of these graphical adventure games, Roberta Williams opened a new era for video games and games on the PC. She is a leading figure in adventure gaming and is considered one of the most significant video game designers. According to Williams, the growth in home computers would be stimulated by such things as the advent of small, low-cost computers and the availability of educational and productivity software for the home (Gilbert 1984). Roberta Williams matters to the development of the digital age and of computers because of her determination to make gaming more entertaining to look at. While her husband worked on an accounting project on their home computer, they came across a video game they took pleasure in playing, and she then saw a vision to add graphics, which was an advance in video gaming technology.

Works Cited:

Gilbert, Betsy. "Playing for Time on Home Computers." Advertising Age 55.31 (1984): 22. ABI/INFORM Complete; ProQuest Entrepreneurship; ProQuest Research Library.

Buxton, Rebecca F. "Sierra Unleashes Multimedia Horror Phantasmagoria." Business Wire 17 Aug. 1995: 1. ABI/INFORM Complete.

Wikipedia contributors. "Roberta Williams." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 13 Apr. 2012.

Wiki Entry #3: Jimmy Wales

Jimmy Wales, whom some refer to as "Jimbo," was born on August 7, 1966, in Huntsville, Alabama, and now resides in London. His father, after whom he was named, was a grocery store manager, and his mother ran a small private school with the help of his grandmother. Wales earned his degree in finance and worked in financial trading, but that career soon gave way to his focus on Nupedia, the beginning of what is now the successful Wikipedia. Wales had been fascinated with the Internet from an early age, so creating the largest encyclopedia was his dream, even though he had some concerns at first. Wales has said that he was initially so worried about the concept of open editing, where anyone can edit the encyclopedia, that he would wake during the night to monitor what was being added (Wikipedia "Jimmy Wales" 2012). The TED profile describes his work on Wikipedia this way: "With a vision for a free online encyclopedia, Wales assembled legions of volunteer contributors, gave them tools for collaborating, and created the self-organizing, self-correcting, ever-expanding, multilingual encyclopedia of the future."

Wikipedia is a free Internet encyclopedia that is open for anyone to edit. Jimmy Wales and his partner Larry Sanger started this encyclopedia in January 2001, and with the help of thousands of contributors and articles, Wikipedia quickly became popular. The website offers more than 2.6 million articles in 200 languages and attracts more than 2 billion page visits a month; this online encyclopedia is composed entirely of contributions by volunteers, anybody who feels moved to contribute (Sutherland 2006). Wikipedia is available in many languages and is used every day for many different purposes, including in classrooms and universities; students have been assigned to write Wikipedia articles as an exercise in clearly and succinctly explaining difficult concepts to an uninitiated audience (Wikipedia 2012). Since its founding in 2001, Wikipedia has become the largest single source of information in history (Sutherland 2006).

Created in 2004 by Jimmy Wales and Angela Beesley, Wikia is a free site that allows people to host wikis and is operated by a private corporation, Wikia, Inc. After a few years Wales left his role as CEO and was replaced, continuing on at Wikia as a spokesperson. While Wikipedia is the better-known site to most Internet users, Wikia lets fans flesh out their interests to an obsessive degree: the online encyclopedia has a page devoted to pets, another to diabetes, and another to pet diabetes, while Wikia has hundreds of articles, case studies, and images on pet diabetes (Lavallee 2009). Wikipedia and the Wikia company are important to the history of the digital age because they brought a new way for people to share knowledge, and Jimmy Wales deserves recognition for coming up with this brilliant idea. His ideas have given us a new and efficient way to share our views on any topic with others from all around the world.

Works Cited:

Wikipedia contributors. "Jimmy Wales." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 14 Jun. 2012. Web. 19 Jun. 2012.

Sutherland, Benjamin. "The People's Encyclopedia; As Wikipedia Grows into a Mainstream Internet Brand, Will It Be Able to Keep Its Volunteers in Line?" Newsweek 9 Jan. 2006.

Lavallee, Andrew. "Wikia Hits Profit Target Ahead of Schedule." VentureWire (2009)

Wikipedia contributors. "Wikipedia." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 17 Jun. 2012. Web.

Wiki Article #1: Programming Language

Abstract

This article discusses the development of programming languages and how they have changed the history of the digital age. Programming languages go back as far as the creation of the first modern computers, and many different languages are in use. The most prominent are Java, C, and C++, which we continue to use in computers today. Programming languages are the languages programmers use to build applications so that the computer will perform as intended. Programming languages will continue to grow, and their development will expand over the years. When programming languages were first introduced, each was used for a specific purpose, but now various languages serve many different purposes. Designing a programming language can be a difficult task, and the design must be correct; if not, the language will fail and be abandoned. Many languages have come into existence, but none has yet been designed to serve every purpose worldwide.

In the technological realm, programming languages are the base we use to communicate instructions to a computer. They are also used to develop programs that direct a computer and express a sequence of procedures. The earliest known precursors of programming languages predate the modern computer and managed the control of machines such as Jacquard looms. Whatever language you use on a computer, it must ultimately be translated to machine language so that the computer can understand it. "In the very early days of computing, the only language employed was comprised of native machine instructions, which were often entered by flipping switches and moving cables around. Programmers had to know the numeric representation of each instruction, and they had to calculate addresses for data and execution paths" (GeeksAreSexy 2008). During the 1950s, "assembly language" was introduced: instructions were written in a form a human could more easily read and then run through an "assembler," so these languages remained machine-dependent. Later, higher-level programming languages were established that were even easier to understand and no longer tied to a particular machine. Machine language is the only language the hardware itself executes, so for humans to communicate effectively with a computer there must be a computer language; programs written in such a language are translated into machine code for the computer to run, as the sketch below illustrates.
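
As a rough illustration of these layers, here is one C statement together with the kind of assembly and machine code a compiler might turn it into. The mnemonics and hex bytes in the comments are hypothetical x86-style examples, not output from any particular compiler.

    #include <stdio.h>

    /* One high-level statement and, in comments, the kind of lower-level
     * code it might become. The assembly and hex encodings below are
     * illustrative x86-style examples; real output varies by CPU and compiler. */
    int add(int b, int c)
    {
        int a = b + c;   /* high-level language: what to compute          */
                         /* assembly (illustrative):  mov eax, edi        */
                         /*                           add eax, esi        */
                         /* machine code (illustrative): 89 f8  01 f0     */
        return a;
    }

    int main(void)
    {
        printf("%d\n", add(2, 3));   /* prints 5 */
        return 0;
    }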

In 1972 the C programming language was developed by Dennis Ritchie at Bell Labs, building on Ken Thompson's earlier B language. C is a general-purpose programming language featuring economy of expression, modern control flow and data structure capabilities, and a rich set of operators and data types (Ritchie 1978). C was intended to be a useful language that helped programmers get things done efficiently and precisely. The language is sufficiently expressive and efficient to have completely displaced assembly language programming on UNIX (Ritchie 1978). C's rich set of operators and functions works very well and makes it easy for programmers to write out difficult program logic. C has been used for a wide variety of programs, including the UNIX operating system, the C compiler itself, and essentially all UNIX applications software (Ritchie 1978). C is a powerful, flexible language that provides fast program execution and imposes few constraints on the programmer; it allows low-level access to information and commands while still retaining the portability and syntax of a high-level language. These qualities make it a useful language for both systems programming and general-purpose programs (groups.engin.umd.umich.edu). C was originally written for the PDP-11 under UNIX; it was built on B but was far more capable, and because of its power it became one of the most widely used programming languages in the world.
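
As a small taste of the "economy of expression" Ritchie describes, here is the classic C pointer idiom for copying a string. This example is our own sketch, not code from the cited paper.

    #include <stdio.h>

    /* Copy the string at src into dst using C's terse pointer idiom:
     * the loop condition both copies a character and tests for the
     * terminating '\0'. dst must be large enough to hold src. */
    void string_copy(char *dst, const char *src)
    {
        while ((*dst++ = *src++) != '\0')
            ;   /* all the work happens in the condition */
    }

    int main(void)
    {
        char buffer[32];
        string_copy(buffer, "hello, PDP-11");
        printf("%s\n", buffer);
        return 0;
    }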

C++ was developed in 1983; it is similar to C and shares its features, but it was first called "C with Classes." "C++ provides a collection of predefined classes, along with the capability of user-defined classes. The classes of C++ are data types, which can be instantiated any number of times. Class definitions specify data objects (called data members) and functions (called member functions). Classes can name one or more parent classes, providing inheritance and multiple inheritance, respectively; classes inherit the data members and member functions of the parent class that are specified to be inheritable" (groups.engin.umd.umich.edu). C++ was designed to advance C's features and make the resulting code easier to write. Also developed under the UNIX system, C++ is a successor to C, extended with the many features used in object-oriented programs. C++ is known as an intermediate-level language because it combines high-level and low-level features. Today C++ dominates much of the commercial market and is a favorite among systems programmers and application developers. The work on what eventually became C++ started with an attempt to analyze the UNIX kernel to determine to what extent it could be distributed over a network of computers connected by a local area network. The language provided general mechanisms for organizing programs rather than support for specific application areas; this was what made C with Classes, and later C++, a general-purpose language rather than a C variant with extensions to support specialized applications (Stroustrup n.d.).
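
The sketch below illustrates, in modern C++, the machinery the quotation describes: a user-defined class with data members and member functions, and a derived class that names a parent and inherits from it. The class names are invented for illustration.

    #include <iostream>
    #include <string>

    // A user-defined class: a data type with data members and member functions.
    class Instrument {
    public:
        explicit Instrument(std::string name) : name_(name) {}
        void describe() const { std::cout << "instrument: " << name_ << "\n"; }
    private:
        std::string name_;   // data member
    };

    // A derived class naming Instrument as its parent: it inherits
    // the data members and member functions of Instrument.
    class Piano : public Instrument {
    public:
        Piano() : Instrument("piano") {}
        void play() const { std::cout << "plink\n"; }   // added member function
    };

    int main() {
        Piano p;
        p.describe();   // inherited from Instrument
        p.play();       // defined in Piano
    }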

Java was created during a special time in the digital era: the booming of the Internet age in the 1990s. The Internet created many opportunities for new languages to emerge, and its pairing with the web browser made it easier for Java to become popular. James Gosling developed Java at Sun Microsystems; it originally evolved from a language named Oak and drew on both languages previously mentioned, C and C++. Java is a general-purpose, concurrent, class-based, object-oriented language that is specifically designed to have as few implementation dependencies as possible; as of 2012 it is one of the most popular programming languages in use, particularly for client-server web applications, with a reported 10 million users (Wikipedia "Java" 2012). According to Andrew Ferguson, "Though Java has very lofty goals and is a text-book example of a good language, it may be the 'language that wasn't'. It has serious optimization problems, meaning that programs written in it run very slowly. And Sun has hurt Java's acceptance by engaging in political battles over it with Microsoft. But Java may wind up as the instructional language of tomorrow as it is truly object-oriented and implements advanced techniques such as true portability of code and garbage collection." Java's syntax was derived from C++, but Java was built from scratch to be thoroughly object-oriented: everything in Java is written inside a class. Applets are Java programs embedded in other applications, usually in a web page or browser.
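
As a minimal illustration of the point that everything in Java lives inside a class, here is a complete Java program; even the entry point, main, is a method of a class. The class name is ours.

    // In Java every piece of code, including the entry point,
    // belongs to a class.
    public class Greeter {
        private final String name;   // field (data member)

        public Greeter(String name) {
            this.name = name;
        }

        public String greet() {
            return "Hello, " + name + "!";
        }

        public static void main(String[] args) {
            Greeter g = new Greeter("world");
            System.out.println(g.greet());   // prints: Hello, world!
        }
    }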

Programming languages continue to evolve, and new languages will emerge in the future of the digital age. Programming languages are important because they define the vocabulary and grammar that allow programmers to communicate with the machines they program. They are created to bridge the gap between the hardware and the real world. Programming languages are not just tools; they are essential to the software we use in everyday life. Without programming languages we would have to use machine code, which is difficult for an ordinary human to understand or make sense of. Programming languages make it possible for programmers to work in a notation humans can understand, and that is important to us, because without them we would have a hard time operating a computer.

Works Cited:

http://www.geeksaresexy.net/2008/03/11/geek-support-the-early-history-of-programming-languages/

Ritchie, D.M., M.E. Lesk, and B.W. Kernighan. The Bell System Technical Journal 57.6 (July-August 1978).

http://groups.engin.umd.umich.edu/CIS/course.des/cis400/c/c.html

Stroustrup, Bjarne. A History of C++: 1979-1991. N.d.

Wikipedia contributors. "Java (programming language)." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 23 Jun. 2012.

Wiki Article #2: The Turing Test

Abstract

This article explains the Turing test and the person behind this well-known test of machine intelligence, which people still speculate about. The description briefly covers how the game actually works. Alan Turing is the mastermind behind this innovative idea, which he described in his 1950 paper. The Turing test is one of a kind: it simply asks a judge to distinguish between a human and a machine. The setup is simple, consisting of a computer keyboard, a screen, and participants on opposite sides. The Turing test remains one of the most controversial topics in the world of artificial intelligence. The question this article pursues is whether machines can possess the same intelligence as a human, and it explores that question in depth.

The Turing test examines a machine's capability to demonstrate intelligence. The question "Can machines think?" is what came to Alan Turing's mind when he wrote his paper, but he found the question impossible to answer as posed. Seeking a new question that was more concrete and could actually be answered, he came up with the "imitation game," which was the name of the Turing test before it was changed. The imitation game did not involve any computer intelligence; it involved three rooms, one with a judge, one with a man, and one with a woman. Much as in the Turing test, the judge's job is to decide, from the two people communicating with him through the computer, which one is the man, while the two counterparts try to help or deceive him. In revising the game, Turing replaced the man-and-woman pair with a human and a computing machine. The judge's job is now similar, but he must tell the difference between human and machine; this is the Turing test. If the judge is correct less than fifty percent of the time, the machine passes as being as intelligent as a human. The test does not check the ability to give correct answers; it checks how closely the answers resemble typical human answers. In the years since 1950, the test has proven both highly influential and widely criticized, and it is an essential concept in the philosophy of artificial intelligence (Wikipedia "Turing test" 2012). Critics of the Turing test believe that it does not measure anything of importance. To sidestep the long history of trying to define intelligence, Turing proposed in his paper that a machine's ability to converse is the key to judging its intelligence.
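
The pass criterion can be stated as a small calculation: tally how often the judge identifies the machine correctly and compare against fifty percent. The sketch below, using made-up verdicts, shows the arithmetic; it illustrates the criterion only, not an actual conversing machine.

    #include <stdio.h>

    /* Toy scoring of the Turing test's pass criterion: over a series of
     * trials the judge guesses which respondent is the machine; if the
     * judge is right less than 50% of the time, the machine passes.
     * The verdicts below are invented data for illustration. */
    int main(void)
    {
        /* 1 = judge identified the machine correctly, 0 = judge fooled */
        int verdicts[] = {1, 0, 0, 1, 0, 0, 1, 0, 0, 0};
        int trials = sizeof verdicts / sizeof verdicts[0];
        int correct = 0;

        for (int i = 0; i < trials; i++)
            correct += verdicts[i];

        double accuracy = (double)correct / trials;
        printf("judge accuracy: %.0f%%\n", accuracy * 100.0);
        printf("machine %s\n", accuracy < 0.5 ? "passes" : "fails");
        return 0;
    }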

One of the most important founding figures in computing history is the well-known mathematician Alan Turing. Turing was born on June 23, 1912, in London. He was fond of mathematics, philosophy, and what became computer science. He studied mathematics at Cambridge University and eventually taught there as well; it was there that he began working on the concept of the Turing machine, his theoretical computing machine. Working for the British cryptanalytic department, he played an important role in deciphering messages encrypted by the German Enigma machine and cracking Nazi codes during World War II. When the war was over, Turing set out to develop a logic machine that processed information; he went to work for the National Physical Laboratory, where he introduced his ideas to his associates, who dismissed them. Turing's design laid the groundwork for the first modern digital computers, and the laboratory that turned him down lost the honor of sharing in that creation. His report setting out his design for the Automatic Computing Engine (ACE) was the first relatively complete specification of an electronic stored-program general-purpose digital computer. Turing saw that speed and memory were the keys to computing; his design had much in common with today's RISC architectures and called for a high-speed memory of roughly the same capacity as an early Macintosh computer, enormous by the standards of his day (Copeland 2000). He later proposed that if humans are intelligent, and machines can replicate humans, then machines, too, must be intelligent. In 1952 Turing was arrested for homosexuality, then a criminal offense; instead of facing prison he was injected with hormones, and he lost his job. Two years later he was found dead from poisoning; his death was ruled a suicide, but his mother believed it was an accident. Turing was a founding father of modern cognitive science and a leading early exponent of the hypothesis that the human brain is in large part a digital computing machine, theorizing that the cortex at birth is an unorganized machine which through training becomes organized into a universal machine (Copeland 2000).

A Turing machine is a theoretical computing machine invented by Alan Turing (1937) to serve as an idealized model of mathematical calculation. A Turing machine consists of a line of cells known as a "tape" that can be moved back and forth, an active element known as the "head" that possesses a property known as "state" and that can change the property known as "color" of the active cell underneath it, and a set of instructions for how the head should modify the active cell and move the tape (Wolfram 2002). Turing machines are simple abstract computational devices intended to help investigate the extent and limitations of what can be computed (Barker-Plummer 2012). Turing called his device an "automatic machine," and it helps computer scientists understand the boundaries of mechanical computation. A Turing machine consists of a potentially infinite paper tape, on which is written a finite number of discrete (e.g., binary) symbols, and a scanner that moves back and forth along the tape symbol by symbol, reading what it finds and writing further symbols (Copeland 2000). Turing's idea was to define a single machine capable of computing anything that is computable. Turing and the American logician Alonzo Church argued that every effective mathematical method can be carried out by the universal Turing machine, a proposition now known as the Church-Turing thesis (Copeland 2000). Turing machines are very important in our lives, especially in computing, because they help us understand computer science.
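
To make the tape-head-state picture concrete, here is a minimal Turing machine simulator in C. The machine it runs is a toy of our own devising: it scans right along a binary tape, inverting each symbol, and halts at the first blank.

    #include <stdio.h>

    /* A minimal Turing machine: a tape of symbols, a head with a state,
     * and a table of instructions (state, symbol) -> (write, move, state).
     * The example machine inverts a binary string and halts on a blank. */

    #define HALT -1

    int main(void)
    {
        char tape[32] = "1011_";   /* '_' marks a blank cell */
        int head = 0;              /* head position on the tape */
        int state = 0;             /* current state; this machine needs only one */

        while (state != HALT) {
            char symbol = tape[head];
            /* instruction table for state 0 */
            if (symbol == '0')      { tape[head] = '1'; head++; }  /* write 1, move right */
            else if (symbol == '1') { tape[head] = '0'; head++; }  /* write 0, move right */
            else                    { state = HALT; }              /* blank: halt */
        }

        printf("final tape: %s\n", tape);   /* prints: 0100_ */
        return 0;
    }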

The Turing test was developed to examine, from a logical point of view, whether machines can be as smart as humans, with the same sense of intelligence we carry. The idea behind digital computers may be explained by saying that these machines are intended to carry out operations which could be done by a human computer (Turing 1950). Turing hoped that someday computers would be able to compete intellectually with people in all fields and hold the same intellect humans hold. He reasoned that if a machine can communicate with the same intelligence as a human, then the machine is able to "think" as well. The Turing test is socially important because it suggests that, with socialization, a machine can imitate the human form of conversation. With the Turing test and the Turing machine, Turing produced two great ideas that still shape our way of life today; we use computing machines for every purpose, and it is often argued that technology and machines are starting to take over our lives, so perhaps machines can indeed out-think humans.

Works Cited:

Wikipedia contributors. "Turing test." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 25 Jun. 2012.

Turing, A.M. "Computing Machinery and Intelligence." Mind 59 (1950): 433-460.

http://www.bbc.co.uk/history/people/alan_turing

Copeland, Jack. “Biography of Turing” (July 2000) http://www.alanturing.net/turing_archive/pages/Reference%20Articles/Bio%20of%20Alan%20Turing.html

Barker-Plummer, David, "Turing Machines", The Stanford Encyclopedia of Philosophy (Fall 2012 Edition), Edward N. Zalta (ed.), forthcoming URL = <http://plato.stanford.edu/archives/fall2012/entries/turing-machine/>.

Weisstein, Eric W. "Turing Machine." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/TuringMachine.html