Invitation to Computer Science Chapter 1 Notes

Author: Lucy Long
Course: Principles of Computer Science AP
Institution: The Woodlands High School

Summary

Chapter notes for chapter one, covering pages 2 to 37. Originally in Cornell format, but simply listed here. Influenced mainly by topics discussed in class. A brief overview of the main points of the chapter.


Description

Chapter 1: An Introduction to Computer Science

Input – mouse, keyboard, camera, microphone
Processing – CPU (Central Processing Unit, aka "the brain")
Output – monitor (soft copy), speakers, printer (hard copy)
Storage – HDD, SSD, flash drive, CD, cloud

Advances in Computing: high-speed supercomputers, wireless networks, minute computer chips, A.I. systems

Misconception #1: Computer science isn't the study of computers – that definition is incomplete. There are branches of computer science that are distinct from the study of computers and machines.

Misconception #2: Computer science isn't the study of how to write computer programs. Programs are used to help computer scientists test and analyze ways to solve problems and represent information.

Misconception #3: Computer science isn't the study of the uses or applications of computers and software. Computer science covers specifying, designing, building, and testing software programs and the computer systems they run on.

Definition of Computer Science: the central concept in computer science is the algorithm – you can't understand the field without a thorough understanding of algorithms. It is the computer scientist's job to design and develop algorithms to solve important problems.

Design Process of Algorithms:
- Studying their behavior to see if they are correct and efficient (their formal and mathematical properties)
- Designing and building computer systems that can execute algorithms (their hardware realizations)
- Designing programming languages that algorithms can be translated into so they can be executed by the hardware (their linguistic realizations)
- Identifying problems and designing software packages to solve them (their applications)

Algorithm Definition: an ordered sequence of instructions that is guaranteed to solve a specific problem.

Operations Used to Construct Algorithms (see the sketch below):
1. Sequential operations – a sequential operation carries out a single, well-defined task, then moves on to the next operation. Sequential operations are usually expressed as simple declarative sentences.
2. Conditional operations – the "question-asking" instructions of an algorithm: a question is asked, and (based on the answer) the next operation is selected.
3. Iterative operations – "looping" instructions: instead of moving on to other instructions, you move back and repeat a certain block of instructions.

Computing Agent: the machine, robot, person, or thing carrying out the steps of an algorithm.
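Below is a minimal sketch in Python (my example, not from the chapter) showing how one short algorithm combines all three kinds of operations; the task of finding the largest value in a list is a hypothetical choice:

```python
# Hypothetical example: find the largest value in a list,
# using all three kinds of operations described above.

def find_largest(values):
    largest = values[0]      # sequential: a single, well-defined task
    for v in values[1:]:     # iterative: repeat a block of instructions
        if v > largest:      # conditional: ask a question, choose the next step
            largest = v      # sequential, performed only when the answer is yes
    return largest

print(find_largest([31, 41, 59, 26, 53]))  # prints 59
```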

Science of Algorithmic Problem Solving:
- Unsolvable Problems: no solution will ever be found. Proved by Kurt Gödel. This limits the capabilities of computers and computer science.
- Inefficient Solutions: problems that can be solved, but take so long to reach a conclusion that the answer is effectively useless.
- Unknown Solution: the problem is understood, but no one knows how to specify its operations algorithmically.

Unambiguous Operation: an operation that can be carried out and understood directly by a computing agent without further simplification or explanation. Also known as a primitive (operation); an algorithm must be composed entirely of primitive operations.

Effectively Computable Operations: operations that are actually doable.

Computer Revolution: in the 20th and 21st centuries we have automated repetitive tasks, such as adding long columns of numbers, finding one name or account number in a massive database, sorting student records by course number, and retrieving a hotel or airline reservation from a file containing millions of pieces of data. This automation offers a huge increase in productivity, freeing humans to do what they do better than computers.

Algorithms: solve problems and produce results.

Infinite Loop: not a solution – it never finishes in a finite number of operations (see the first sketch after these notes).

Multiple Ways to Solve: there is always more than one way to write a correct solution (also shown in the first sketch).

Logarithms: created by the Scotsman John Napier as a way to simplify difficult mathematical computations (see the second sketch after these notes).
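First sketch (my Python examples, not from the chapter): two different but equally correct algorithms for the same problem, plus a loop that would never terminate. Summing the integers 1 through n is a hypothetical choice of problem:

```python
# Two correct solutions to the same problem: sum the integers 1..n.

def sum_loop(n):
    total = 0
    for i in range(1, n + 1):    # iterate, adding one number at a time
        total += i
    return total

def sum_formula(n):
    return n * (n + 1) // 2      # Gauss's closed-form formula, no loop at all

assert sum_loop(100) == sum_formula(100) == 5050

# An infinite loop, by contrast, is not an algorithm: it never
# reaches an answer in a finite number of operations.
# while True:
#     pass    # left commented out -- it would run forever
```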
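Second sketch (my example, not from the chapter): the property Napier's logarithms exploit is that log(a x b) = log(a) + log(b), so a difficult multiplication can be replaced by an easy addition plus table lookups:

```python
import math

a, b = 3847.0, 2196.0

direct = a * b                                  # multiply the hard way

# Napier's shortcut: add the logarithms, then undo the logarithm.
via_logs = math.exp(math.log(a) + math.log(b))

print(direct)              # 8448012.0
print(round(via_logs, 2))  # 8448012.0, up to floating-point rounding
```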

Mathematical Calculators: the first, built by Blaise Pascal in 1642 and called the Pascaline, could add and subtract. Gottfried Leibnitz's machine of 1673, called Leibnitz's Wheel, could add, subtract, multiply, and divide. These were important milestones: they demonstrated how mechanization could simplify and speed up numerical computation. They were not computers, however, because they lacked two things: 1) a memory to store information in machine-readable form, and 2) programmability.

Jacquard Loom: the first actual "computing device," used for manufacturing rugs and clothing. Developed in 1801 by Joseph Jacquard, it used punched cards to create a pattern. It was the first programmable device and showed how the knowledge of a human expert could be captured in machine-readable form and the task accomplished automatically.

Difference Engine: built in 1823 by Charles Babbage; could add, subtract, multiply, and divide to 6 digits, and could solve polynomial equations and other complex mathematical problems.

Analytic Engine: surprisingly similar in design to a modern computer:
- mill -> arithmetic/logic unit
- store -> memory
- operator -> processor
- output unit -> input/output

Keypunch, Tabulator, Sorter: used to automatically read, tally, and sort data entered on punched cards. Created by Herman Hollerith between 1880 and 1890. A clear and successful demonstration of the advantages of automated information processing.

Mark I: a general-purpose, electromechanical, programmable computer used to process and store data; the first computing device to use the base-2 binary numbering system. Completed in 1944, it was one of the first working general-purpose computers. It had a memory capacity of 72 numbers, could be programmed to perform a 23-digit multiplication in 4 seconds, and remained operational for 15 years.

ENIAC: the Electronic Numerical Integrator and Calculator, completed in 1946. It could add two 10-digit numbers in approximately 1/5,000 of a second and multiply two numbers in 1/300 of a second.

ABC System: the Atanasoff-Berry Computer, built by John Atanasoff and Clifford Berry between 1939 and 1942; the first electronic computer, though it was only useful for solving simultaneous linear equations.

Colossus: built in England in 1943, one of the first computers built outside the U.S.; used to crack German military codes.

John Von Neumann: a genius in mathematics, experimental physics, chemistry, economics, and computer science.

Stored Program Computer: proposed by John Von Neumann in 1946.

Von Neumann Architecture: proposes that the instructions that control the operation of the computer be encoded as binary values and stored internally in memory along with the data (see the sketch after these notes).

EDVAC/EDSAC: built around 1949; executed programs in a way surprisingly similar to 21st-century computers.

UNIVAC I: the first computer actually sold. Built by Presper Eckert and John Mauchly (who had worked on ENIAC) and delivered to the U.S. Bureau of the Census on March 31, 1951. It ran for 12 years and is now in the Smithsonian Institution. Marks the true beginning of the computing age.

First Generation of Computing: roughly 1950-1957. This era saw the appearance of the UNIVAC I and the IBM 701, which were similar in design to EDVAC. First-generation machines were used only by trained professionals and in specialized locations.

Second Generation of Computing: roughly 1957-1965. Major changes in the size and complexity of computers, with increased reliability and reduced cost. Also the era of FORTRAN and COBOL, the first high-level programming languages; the occupation of programmer was born.

Third Generation of Computing: roughly 1965-1975. The era of the integrated circuit, which further reduced the size and cost of computers. The first minicomputer, the PDP-1, appeared, and the software industry was born. By the mid-1970s, computers were no longer a rarity.

Fourth Generation of Computing: roughly 1975-1985. Integrated-circuit technology had advanced so much that a full computer system fit on one circuit board that could be held in your hand. The Altair 8800, the world's first microcomputer, appeared in 1975. This era also brought the first computer networks, electronic mail, user-friendly systems, graphical user interfaces, and embedded systems.

Fifth Generation of Computing: 1985 to the present day. So much has changed that distinct generations are no longer useful. In a few decades, computers have progressed from the UNIVAC I to today's technology; changes of this magnitude have never occurred so quickly in any other field.

Virtual Machine: a user-oriented view of a computer system and its resources, composed only of the resources that the user perceives rather than all the hardware resources that actually exist.
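A minimal sketch (hypothetical opcodes, my example rather than anything from the chapter) of the stored-program idea behind the Von Neumann architecture: instructions and data sit in the same memory, encoded as values, and the processor runs a fetch-decode-execute cycle over them:

```python
# Hypothetical illustration of the stored-program concept:
# instructions and data share one memory, and the processor
# repeatedly fetches, decodes, and executes instructions.

LOAD, ADD, STORE, HALT = 1, 2, 3, 0   # made-up opcodes

memory = [
    (LOAD, 4),   # 0: accumulator = memory[4]
    (ADD, 5),    # 1: accumulator += memory[5]
    (STORE, 5),  # 2: memory[5] = accumulator
    (HALT, 0),   # 3: stop
    7,           # 4: data
    35,          # 5: data (becomes 42 once the program has run)
]

pc = 0           # program counter
accumulator = 0
while True:
    opcode, address = memory[pc]   # fetch and decode
    pc += 1
    if opcode == LOAD:             # execute
        accumulator = memory[address]
    elif opcode == ADD:
        accumulator += memory[address]
    elif opcode == STORE:
        memory[address] = accumulator
    else:                          # HALT
        break

print(memory[5])  # prints 42
```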

