English | Size: 9.97 GB
Category: Tutorial
These Computer Science Video Lectures cover fundamental concepts that provide a first step in understanding the nature of computer science’s undeniable impact on the modern world. They cover basic elements of programming, algorithms and data structures, theory of computing and machine architecture, all in the context of applications in science, engineering, and commerce.
Description
The basis for education in the last millennium was “reading, writing, and arithmetic;” now it is reading, writing, and computing. Learning to program is an essential part of the skill set of every working professional, not just programmers and engineers but also artists, scientists, and humanists. This collection of video lectures aims to teach programming to those who need or want to learn it, in a scientific context. But computer science is much more than just programming. These lectures also aim to teach fundamental concepts of the discipline, which can serve as a basis for further study of some of the most important scientific ideas of the past century.
About the Instructors
Robert Sedgewick is the William O. Baker Professor of Computer Science at Princeton University. He is a Director of Adobe Systems and has served on the research staffs at Xerox PARC, IDA, and INRIA. He earned his PhD from Stanford University under Donald E. Knuth. He is the coauthor (with Kevin Wayne) of Algorithms, Fourth Edition (Addison-Wesley).
Kevin Wayne is the Phillip Y. Goldman Senior Lecturer in Computer Science at Princeton University, where he has taught since 1998. He is an ACM Distinguished Educator and holds a PhD in operations research and industrial engineering from Cornell University.
Skill Level
All Levels
What You Will Learn
Basic elements, including variables, assignment statements, built-in types of data, conditionals and loops, arrays, and I/O, including graphics and sound
Functions and modules, stressing the fundamental idea of dividing a program into components that can be independently debugged, maintained, and reused
Object-oriented programming, centered on an introduction to data abstraction
Applications, drawing examples from applied mathematics, the physical and biological sciences, and computer science itself
Algorithms and data structures, emphasizing the use of the scientific method to understand performance characteristics of implementations
Theory of computing, which helps us address basic questions about computation, using simple abstract models of computers
Machine architecture, providing a link between the abstract machines of the theory of computing and the real computers that we use
Historical context, including the fascinating story of the development and application of fundamental ideas about computation by Alan Turing, John von Neumann, and many others
Who Should Take This Course
Students in introductory CS and programming courses.
Programmers trained in older languages who want to know Java.
Scientists and engineers who find themselves engaged in computation but never had a computer science course and want to learn to program.
Anyone interested in obtaining a fundamental understanding of computing.
Course Requirements
No computing experience or programming knowledge is required to understand the content of these lectures.
Table of Contents
PART I: PROGRAMMING IN JAVA
Lecture 0: Prologue—A Simple Machine. This lecture introduces fundamental ideas of computation in the context of a familiar and important application from the field of cryptography. The story motivates the study of computer science, but the concepts covered are a bit advanced, so novices may wish to review it again after watching the other lectures in the course.
Lecture 1: Basics. Why program? This lecture addresses that basic question. Then it describes the anatomy of your first program and the process of developing a program in Java using either virtual terminals or a program development environment, with some historical context. Most of the lecture is devoted to thorough coverage of Java’s built-in data types, with example programs for each.
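For a flavor of the material, here is a minimal sketch (not taken from the lectures; the class and variable names are illustrative) that declares and prints values of several of Java's built-in types:

    public class BuiltInTypes {
        public static void main(String[] args) {
            int count = 42;               // 32-bit integer
            double ratio = 3.14159;       // 64-bit floating-point number
            boolean done = false;         // true/false value
            char grade = 'A';             // single character
            String name = "Hello, Java";  // sequence of characters

            System.out.println(name + ": count = " + count + ", ratio = " + ratio
                    + ", done = " + done + ", grade = " + grade);
        }
    }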
Lecture 2: Conditionals and Loops. The if, while, and for statements are Java’s fundamental control structures. This lecture is built around short programs that use these constructs to address important computational tasks. Examples include sorting, computing the square root, factoring, and simulating a random process. The lecture concludes with a detailed example illustrating the process of debugging a program.
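As an illustration of the square-root example mentioned above, here is a minimal sketch using Newton's method; the class name, tolerance, and command-line interface are assumptions, not necessarily what the lecture uses:

    public class Sqrt {
        public static void main(String[] args) {
            double c = Double.parseDouble(args[0]);  // number whose square root we want
            double epsilon = 1e-15;                  // relative error tolerance
            double t = c;                            // initial estimate

            // Newton's method: repeatedly replace t by the average of t and c/t.
            while (Math.abs(t - c/t) > epsilon * t)
                t = (t + c/t) / 2.0;

            System.out.println(t);
        }
    }

Running java Sqrt 2.0 prints an estimate of the square root of 2.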
Lecture 3: Arrays. Computing with a large sequence of values of the same type is extremely common. This lecture describes Java’s built-in array data structure that supports such applications, with several examples, including shuffling a deck of cards, the coupon collector test for randomness, and random walks in a grid.
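To suggest what the card-shuffling example looks like in code, here is a sketch of the Knuth (Fisher-Yates) shuffle; the five-card deck and the use of java.util.Random are illustrative choices:

    import java.util.Random;

    public class Shuffle {
        public static void main(String[] args) {
            String[] deck = { "2H", "7D", "JC", "KS", "AS" };  // a small deck for illustration
            Random random = new Random();

            // Swap each card with a uniformly random card from the rest of the array.
            for (int i = 0; i < deck.length; i++) {
                int r = i + random.nextInt(deck.length - i);
                String temp = deck[i];
                deck[i] = deck[r];
                deck[r] = temp;
            }

            for (String card : deck)
                System.out.print(card + " ");
            System.out.println();
        }
    }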
Lecture 4: Input and Output. To interact with our programs, we need mechanisms for taking information from the outside world and for presenting information to the outside world. This lecture describes several such mechanisms for text, drawings, and animation. Detailed examples covered include fractal drawings that model natural phenomena and an animation of a ball bouncing around in the display window.
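The lectures use the course's own standard input, output, and drawing libraries; as a language-only approximation of the text-input idea, the sketch below reads numbers from standard input with java.util.Scanner and prints their average:

    import java.util.Scanner;

    public class Average {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            double sum = 0.0;
            int count = 0;

            // Read numbers until end-of-file, then report the average.
            while (in.hasNextDouble()) {
                sum += in.nextDouble();
                count++;
            }
            if (count > 0)
                System.out.println("Average of " + count + " values: " + sum / count);
        }
    }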
Lecture 5: Functions and Libraries. Modular programming is the art and science of breaking a program into pieces that can be individually developed. This lecture introduces functions (Java methods), a fundamental mechanism that enables modular programming. Motivating examples include functions for the classic Gaussian distribution and an application that creates digital music.
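As a sketch of the Gaussian example mentioned above (the class and method names are illustrative, not necessarily those used in the lecture), a small library of static methods might look like this:

    public class Gaussian {

        // Standard Gaussian probability density function.
        public static double pdf(double x) {
            return Math.exp(-x * x / 2) / Math.sqrt(2 * Math.PI);
        }

        // Gaussian pdf with mean mu and standard deviation sigma.
        public static double pdf(double x, double mu, double sigma) {
            return pdf((x - mu) / sigma) / sigma;
        }

        public static void main(String[] args) {
            System.out.println(pdf(0.0));   // about 0.3989
            System.out.println(pdf(2.0));   // about 0.0540
        }
    }

Clients can then call Gaussian.pdf() without knowing how it is implemented, which is the point of modular programming.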
Lecture 6: Recursion. A recursive function is one that calls itself. This lecture introduces the concept by treating in detail the ruler function and (related) classic examples, including the Towers of Hanoi puzzle, the H-tree, and simple models of the real world based on recursion. We show a common pitfall in the use of recursion, and a simple way to avoid it, which introduces a different (related) programming paradigm known as dynamic programming.
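To make the pitfall concrete, here is a sketch using the Fibonacci numbers (a standard illustration, though not necessarily the lecture's exact example): the naive recursion recomputes subproblems exponentially often, while the dynamic-programming version computes each value once.

    public class Fibonacci {

        // Elegant but slow: fib(n) recomputes the same subproblems over and over,
        // so its running time grows exponentially with n.
        public static long fib(int n) {
            if (n <= 1) return n;
            return fib(n - 1) + fib(n - 2);
        }

        // Bottom-up dynamic programming: compute each value once and reuse it.
        public static long fibDP(int n) {
            long[] f = new long[n + 2];
            f[0] = 0;
            f[1] = 1;
            for (int i = 2; i <= n; i++)
                f[i] = f[i - 1] + f[i - 2];
            return f[n];
        }

        public static void main(String[] args) {
            System.out.println(fibDP(50));  // instant; fib(50) would take many minutes
        }
    }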
Lecture 7: Performance. When you develop a program, you need to be aware of its resource requirements. In this lecture, we describe a scientific approach to understanding performance, where we develop mathematical models describing the running time of our programs and then run empirical tests to validate them. Eventually we come to a simple and effective approach that you can use to predict the running time of your own programs that involve significant amounts of computation.
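A minimal sketch of that approach, assuming a simple doubling experiment (the quadratic pair-counting computation below is just a stand-in workload, not the lecture's example):

    public class DoublingTest {

        // A deliberately quadratic computation: count pairs that sum to zero.
        private static int countZeroPairs(double[] a) {
            int count = 0;
            for (int i = 0; i < a.length; i++)
                for (int j = i + 1; j < a.length; j++)
                    if (a[i] + a[j] == 0.0) count++;
            return count;
        }

        // Time one run on a random input of size n, in seconds.
        private static double timeTrial(int n) {
            double[] a = new double[n];
            for (int i = 0; i < n; i++)
                a[i] = 2 * Math.random() - 1;
            long start = System.nanoTime();
            countZeroPairs(a);
            return (System.nanoTime() - start) / 1e9;
        }

        public static void main(String[] args) {
            // Double the input size each time; for a running time proportional to n^b,
            // the ratio of successive times approaches 2^b (here b = 2, so about 4).
            double prev = timeTrial(4000);
            for (int n = 8000; n <= 64000; n += n) {
                double time = timeTrial(n);
                System.out.printf("%7d %8.3f  ratio %.1f%n", n, time, time / prev);
                prev = time;
            }
        }
    }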
Lecture 8: Abstract Data Types. In Java, you can create your own data types and use them in your programs. In this and the next lecture, we show how this ability allows us to view our programs as abstract representations of real-world concepts. First we show the mechanics of writing client programs that use data types. Our examples involve abstractions such as color, images, and genes. This style of programming is known as object-oriented programming because our programs manipulate objects, which hold data type values.
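As a small taste of data-type client code (using java.awt.Color, a standard Java abstraction, for the color example; the lecture's own code may differ), the sketch below computes the perceived brightness of a color and uses it to judge whether two colors are legible together:

    import java.awt.Color;

    public class Luminance {

        // Standard NTSC formula for the perceived brightness of a color.
        public static double luminance(Color c) {
            return 0.299 * c.getRed() + 0.587 * c.getGreen() + 0.114 * c.getBlue();
        }

        // Two colors read well together only if their luminances differ enough.
        public static boolean compatible(Color a, Color b) {
            return Math.abs(luminance(a) - luminance(b)) >= 128.0;
        }

        public static void main(String[] args) {
            Color black = new Color(0, 0, 0);
            Color white = new Color(255, 255, 255);
            Color gray  = new Color(128, 128, 128);
            System.out.println(compatible(black, white));  // true
            System.out.println(compatible(gray, white));   // false
        }
    }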
Lecture 9: Creating Data Types. Creating your own data types is the central activity in modern Java programming. This lecture covers the mechanics (instance variables, constructors, instance methods, and test clients) and then develops several examples, culminating in a program that uses a quintessential mathematical abstraction (complex numbers) to create visual representations of the famous Mandelbrot set.
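A sketch of what such a data type might look like (the names and the tiny test client are illustrative, not the lecture's exact code):

    public class Complex {
        private final double re;   // real part
        private final double im;   // imaginary part

        public Complex(double re, double im) {
            this.re = re;
            this.im = im;
        }

        public Complex plus(Complex that) {
            return new Complex(this.re + that.re, this.im + that.im);
        }

        public Complex times(Complex that) {
            return new Complex(this.re * that.re - this.im * that.im,
                               this.re * that.im + this.im * that.re);
        }

        public double abs() {
            return Math.sqrt(re * re + im * im);
        }

        public String toString() {
            return re + " + " + im + "i";
        }

        // Tiny client: iterate z = z*z + c, the computation at the heart of the Mandelbrot set.
        public static void main(String[] args) {
            Complex c = new Complex(-0.5, 0.5);
            Complex z = new Complex(0.0, 0.0);
            for (int i = 0; i < 10; i++)
                z = z.times(z).plus(c);
            System.out.println("z = " + z + ", |z| = " + z.abs());
        }
    }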
Lecture 10: Programming Languages. We conclude the first half of the course with an overview of important issues surrounding programming languages. To convince you that your knowledge of Java will enable you to learn other programming languages, we show implementations of a typical program in C, C++, Python, and MATLAB. We describe important differences among these languages and address fundamental issues, such as garbage collection, type checking, object-oriented programming, and functional programming, with some brief historical context.
PART II: ALGORITHMS, THEORY, AND MACHINES
Lecture 11: Searching and Sorting. Building on the scientific approach developed in Part 1 (Lecture 7), we introduce and study classic algorithms for two fundamental problems, in the context of realistic applications. Our message is that efficient algorithms (binary search and mergesort, in this case) are a key ingredient in addressing computational problems with scalable solutions that can handle huge instances.
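As an illustration of the first of those algorithms, here is a minimal binary-search sketch (the whitelist numbers and method name are made up for the example):

    import java.util.Arrays;

    public class BinarySearch {

        // Return the index of key in the sorted array a, or -1 if it is not present.
        public static int search(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;   // halve the search interval at each step
                if      (key < a[mid]) hi = mid - 1;
                else if (key > a[mid]) lo = mid + 1;
                else return mid;
            }
            return -1;
        }

        public static void main(String[] args) {
            int[] whitelist = { 84, 48, 68, 10, 18, 98, 12, 23, 54, 57 };
            Arrays.sort(whitelist);                     // binary search requires sorted input
            System.out.println(search(whitelist, 23));  // 3 (its index in the sorted array)
            System.out.println(search(whitelist, 50));  // -1 (not found)
        }
    }

Because the interval is halved at each step, the search examines only about log2 n entries, which is what makes the approach scale to huge instances.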
Lecture 12: Stacks and Queues. Our introduction to data structures is a careful look at the fundamental stack and queue abstractions, including performance specifications. Then we introduce the concept of linked structures and focus on their utility in developing simple, safe, clear, and efficient implementations of stacks and queues.
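A sketch of a linked-list stack in that spirit (generic, with a minimal test client; error handling is simplified for brevity):

    public class LinkedStack<Item> {

        // Each node holds one item and a reference to the node beneath it.
        private class Node {
            Item item;
            Node next;
        }

        private Node first = null;   // top of the stack
        private int n = 0;           // number of items

        public boolean isEmpty() { return first == null; }
        public int size()        { return n; }

        // Insert a new item at the top: constant time, no array resizing needed.
        public void push(Item item) {
            Node oldFirst = first;
            first = new Node();
            first.item = item;
            first.next = oldFirst;
            n++;
        }

        // Remove and return the most recently pushed item.
        public Item pop() {
            if (isEmpty()) throw new RuntimeException("stack underflow");
            Item item = first.item;
            first = first.next;
            n--;
            return item;
        }

        public static void main(String[] args) {
            LinkedStack<String> stack = new LinkedStack<String>();
            stack.push("to");
            stack.push("be");
            stack.push("or");
            System.out.println(stack.pop());   // or
            System.out.println(stack.pop());   // be
        }
    }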
Lecture 13: Symbol Tables. The symbol table abstraction is one of the most important and useful programmer’s tools, as we illustrate with several examples in this lecture. Extending the scientific approach of the previous two lectures, we introduce and study binary search trees, a classic data structure that supports efficient implementations of this abstraction.
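A compact sketch of a binary-search-tree symbol table (unbalanced, with no deletion; names are illustrative) showing the two core operations:

    public class BST<Key extends Comparable<Key>, Value> {

        private Node root;   // root of the binary search tree

        private class Node {
            Key key;
            Value val;
            Node left, right;
            Node(Key key, Value val) { this.key = key; this.val = val; }
        }

        // Return the value associated with key, or null if key is absent.
        public Value get(Key key) {
            Node x = root;
            while (x != null) {
                int cmp = key.compareTo(x.key);
                if      (cmp < 0) x = x.left;
                else if (cmp > 0) x = x.right;
                else return x.val;
            }
            return null;
        }

        // Insert the key-value pair, replacing the old value if key is already present.
        public void put(Key key, Value val) {
            root = put(root, key, val);
        }

        private Node put(Node x, Key key, Value val) {
            if (x == null) return new Node(key, val);
            int cmp = key.compareTo(x.key);
            if      (cmp < 0) x.left  = put(x.left, key, val);
            else if (cmp > 0) x.right = put(x.right, key, val);
            else x.val = val;
            return x;
        }

        public static void main(String[] args) {
            BST<String, Integer> st = new BST<String, Integer>();
            st.put("java", 1995);
            st.put("c", 1972);
            st.put("python", 1991);
            System.out.println(st.get("c"));        // 1972
            System.out.println(st.get("fortran"));  // null
        }
    }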
Lecture 14: Introduction to Theory of Computation. The theory of computation helps us address fundamental questions about the nature of computation while at the same time helping us better understand the ways in which we interact with the computer. In this lecture, we introduce formal languages and abstract machines, focusing on simple models that are actually widely useful in practical applications.
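To make "simple abstract machine" concrete, here is a sketch of a two-state deterministic finite automaton; the language it recognizes (binary strings with an odd number of 1s) is chosen only for illustration, not taken from the lecture:

    public class OddOnesDFA {

        // State 0 means "even number of 1s seen so far"; state 1 means "odd number".
        // State 1 is the accept state.
        public static boolean accepts(String input) {
            int state = 0;
            for (char c : input.toCharArray()) {
                if (c == '1') state = 1 - state;   // reading a 1 flips the state
                // reading a 0 leaves the state unchanged
            }
            return state == 1;
        }

        public static void main(String[] args) {
            System.out.println(accepts("0100"));   // true  (one 1)
            System.out.println(accepts("0110"));   // false (two 1s)
        }
    }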
Lecture 15: Turing Machines. In 1936, Alan Turing published a paper that is widely hailed as one of the most important scientific papers of the 20th century. This lecture is devoted to the two far-reaching central ideas of the paper: All computational devices have equivalent computational power, and there are limitations to that power.
Lecture 16: Intractability. As computer applications expanded, computer scientists and mathematicians realized that a refinement of Turing’s ideas was needed. Which computational problems can we solve with the resource limitations that are inescapable in the real world? As described in this lecture, this question, fundamentally, remains unanswered.
Lecture 17: A Computing Machine. Every programmer needs to understand the basic characteristics of the underlying computer processor being used. Fortunately, the fundamental design of computer processors has changed little since the 1960s. In this lecture, we provide insights into how your Java code actually gets its job done by introducing an imaginary computer that is similar to both the minicomputers of the 1960s and the microprocessor chips found in today’s laptops and mobile devices.
Lecture 18: von Neumann Machines. Continuing our description of processor design and low-level programming, we provide context stretching back to the 1950s and discuss future implications of the von Neumann machine, where programs and data are kept in the same memory. We examine in detail the idea that we design new computers by simulating them on old ones, something that Turing’s theory guarantees will always be effective.
Lecture 19: Combinational Circuits. Starting with a few simple abstractions (wires that can carry on/off values and switches that can control the values carried by wires), we address in this lecture the design of the circuits that implement computer processors. We consider gates that implement simple logical functions and components for higher-level functions, such as addition. The lecture culminates with a full circuit for an arithmetic/logic unit.
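As a software analogue of those ideas, the sketch below models gates as boolean functions and combines them into a full adder (a standard construction; the Java modeling is purely illustrative, since the lecture works with circuits rather than code):

    public class Adder {

        // Basic gates, modeled as boolean functions.
        private static boolean and(boolean a, boolean b) { return a && b; }
        private static boolean or (boolean a, boolean b) { return a || b; }
        private static boolean xor(boolean a, boolean b) { return a != b; }

        // Full adder: add bits a and b plus a carry-in, producing a sum bit and a carry-out.
        private static boolean[] fullAdder(boolean a, boolean b, boolean carryIn) {
            boolean sum = xor(xor(a, b), carryIn);
            boolean carryOut = or(and(a, b), and(carryIn, xor(a, b)));
            return new boolean[] { sum, carryOut };
        }

        public static void main(String[] args) {
            // 1 + 1 + 0 = 10 in binary: sum bit false (0), carry bit true (1).
            boolean[] result = fullAdder(true, true, false);
            System.out.println("sum = " + result[0] + ", carry = " + result[1]);
        }
    }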
Lecture 20: CPU. In this lecture we provide the last part of our answer to the question “How does a computer work?” by developing a complete circuit for a computer processor, where every switch and wire is visible. While vastly different in scale, this circuit, from a design point of view, has many of the same characteristics as the circuits found in modern computational devices.