Computer Science Theory for the Information Age (信息时代的计算机科学理论)


Authors: John Hopcroft (USA), Ravindran Kannan (USA)

Publisher: Shanghai Jiao Tong University Press

Year: 2013

List price: 35.0 (CNY)

Book introduction:

This book is part of Shanghai Jiao Tong University's Zhiyuan textbook series and was written by the internationally renowned computer scientist John Hopcroft. It consists of 11 chapters covering, among other topics, an introduction, high-dimensional space, random graphs, and random walks and Markov chains. From Chapter 2 onward, each chapter ends with a set of exercises. The book can serve as a textbook for upper-level undergraduate or graduate students in computer science and related fields, and as a reference for practitioners in related areas.

Table of contents:

1 Introduction

2 High-Dimensional Space

2.1 Properties of High-Dimensional Space

2.2 The High-Dimensional Sphere

2.2.1 The Sphere and the Cube in Higher Dimensions

2.2.2 Volume and Surface Area of the Unit Sphere

2.2.3 The Volume is Near the Equator

2.2.4 The Volume is in a Narrow Annulus

2.2.5 The Surface Area is Near the Equator

2.3 Volumes of Other Solids

2.4 Generating Points Uniformly at Random on the Surface of a Sphere

2.5 Gaussians in High Dimension

2.6 Bounds on Tail Probability

2.7 Random Projection and the Johnson-Lindenstrauss Theorem

2.8 Bibliographic Notes

2.9 Exercises

3 Random Graphs

3.1 The G(n, p) Model

3.1.1 Degree Distribution

3.1.2 Existence of Triangles in G(n, d/n)

3.2 Phase Transitions

3.3 The Giant Component

3.4 Branching Processes

3.5 Cycles and Full Connectivity

3.5.1 Emergence of Cycles

3.5.2 Full Connectivity

3.5.3 Threshold for O(ln n) Diameter

3.6 Phase Transitions for Monotone Properties

3.7 Phase Transitions for CNF-sat

3.8 Nonuniform and Growth Models of Random Graphs

3.8.1 Nonuniform Models

3.8.2 Giant Component in Random Graphs with Given Degree Distribution

3.9 Growth Models

3.9.1 Growth Model Without Preferential Attachment

3.9.2 A Growth Model with Preferential Attachment

3.10 Small World Graphs

3.11 Bibliographic Notes

3.12 Exercises

4 Singular Value Decomposition (SVD)

4.1 Singular Vectors

4.2 Singular Value Decomposition (SVD)

4.3 Best Rank k Approximations

4.4 Power Method for Computing the Singular Value Decomposition

4.5 Applications of Singular Value Decomposition

4.5.1 Principal Component Analysis

4.5.2 Clustering a Mixture of Spherical Gaussians

4.5.3 An Application of SVD to a Discrete Optimization Problem

4.5.4 Spectral Decomposition

4.5.5 Singular Vectors and Ranking Documents

4.6 Bibliographic Notes

4.7 Exercises

5 Random Walks and Markov Chains

5.1 Stationary Distribution

5.2 Electrical Networks and Random Walks

5.3 Random Walks on Undirected Graphs with Unit Edge Weights

5.4 Random Walks in Euclidean Space

5.5 The Web as a Markov Chain

5.6 Markov Chain Monte Carlo

5.6.1 Metropolis-Hastings Algorithm

5.6.2 Gibbs Sampling

5.7 Convergence of Random Walks on Undirected Graphs

5.7.1 Using Normalized Conductance to Prove Convergence

5.8 Bibliographic Notes

5.9 Exercises

6 Learning and VC-Dimension

6.1 Learning

6.2 Linear Separators, the Perceptron Algorithm, and Margins

6.3 Nonlinear Separators, Support Vector Machines, and Kernels

6.4 Strong and Weak Learning-Boosting

6.5 Number of Examples Needed for Prediction: VC-Dimension

6.6 Vapnik-Chervonenkis or VC-Dimension

6.6.1 Examples of Set Systems and Their VC-Dimension

6.6.2 The Shatter Function

6.6.3 Shatter Function for Set Systems of Bounded VC-Dimension

6.6.4 Intersection Systems

6.7 The VC Theorem

6.8 Bibliographic Notes

6.9 Exercises

7 Algorithms for Massive Data Problems

7.1 Frequency Moments of Data Streams

7.1.1 Number of Distinct Elements in a Data Stream

7.1.2 Counting the Number of Occurrences of a Given Element

7.1.3 Counting Frequent Elements

7.1.4 The Second Moment

7.2 Sketch of a Large Matrix

7.2.1 Matrix Multiplication Using Sampling

7.2.2 Approximating a Matrix with a Sample of Rows and Columns

7.3 Sketches of Documents

7.4 Exercises

8 Clustering

8.1 Some Clustering Examples

8.2 A Simple Greedy Algorithm for k-clustering

8.3 Lloyd's Algorithm for k-means Clustering

8.4 Meaningful Clustering via Singular Value Decomposition

8.5 Recursive Clustering Based on Sparse Cuts

8.6 Kernel Methods

8.7 Agglomerative Clustering

8.8 Communities, Dense Submatrices

8.9 Flow Methods

8.10 Linear Programming Formulation

8.11 Finding a Local Cluster Without Examining the Whole Graph

8.12 Axioms for Clustering

8.12.1 An Impossibility Result

8.12.2 A Satisfiable Set of Axioms

8.13 Exercises

9 Graphical Models and Belief Propagation

9.1 Bayesian or Belief Networks

9.2 Markov Random Fields

9.3 Factor Graphs

9.4 Tree Algorithms

9.5 Message Passing Algorithm

9.6 Graphs with a Single Cycle

9.7 Belief Update in Networks with a Single Loop

9.8 Maximum Weight Matching

9.9 Warning Propagation

9.10 Correlation Between Variables

9.11 Exercises

10 Other Topics

10.1 Rankings

10.2 Hare System for Voting

10.3 Compressed Sensing and Sparse Vectors

10.3.1 Unique Reconstruction of a Sparse Vector

10.3.2 The Exact Reconstruction Property

10.3.3 Restricted Isometry Property

10.4 Applications

10.4.1 Sparse Vector in Some Coordinate Basis

10.4.2 A Representation Cannot be Sparse in Both Time and Frequency Domains

10.4.3 Biological

10.4.4 Finding Overlapping Cliques or Communities

10.4.5 Low Rank Matrices

10.5 Exercises

11 Appendix

11.1 Asymptotic Notation

11.2 Useful Inequalities

11.3 Sums of Series

11.4 Probability

11.4.1 Sample Space, Events, Independence

11.4.2 Variance

11.4.3 Variance of Sum of Independent Random Variables

11.4.4 Covariance

11.4.5 The Central Limit Theorem

11.4.6 Median

11.4.7 Unbiased Estimators

11.4.8 Probability Distributions

11.4.9 Maximum Likelihood Estimation (MLE)

11.4.10 Tail Bounds

11.4.11 Chernoff Bounds: Bounding of Large Deviations

11.4.12 Hoeffding's Inequality

11.5 Generating Functions

11.5.1 Generating Functions for Sequences Defined by Recurrence Relationships

11.5.2 Exponential Generating Function

11.6 Eigenvalues and Eigenvectors

11.6.1 Eigenvalues and Eigenvectors

11.6.2 Symmetric Matrices

11.6.3 Extremal Properties of Eigenvalues

11.6.4 Eigenvalues of the Sum of Two Symmetric Matrices

11.6.5 Norms

11.6.6 Important Norms and Their Properties

11.6.7 Linear Algebra

11.6.8 Distance Between Subspaces

11.7 Miscellaneous

11.7.1 Variational Methods

11.7.2 Hash Functions

11.7.3 Catalan Numbers

11.7.4 Sperner's Lemma

11.8 Exercises

Index

References

Abstract:

Computer Science Theory for the Information Age covers the computer science theory likely to be useful in the next 40 years, including high-dimensional space, random graphs, singular value decomposition, random walks, Markov chains, learning algorithms, VC-dimension, algorithms for massive data problems, and clustering. The book also covers graphical models and belief propagation, ranking and voting, sparse vectors, and compressed sensing. It is intended for either an undergraduate or a graduate theory course in computer science. Prof. John Hopcroft is a world-renowned scientist and an expert on computer science education. He was awarded the A. M. Turing Award in 1986 for his contributions to theoretical computing and data structure design. Dr. Ravindran Kannan is a principal researcher at Microsoft Research Labs in India.

Book specifications:

Detailed book information
Title: 信息时代的计算机科学理论 (Computer Science Theory for the Information Age)
ISBN: 9787313096098
Place of publication: Shanghai
Publisher: Shanghai Jiao Tong University Press
Edition: 1st; printing: 1st
List price (CNY): 35.0
Language: English
Dimensions: 23 × 15
Binding: hardcover
Pages: 404
Print run: (not given)

Bibliographic classification:

Computer Science Theory for the Information Age (信息时代的计算机科学理论) was published by Shanghai Jiao Tong University Press in 2013. Its Chinese Library Classification number is TP3-0, and its subject headings are computer science, theory, English-language.