ECO中文网
2000 Andrew Chi-Chih Yao (姚期智)

Posted on 2022-4-17 16:35:40

Andrew Chi-Chih Yao
BIRTH:
1946, Shanghai, China

EDUCATION:
B.S. (Physics, National Taiwan University, 1967); A.M. (Physics, Harvard University, 1969); Ph.D. (Physics, Harvard University, 1972); Ph.D. (Computer Science, University of Illinois Urbana-Champaign, 1975).

EXPERIENCE:
Assistant Professor (Mathematics Department, Massachusetts Institute of Technology, 1975-1976); Assistant Professor (Computer Science Department, Stanford University, 1976-1981); Professor (Computer Science Division, University of California, Berkeley, 1981-1982); Professor (Computer Science Department, Stanford University, 1982-1986); William and Edna Macaleer Professor of Engineering and Applied Science (Princeton University, 1986-2004); Professor (Center for Advanced Study, Tsinghua University, from 2004); Director (Institute for Theoretical Computer Science, Tsinghua University, from 2004); Distinguished Professor-At-Large (The Chinese University of Hong Kong, from 2005).

HONORS AND AWARDS:
George Polya Prize (1987); Guggenheim Fellowship (1991); Fellow, Association for Computing Machinery (1995); Donald E. Knuth Prize (1996); Member, US National Academy of Sciences (1998); Member, American Academy of Arts and Sciences (2000); Member, Academia Sinica (2000); A.M. Turing Award (2000); Pan Wen-Yuan Research Award (2003); Honorary Doctor of Science (City University of Hong Kong, 2003); Fellow, American Association for the Advancement of Science (2003); Honorary Doctor of Engineering (Hong Kong University of Science and Technology, 2004); Foreign Member, Chinese Academy of Sciences (2004); Alumni Award for Distinguished Service, College of Engineering, University of Illinois (2004); Honorary Doctor of Science (Chinese University of Hong Kong, 2006); Honorary Doctor of Mathematics (University of Waterloo, 2009); Fellow, International Association for Cryptologic Research (2010).

ANDREW CHI-CHIH YAO
China – 2000
CITATION
In recognition of his fundamental contributions to the theory of computation, including the complexity-based theory of pseudorandom number generation, cryptography, and communication complexity.

Andrew Chi-Chih Yao was born in Shanghai, China, on December 24, 1946. After living in Hong Kong with his family for two years, he moved to Taiwan. In 1967 he received a B.S. in Physics from National Taiwan University. He then began graduate studies in Physics at Harvard University, where he received an A.M. in 1969 and a Ph.D. in 1972 under the supervision of Sheldon Glashow, winner of the 1979 Nobel Prize in Physics. He subsequently entered the Ph.D. program in Computer Science at the University of Illinois Urbana-Champaign, receiving his degree just two years later, in 1975. Yao completed his dissertation, A Study of Concrete Computational Complexity, under the supervision of Chung Laung Liu.

After a year as an Assistant Professor in the Mathematics Department at MIT, Yao joined the Computer Science Department at Stanford University as an Assistant Professor in 1976. Over the next five years there he made a number of fundamental contributions to the theory of algorithms.

His 1977 paper “Probabilistic computations: toward a unified measure of complexity” [1] introduced what is now known as Yao’s min-max principle, which uses von Neumann’s minimax theorem from game theory to relate the average-case complexity of deterministic algorithms to the worst-case complexity of randomized algorithms. Yao proved that the worst-case expected running time of the best randomized algorithm equals the average-case running time, under the worst-case distribution of inputs, of the best deterministic algorithm. Yao’s principle has become a fundamental technique for reasoning about randomized algorithms and complexity, and has also been applied in areas such as property testing and learning theory.
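The "easy" direction of the principle — for any fixed input distribution, the best deterministic algorithm's average cost lower-bounds the worst-case expected cost of every randomized algorithm — can be checked directly on a toy example. The cost matrix and distributions below are hypothetical, chosen only to illustrate the inequality:

```python
# Toy cost matrix: cost[a][x] = running time of deterministic algorithm a on input x.
# (Hypothetical numbers, chosen only to illustrate the inequality.)
cost = [
    [1, 4, 3],
    [3, 1, 2],
    [4, 3, 1],
]
n_algs, n_inputs = len(cost), len(cost[0])

def avg_cost(a, dist):
    """Expected cost of deterministic algorithm a on inputs drawn from dist."""
    return sum(p * cost[a][x] for x, p in enumerate(dist))

def worst_expected_cost(mix):
    """Worst-case (over inputs) expected cost of the randomized algorithm
    that runs deterministic algorithm a with probability mix[a]."""
    return max(sum(mix[a] * cost[a][x] for a in range(n_algs))
               for x in range(n_inputs))

# For ANY input distribution, the best deterministic algorithm's average
# cost lower-bounds the worst-case expected cost of EVERY randomized algorithm.
input_dist = [0.5, 0.25, 0.25]      # the adversary's choice of input distribution
lower_bound = min(avg_cost(a, input_dist) for a in range(n_algs))

some_randomized = [1/3, 1/3, 1/3]   # one particular randomized algorithm
assert lower_bound <= worst_expected_cost(some_randomized)
```

This is why the principle is so useful for lower bounds: exhibiting one hard input distribution bounds every randomized algorithm at once.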

Around this time Yao also made fundamental contributions to the theory of data structures. His 1978 paper, “Should tables be sorted?” [2] introduced the cell-probe model, an abstract model of data structures in which the cost of a computation is measured by the total number of memory accesses. This model has been widely used to prove lower bounds for data structure problems.

Yao spent a year as a Professor in the Computer Science Division of the University of California, Berkeley, and returned to Stanford as a full Professor in 1982. During the early 1980s, Yao produced a number of papers which had a lasting impact on the foundations of cryptography, computer security, computational complexity, and randomized computation. This work was significant not only for the results obtained, but also for the introduction of problems, models, and techniques which are now considered fundamental in their respective areas.

His 1981 paper with Danny Dolev, “On the security of public-key protocols,” [8] introduced a formal model for symbolic reasoning about security protocols. Since its introduction, this “Dolev-Yao model” has been the starting point for most work on symbolic security, including recent work connecting it to complexity-based cryptography, and it continues to be an active area of research. Yao also made significant contributions to cryptography and complexity-based approaches to security. In 1982 he published “Theory and applications of trapdoor functions” [7] and “Protocols for secure computations” [5]. These works, presented at the same conference, stand as seminal contributions to cryptography and secure computation.

The first of these papers addresses the then newly-emerging field of public-key cryptography from a theoretical perspective, lays the foundation for a theory of computational randomness, and initiates a study of its relationship to computational hardness. Yao provides a definition of pseudorandom number generator which is based on computational complexity, and proposes a definition of “perfect”—in current terminology, “pseudorandom”—probability distribution ensembles. (An ensemble is perfect if it cannot be distinguished from a truly random ensemble by any feasible distinguisher, where such distinguishers are formalized using the notion of a polynomial time randomized algorithm.) Yao relates his notion of pseudorandomness to the idea of a statistical test, a notion already used in the study of pseudorandom number generators, and shows that one particular test, known as the next-bit test, is adequate for characterizing pseudorandomness. Having defined perfect ensembles, Yao then defined a pseudorandom number generator as an efficient randomized algorithm which uses a limited number of truly random bits in order to output a sample from a perfect distribution whose size is polynomial in the number of random bits used.
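The next-bit test can be illustrated with a deliberately weak generator. The generator and predictor below are hypothetical toys, not constructions from the paper: a generator whose last output bit is the parity of the others is predicted perfectly, so it fails the test:

```python
import random

def bad_generator(seed_bits):
    """A deliberately weak 'generator' (hypothetical): stretches k seed bits
    into k + 1 output bits by appending their parity."""
    return seed_bits + [sum(seed_bits) % 2]

def next_bit_predictor(prefix):
    """Predicts the next output bit as the parity of the bits seen so far."""
    return sum(prefix) % 2

# The predictor guesses the final bit correctly every time -- far above the
# 1/2 success rate achievable against truly random bits -- so the generator
# fails the next-bit test and is not pseudorandom in Yao's sense.
trials = 1000
hits = 0
for _ in range(trials):
    out = bad_generator([random.randrange(2) for _ in range(8)])
    hits += next_bit_predictor(out[:-1]) == out[-1]
assert hits == trials
```

Yao's theorem says this single style of test suffices: an ensemble on which no efficient next-bit predictor beats 1/2 by a non-negligible margin passes every polynomial-time statistical test.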

The next fundamental contribution of the paper addresses the question of what computational assumptions are adequate for the existence of pseudorandom number generators. Advances in public-key cryptography indicated that secure encryption could be based on the assumed hardness of certain computational problems such as quadratic residuosity, or problems related to factoring integers. Yao asked whether one could make a general assumption about computational hardness which could be used to obtain pseudorandomness and hence, through standard techniques, cryptographic security. For this he formalized the notion of a one-way function that is easy to compute but hard to invert for a large fraction of inputs. He proved that one-way functions with certain properties may be used to construct pseudorandom number generators. This inspired a series of important results that refined Yao’s work, and it continues to be an area of active research. The contributions of this paper to pseudorandomness form an essential component of modern cryptography. In addition, the paper proposes a new field of computational information theory which refines Shannon’s theory by taking computational resources into account. Yao gives a definition of computational entropy, and uses it to give a characterization of encryption security. This definition of entropy is now important in areas such as leakage-resilient cryptography.

The second paper introduces a new paradigm for secure function evaluation, and introduces the famous “Millionaires’ Problem”. Yao gives a protocol which allows two parties, each holding a number, to determine who has the larger number without revealing the actual values. The Millionaires’ Problem is a two-party instance of a more general class of secure multiparty computation problems which are now essential to the study of secure cryptographic protocols. With the advent of wide-scale distributed computing and the ubiquity of cryptographic protocols, Yao’s contributions in this area have had a significant impact on networked computing.

In the 1980s, Professor Yao introduced models and techniques whose ramifications are still being felt in research on complexity, computational randomness, cryptography, and security. Some of his most influential ideas were disseminated in lectures building on his published results. One example is the XOR-lemma, which uses computational hardness to produce pseudorandomness. Yao asked whether the hardness of a problem may be amplified by combining multiple instances of the problem, in this case through the bitwise exclusive-OR operation. While interesting in its own right, the XOR-lemma is an essential technique in derandomization, which seeks generic methods for eliminating the need for randomness in the efficient solution of algorithmic problems. More generally, it helps determine whether certain classes of problems that can be solved efficiently with randomization can instead be solved deterministically.
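The combinatorial intuition behind the XOR-lemma is captured by the piling-up lemma, its information-theoretic analogue: if each of k independent bits can be guessed with advantage eps over 1/2, their XOR can be guessed with advantage only 2^(k-1) * eps^k, which shrinks exponentially in k. (The XOR-lemma itself concerns computationally bounded predictors; this sketch only checks the combinatorial core empirically.)

```python
import random

def xor_predict_prob(eps, k, trials=200_000, seed=0):
    """Empirical probability of guessing the XOR of k independent fair bits,
    given a per-bit predictor that is correct with probability 1/2 + eps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        bits = [rng.randrange(2) for _ in range(k)]
        # Each guess is independently correct with probability 1/2 + eps.
        guesses = [b if rng.random() < 0.5 + eps else 1 - b for b in bits]
        hits += (sum(guesses) % 2) == (sum(bits) % 2)
    return hits / trials

eps, k = 0.2, 3
empirical = xor_predict_prob(eps, k)
predicted = 0.5 + 2 ** (k - 1) * eps ** k   # piling-up lemma: advantage 2^(k-1) * eps^k
assert abs(empirical - predicted) < 0.01
```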

A second example is the garbled circuit technique, an important tool in secure multiparty computation which was used implicitly by Yao in his 1982 secure computation paper as well as in a 1986 paper “How to generate and exchange secrets” [6]. Recent advances in computing power have made the garbled circuit technique practical for large-scale computational problems, for example in privacy-preserving matching in DNA databases.
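A minimal sketch of the garbled-gate idea, for a single AND gate (a toy using SHA-256 as a one-time pad and a zero-block for redundancy; not Yao's original construction): the garbler assigns two random labels to each wire and publishes an encrypted, shuffled truth table; an evaluator holding exactly one label per input wire can decrypt exactly one row, learning the output label but not the input bits.

```python
import os, random, hashlib

def H(*keys):
    """32-byte pad derived from the input-wire labels."""
    return hashlib.sha256(b"".join(keys)).digest()

def xor(x, y):
    return bytes(a ^ b for a, b in zip(x, y))

# Garbler: two random 16-byte labels per wire (one per bit value).
labels = {w: (os.urandom(16), os.urandom(16)) for w in ("a", "b", "out")}
ZEROS = bytes(16)   # redundancy so the evaluator can recognize a correct decryption

table = []
for bit_a in (0, 1):
    for bit_b in (0, 1):
        pad = H(labels["a"][bit_a], labels["b"][bit_b])
        table.append(xor(pad, labels["out"][bit_a & bit_b] + ZEROS))
random.shuffle(table)   # hide which row corresponds to which input pair

# Evaluator: only one row decrypts to a valid plaintext under its pad.
def evaluate(label_a, label_b):
    pad = H(label_a, label_b)
    for row in table:
        plain = xor(pad, row)
        if plain[16:] == ZEROS:
            return plain[:16]
    raise ValueError("no row decrypted")

assert evaluate(labels["a"][1], labels["b"][1]) == labels["out"][1]
assert evaluate(labels["a"][1], labels["b"][0]) == labels["out"][0]
```

A full protocol garbles an entire circuit gate by gate and uses oblivious transfer to deliver the evaluator's input labels; practical systems add optimizations such as point-and-permute to avoid the trial decryption used here.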

In 1986, Yao moved to Princeton University, where he became the William and Edna Macaleer Professor of Engineering and Applied Science. During this period he continued his work on the foundations of cryptography, and built on previous work in areas such as decision tree and communication complexity. Yao made substantial contributions to the theory of lower bounds for algebraic decision trees, an area he established in a 1982 Journal of Algorithms paper [9] co-authored with J.M. Steele. This work exploited deep relationships between algebraic decision trees and results in algebraic geometry. He also investigated the use of randomization in decision trees. Professor Yao introduced the theory of communication complexity in a 1979 paper, “Some complexity questions related to distributive computing” [3]. In his 1993 paper “Quantum circuit complexity” [4] he extended communication complexity to quantum computing. Starting in the 1990s, Professor Yao began to work extensively on quantum computing, communication, and information theory, and he continues to make significant contributions in these areas.
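The kind of gap communication complexity studies can be seen in the EQUALITY problem: deciding whether Alice's and Bob's n-bit strings are equal costs n bits deterministically, but with shared randomness a short fingerprint suffices. The polynomial-fingerprinting protocol below is a standard textbook construction, not taken from the paper itself:

```python
import random

Q = (1 << 61) - 1   # a fixed Mersenne prime modulus

def fingerprint(bits, r):
    """Evaluate the polynomial whose coefficients are `bits` at point r, mod Q
    (Horner's rule)."""
    acc = 0
    for b in reversed(bits):
        acc = (acc * r + b) % Q
    return acc

n = 1024
x = [random.randrange(2) for _ in range(n)]
y = list(x)                 # equal to x
z = list(x); z[7] ^= 1      # differs from x in one bit

# The parties share a random evaluation point r; Alice sends only
# fingerprint(x, r): about 61 bits instead of n = 1024 bits.
r = random.randrange(Q)
assert fingerprint(x, r) == fingerprint(y, r)   # equal inputs always accepted
assert fingerprint(x, r) != fingerprint(z, r)   # unequal inputs rejected w.p. >= 1 - (n-1)/Q
```

Two distinct polynomials of degree below n agree on at most n-1 of the Q possible evaluation points, so the error probability is at most (n-1)/Q, which is negligible here.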

Andrew Yao became a Professor in the Center for Advanced Study and Director of the Institute for Theoretical Computer Science at Tsinghua University, Beijing, in 2004. Since 2005 he has also been Distinguished Professor-at-Large of the Chinese University of Hong Kong. His recent contributions include work in security protocols, universally composable secure computation, quantum computing, and the theory of algorithms.

Yao is active in graduate supervision, and has mentored over twenty Ph.D. students. He is married to Professor Frances Yao, a computer scientist and leading researcher in computational geometry, algorithms and cryptography.



Author: Bruce Kapron



