https://www.youtube.com/watch?v=cllFzkvrYmE
This paper does not describe a working system. Instead, it presents a single idea about representation which allows advances made by several different groups to be combined into an imaginary system called GLOM. The advances include transformers, neural fields, contrastive representation learning, distillation and capsules. GLOM answers the question: How can a neural network with a fixed architecture parse an image into a part-whole hierarchy which has a different structure for each image? The idea is simply to use islands of identical vectors to represent the nodes in the parse tree. If GLOM can be made to work, it should significantly improve the interpretability of the representations produced by transformer-like systems when applied to vision or language.
How to represent part-whole hierarchies in a neural network
https://arxiv.org/abs/2102.12627
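The "islands of identical vectors" idea above can be made concrete with a toy sketch. This is my own illustration, not code from the paper: imagine a small grid of per-location embedding vectors in which contiguous regions ("islands") that share the same vector stand for nodes of the parse tree. Recovering the islands is then just a flood fill over locations whose vectors match.

```python
import numpy as np

def find_islands(grid, tol=1e-6):
    """Group grid locations into islands of (near-)identical vectors."""
    h, w, _ = grid.shape
    labels = -np.ones((h, w), dtype=int)
    next_label = 0
    for i in range(h):
        for j in range(w):
            if labels[i, j] != -1:
                continue
            # Flood fill from (i, j) over neighbours whose vectors match.
            labels[i, j] = next_label
            stack = [(i, j)]
            while stack:
                y, x = stack.pop()
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and np.linalg.norm(grid[ny, nx] - grid[y, x]) < tol):
                        labels[ny, nx] = next_label
                        stack.append((ny, nx))
            next_label += 1
    return labels

# Two islands: the left half of the grid carries one vector, the right half another.
grid = np.zeros((4, 4, 8))
grid[:, 2:] = 1.0
labels = find_islands(grid)
print(labels)  # left two columns labelled 0, right two columns labelled 1
```

In GLOM itself the vectors at each level would be produced by the network and would only approximately agree within an island; the flood fill here simply shows why identical vectors make the parse tree easy to read off.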
Quantum neural networks are computational neural network models which are based on the principles of quantum mechanics. The first ideas on quantum neural computation were published independently in 1995 by Subhash Kak and Ron Chrisley,[1][2] engaging with the theory of quantum mind, which posits that quantum effects play a role in cognitive function. However, typical research in quantum neural networks involves combining classical artificial neural network models (which are widely used in machine learning for the important task of pattern recognition) with the advantages of quantum information in order to develop more efficient algorithms.[3][4][5] One important motivation for these investigations is the difficulty of training classical neural networks, especially in big data applications. The hope is that features of quantum computing such as quantum parallelism or the effects of interference and entanglement can be used as resources. Since the technological implementation of a quantum computer is still at an early stage, such quantum neural network models are mostly theoretical proposals that await their full implementation in physical experiments.
Quantum neural network
https://en.m.wikipedia.org/wiki/Quantum_neural_network
In an article recently published in Physical Review Research, we show how deep learning can help solve the fundamental equations of quantum mechanics for real-world systems. Not only is this an important fundamental scientific question, but it also could lead to practical uses in the future, allowing researchers to prototype new materials and chemical syntheses in silico before trying to make them in the lab. Today we are also releasing the code from this study so that the computational physics and chemistry communities can build on our work and apply it to a wide range of problems. We’ve developed a new neural network architecture, the Fermionic Neural Network or FermiNet, which is well-suited to modeling the quantum state of large collections of electrons, the fundamental building blocks of chemical bonds. The FermiNet was the first demonstration of deep learning for computing the energy of atoms and molecules from first principles that was accurate enough to be useful, and it remains the most accurate neural network method to date. We hope the tools and ideas developed in our AI research at DeepMind can help solve fundamental problems in the natural sciences, and the FermiNet joins our work on protein folding, glassy dynamics, lattice quantum chromodynamics and many other projects in bringing that vision to life.
FermiNet: Quantum Physics and Chemistry from First Principles
https://deepmind.com/blog/article/FermiNet
Given access to accurate solutions of the many-electron Schrödinger equation, nearly all chemistry could be derived from first principles. Exact wave functions of interesting chemical systems are out of reach because they are NP-hard to compute in general, but approximations can be found using polynomially scaling algorithms. The key challenge for many of these algorithms is the choice of wave function approximation, or Ansatz, which must trade off between efficiency and accuracy. Neural networks have shown impressive power as accurate practical function approximators and promise as a compact wave-function Ansatz for spin systems, but problems in electronic structure require wave functions that obey Fermi-Dirac statistics. Here we introduce a novel deep learning architecture, the Fermionic neural network, as a powerful wave-function Ansatz for many-electron systems. The Fermionic neural network is able to achieve accuracy beyond other variational quantum Monte Carlo Ansätze on a variety of atoms and small molecules. Using no data other than atomic positions and charges, we predict the dissociation curves of the nitrogen molecule and hydrogen chain, two challenging strongly correlated systems, to significantly higher accuracy than the coupled cluster method, widely considered the most accurate scalable method for quantum chemistry at equilibrium geometry. This demonstrates that deep neural networks can improve the accuracy of variational quantum Monte Carlo to the point where it outperforms other ab initio quantum chemistry methods, opening the possibility of accurate direct optimization of wave functions for previously intractable many-electron systems.
Ab initio solution of the many-electron Schrödinger equation with deep neural networks
David Pfau, James S. Spencer, Alexander G. D. G. Matthews, and W. M. C. Foulkes
Phys. Rev. Research 2, 033429 – Published 16 September 2020
https://journals.aps.org/prresearch/abstract/10.1103/PhysRevResearch.2.033429
https://github.com/deepmind/ferminet
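The Fermi-Dirac requirement in the abstract means the wave function must flip sign when any two electrons are exchanged. As a minimal sketch (not the actual FermiNet architecture, which uses permutation-equivariant neural orbitals), the simplest antisymmetric Ansatz is a Slater determinant: swapping two electron positions swaps two rows of the orbital matrix, which flips the sign of the determinant. The toy orbitals below are invented stand-ins.

```python
import numpy as np

def orbitals(positions):
    """Toy single-particle orbitals phi_k(r_i) = exp(-k * |r_i|).
    FermiNet replaces these with neural-network outputs."""
    n = positions.shape[0]
    r = np.linalg.norm(positions, axis=1)
    return np.exp(-np.outer(r, np.arange(1, n + 1)))

def slater(positions):
    """Slater-determinant wave function: antisymmetric by construction."""
    return np.linalg.det(orbitals(positions))

rng = np.random.default_rng(0)
pos = rng.normal(size=(3, 3))          # three electrons in 3D
swapped = pos[[1, 0, 2]]               # exchange electrons 0 and 1
print(np.isclose(slater(swapped), -slater(pos)))  # True: sign flips on exchange
```

Variational quantum Monte Carlo then optimizes the parameters of such an Ansatz to minimize the energy expectation, which is what the paper does with a far more expressive determinant of neural orbitals.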
These days there are many papers I would like to study, if only I had the time... In any case, I have introduced just two technologies of recent interest here, both of which I have covered before.
How many problems could we solve by making good use of these latest techniques?
I believe that quite a few problems can be solved by addressing existing structural problems and computation-speed problems.
But how can bias-removal problems like the one below be solved well?
Job-search sites such as ZipRecruiter, CareerBuilder, and LinkedIn use AI to match job seekers with openings. But these AI algorithms are not always fair.
A few years ago, LinkedIn discovered that the recommendations produced by the algorithm it uses to match job seekers with job postings were biased.
LinkedIn's recommendation algorithm uses several signals to prioritize job seekers, including the probability that a candidate will apply to a posting or respond to a company recruiter. As a result, it ended up recommending men more often than women, simply because men are more active than women in their job search.
...
"You've probably heard that recruiters spend an average of six seconds scanning a résumé," says Derek Kan, vice president of product management at Monster. "With our recommendation engine, that drops to a few milliseconds."
LinkedIn's New Attempt to Remove AI Bias
https://www.technologyreview.kr/linkedin-ai-bias-ziprecruiter-monster-artificial-intelligence/
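The mechanism described in the excerpt can be shown with a hypothetical toy model (my illustration; the numbers are invented, not LinkedIn's): if men apply to postings more often than equally qualified women, a ranker that scores candidates purely by predicted application probability will fill its top recommendations with men.

```python
# 50 male and 50 female candidates with identical qualifications; the only
# difference is an (invented) predicted probability of applying.
candidates = ([{"gender": "M", "p_apply": 0.30} for _ in range(50)]
              + [{"gender": "F", "p_apply": 0.20} for _ in range(50)])

# Rank by predicted engagement alone and take the top 10 recommendations.
top = sorted(candidates, key=lambda c: c["p_apply"], reverse=True)[:10]
print(sum(c["gender"] == "M" for c in top))  # 10 -- every slot goes to a man
```

The skew comes entirely from a behavioral signal, not from qualifications, which is why one common mitigation is to re-rank under a representation constraint rather than by engagement score alone.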
Most technology is still far too one-size-fits-all.
To solve this, do we need more data? Sufficient domain knowledge? Or greater computing power?
I believe the computing-power question will be resolved once we build quantum computers or a meaningful QNN; quantum computation offers speed beyond imagination.
To wrap up, let me share some news about quantum computers.