Colloquium/Seminar Schedule 2017 |
Announcements |
The colloquium/seminar after next is scheduled for Friday, January 26.
Seminar: Friday, January 19 |
“Mathematics in deep learning”
御園生 洋祐
--- Abstract ---
One of the most important applications of deep learning is the prediction of time-series data. If a data series depends on its earlier development, a perceptron (a set of neural cells) must retain that past information.
A recurrent neural network (RNN) is a powerful tool for analyzing time-series data: it holds a memory in each neural cell.
However, RNNs are known to be plagued by the vanishing (or sometimes exploding) gradient problem, as is the multilayer perceptron.
Vanishing (or exploding) gradients are a kind of instability that arises during training.
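The instability can be made concrete with a toy calculation (an illustration of ours, not taken from the talk): backpropagating through T time steps multiplies T per-step factors, so for a scalar recurrence h_t = w * h_{t-1} the gradient with respect to h_0 scales like w**T.

```python
# Toy illustration of vanishing/exploding gradients in an RNN.
# Backpropagation through T time steps multiplies T Jacobian factors;
# for the scalar recurrence h_t = w * h_{t-1}, the factor is just w.
def gradient_magnitude(w, steps):
    """Magnitude of the backpropagated gradient after `steps` time steps."""
    g = 1.0
    for _ in range(steps):
        g *= w
    return abs(g)

for w in (0.9, 1.1):
    # w=0.9 -> about 5e-3 (vanishing); w=1.1 -> about 1.2e2 (exploding)
    print(w, gradient_magnitude(w, 50))
```

Even a 10% deviation of the recurrent weight from 1 drives the gradient to near zero or to a huge value over 50 steps, which is why plain RNN training is unstable over long time dependencies.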
New variants of the RNN have therefore been developed, most notably Long Short-Term Memory (LSTM).
LSTM is the most successful RNN variant: it recognizes time dependencies over very long periods and can forget irrelevant past memories without running into these instabilities.
In this talk, we focus on how to construct and train a perceptron for analyzing time-series data, namely (i) the structures of the RNN and LSTM, (ii) backpropagation, which yields the error gradients with respect to the perceptron's parameters, and (iii) how to find suitable parameters, i.e., the optimizer.
If time permits, we further discuss the roles of the hyperparameters that cannot be determined by training, e.g., the numbers of cells and layers.
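As a concrete picture of item (i), the following is a minimal NumPy sketch of one LSTM cell forward step, assuming the standard gate equations of [5] with the forget gate of [6]; the weight layout and the toy dimensions are our own illustrative choices, not anything from the talk.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM forward step.

    x: input vector (size D); h_prev, c_prev: previous hidden and cell states (size H).
    W: weights of shape (4*H, D+H); b: bias of shape (4*H,).
    The four row blocks of W correspond to the forget, input, and output
    gates and the candidate cell update (an assumed layout for this sketch).
    """
    H = h_prev.size
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0 * H:1 * H])   # forget gate: discard irrelevant memory
    i = sigmoid(z[1 * H:2 * H])   # input gate: admit new information
    o = sigmoid(z[2 * H:3 * H])   # output gate
    g = np.tanh(z[3 * H:4 * H])   # candidate cell state
    c = f * c_prev + i * g        # additive memory update: gradients survive
    h = o * np.tanh(c)            # hidden state passed to the next time step
    return h, c

# Run a toy sequence through the cell.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.normal(scale=0.1, size=(4 * H, D + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):
    h, c = lstm_step(rng.normal(size=D), h, c, W, b)
print(h.shape, c.shape)  # (4,) (4,)
```

The key point is the additive update of the cell state c: unlike a plain RNN, the gradient flowing through c is scaled by the forget gate rather than repeatedly multiplied by a fixed weight matrix, which is what lets LSTM keep long-range dependencies without the instabilities above.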
References
[1] “ゼロから作るDeep Learning”, 斎藤康毅 (オライリー, 2016).
[2] “詳解 ディープラーニング”, 巣籠悠輔 (マイナビブックス, 2016).
[3] “深層学習 Deep Learning”, 人工知能学会 (近代科学社, 2015).
[4] “Deep Learning”, Y. LeCun, Y. Bengio, and G. Hinton, Nature 521 (2015).
[5] “Long Short-Term Memory”, S. Hochreiter and J. Schmidhuber, Neural Computation 9(8): 1735-1780 (1997).
[6] “Learning to forget: Continual prediction with LSTM”, F. A. Gers, J. Schmidhuber, and F. Cummins, Ninth International Conference on Artificial Neural Networks (ICANN 99), 1999.
[7] “Recurrent nets that time and count”, F. A. Gers and J. Schmidhuber, Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), Vol. 3, pp. 189-194.
[8] “Neural Networks for Machine Learning, Lecture 6a: Overview of mini-batch gradient descent”, G. Hinton, N. Srivastava, and K. Swersky, lecture notes, University of Toronto.
[9] “Adam: A Method for Stochastic Optimization”, D. P. Kingma and J. Ba, arXiv:1412.6980 [cs.LG].
[10] “An Overview of Gradient Descent Optimization Algorithms”, S. Ruder, arXiv:1609.04747 [cs.LG].
Place: Physics/Applied Physics Meeting Room, 2nd floor, Building 55-N
Time: 13:30-15:30
Seminar: Friday, January 19 |
“Lattice QCD: algorithm bottleneck and implementation”
金森 逸作 [Hiroshima University]
--- Abstract ---
Lattice QCD is a non-perturbative formulation of quantum chromodynamics (QCD), the theory describing the interactions of quarks and gluons. Combined with numerical methods, it has revealed much about the physics inside and between hadrons, including the existence of hadrons themselves. In this talk, I will give a brief sketch of a lattice QCD simulation. Lattice QCD has long been one of the major scientific applications of high-performance computing (HPC). We constantly need to prepare and tune our simulation codes for newer HPC machines such as Oakforest-PACS and the "post-K" computer, and I suspect this situation does not differ much from application to application. My aim in this talk is therefore to share the algorithm and some implementation know-how with physicists working in or near the HPC field. The most time-consuming part of a lattice QCD simulation is solving large-scale linear equations with iterative methods. Focusing on the solver, especially the sparse matrix-vector multiplications required by iterative solvers, I will also explain our implementation for the Intel Xeon Phi Knights Landing (KNL).
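To illustrate the solver structure described above (our sketch, not the speaker's code), the conjugate gradient method below solves A x = b using only a matrix-free matrix-vector product, which is exactly the kernel tuned for machines like KNL. A 1D discrete Laplacian stands in for the lattice Dirac operator's nearest-neighbour stencil; production lattice QCD codes often use other Krylov methods (e.g. BiCGStab) since the Dirac operator is not symmetric positive-definite.

```python
import numpy as np

def conjugate_gradient(matvec, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A, given only the
    matrix-vector product `matvec` (the hot kernel in lattice QCD solvers)."""
    x = np.zeros_like(b)
    r = b - matvec(x)          # initial residual
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)         # the only place A is touched
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Toy matrix-free operator: a 1D discrete Laplacian (tridiagonal -1, 2, -1),
# standing in for a sparse nearest-neighbour stencil on the lattice.
n = 64
def laplacian(v):
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

b = np.ones(n)
x = conjugate_gradient(laplacian, b)
print(np.max(np.abs(laplacian(x) - b)))  # small residual
```

Because the operator is only ever applied, never stored as a dense matrix, the performance of the whole solver reduces to the performance of this stencil application, which is why sparse matrix-vector multiplication is the part tuned per machine.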
Place: Physics/Applied Physics Meeting Room, 2nd floor, Building 55-N
Time: 16:30-18:00
Editor: 佐藤 星雅