BULLETIN BOARD

Publication of Results Made by ABCI Users (Article 21, ABCI Agreement)

When ABCI users intend to make their results public through academic publications, international conferences, press releases, etc., please send the following information to: abci-application-ml@aist.go.jp

Example of the credit line to include in the publication:

Computational resource of AI Bridging Cloud Infrastructure (ABCI) provided by National Institute of Advanced Industrial Science and Technology (AIST) was used.

Research Publications (ABCI Grand Challenge)

[Academic Papers]

Kates-Harbeck, J., Svyatkovskiy, A., Tang, W., “Predicting disruptive instabilities in controlled fusion plasmas with deep learning”, Nature (April 2019).

[International Conferences, Academic Publications]

Y. Tsuji, K. Osawa, Y. Ueno, A. Naruse, R. Yokota, and S. Matsuoka, “Performance Optimizations and Analysis of Distributed Deep Learning with Approximated Second-Order Optimization Method”, The 1st Workshop on Parallel and Distributed Machine Learning 2019 (PDML19).
Y. Ueno and R. Yokota, “Hierarchical Topology-aware Communication for Scaling Deep Learning to Thousands of GPUs”, The 19th Annual IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid 2019).
K. Osawa, Y. Tsuji, Y. Ueno, A. Naruse, R. Yokota, and S. Matsuoka, “Large-scale Distributed Second-order Optimization Using Kronecker-factored Approximate Curvature for Deep Convolutional Neural Networks”, Conference on Computer Vision and Pattern Recognition (CVPR 2019).
R. Yokota, “Kronecker Factorization for Second Order Optimization in Deep Learning”, SIAM Conference on Computational Science and Engineering (SIAM CSE 2019), Spokane, USA, February 25-March 3, 2019.
“Massively Distributed SGD: ImageNet/ResNet-50 Training in a Flash” (arXiv:1811.05233) [v2], 5 Mar 2019.
“ImageNet/ResNet-50 Training in 224 Seconds” (arXiv:1811.05233) [v1], 13 Nov 2018.

[Press Releases]

November 13, 2018 / SONY
Sony Achieves World's Fastest Deep Learning Speeds through Distributed Learning

Research Results (ABCI Users)

[Press Releases]

April 01, 2019 / Fujitsu Laboratories Ltd.
Fujitsu Develops Deep Learning Acceleration Technology, Achieves World's Highest Speed
Achieves training time of 75 seconds in ResNet-50 through highly-efficient distributed parallel processing

https://www.fujitsu.com/global/about/resources/news/press-releases/2019/0401-01.html
Fujitsu Laboratories Ltd. today announced that it has developed technology to improve the speed of deep learning software. The technology achieved the world's highest speed when training time was measured on the AI Bridging Cloud Infrastructure (ABCI) system, deployed by Fujitsu Limited for the National Institute of Advanced Industrial Science and Technology (AIST).

[International Conferences, Academic Publications]

Kohei Ozaki and Shuhei Yokoo, “1st place in Retrieval Challenge” and “3rd place in Recognition Challenge”, CVPR 2019 Second Landmark Recognition Workshop, 16 June 2019.
Kohei Ozaki and Shuhei Yokoo, “Large-scale Landmark Retrieval/Recognition under a Noisy and Diverse Dataset” (arXiv:1906.04087) [v1], 10 June 2019.
Peng Chen, Mohamed Wahib, Shinichiro Takizawa, Ryousei Takano, and Satoshi Matsuoka, “iFDK: A Scalable Framework for Instant High-resolution Image Reconstruction”, SC19, 17-22 November 2019.
Peng Chen, Mohamed Wahib, Shinichiro Takizawa, Ryousei Takano, and Satoshi Matsuoka, “A Versatile Software Systolic Execution Model for GPU Memory Bound Kernels”, SC19, 17-22 November 2019.

ABCI Benchmarks

Ranked 5th on the TOP500 list (June 2018)
Ranked 4th on the Green500 list (November 2018)
Ranked 5th in HPCG performance (November 2018)
BULLETIN BOARD ARCHIVE