Posts from May 2019

Preferred Networks releases version 6 of both the open source deep learning framework Chainer and the general-purpose matrix calculation library CuPy

May 16, 2019, Tokyo, Japan – Preferred Networks, Inc. (PFN, Head Office: Tokyo, President & CEO: Toru Nishikawa) has released Chainer(TM) v6 and CuPy(TM) v6, major updates of PFN’s open source deep learning framework and general-purpose matrix calculation library, respectively. Most code written for previous versions will run as-is on the latest releases.
Chainer was released as open source software in 2015 and is known as a pioneer among flexible, intuitive deep learning frameworks based on the Define-by-Run approach. It has since been supported by many users and remains under active development.

ChainerX, a C++ implementation of automatic differentiation that has been experimentally integrated into the main Chainer distribution since the v6 beta release, now supports more examples. ChainerX can significantly reduce framework-side overhead in both forward and backward propagation without losing much of Chainer’s flexibility or backward compatibility, resulting in increased performance. In addition, ChainerX can target new hardware without any changes to Chainer or ChainerX source code, provided a third-party developer implements support for that hardware as a plug-in.
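The Define-by-Run automatic differentiation that ChainerX reimplements in C++ can be illustrated with a minimal reverse-mode sketch in pure Python. All names here (`Var`, `backward`) are hypothetical illustrations, not Chainer's or ChainerX's actual API:

```python
# Minimal define-by-run reverse-mode autodiff sketch.
# Hypothetical illustration only; not Chainer's or ChainerX's actual API.

class Var:
    def __init__(self, value, parents=()):
        self.value = value      # forward value
        self.parents = parents  # (parent Var, local gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # The graph was recorded while the forward pass ran ("define by
        # run"), so backpropagation just walks it in reverse topological
        # order, accumulating gradients into each node.
        order, seen = [], set()

        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent, _ in node.parents:
                    visit(parent)
                order.append(node)

        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            for parent, local_grad in node.parents:
                parent.grad += node.grad * local_grad

x = Var(2.0)
y = Var(3.0)
z = x * y + x          # z = x*y + x
z.backward()
print(x.grad)          # dz/dx = y + 1 = 4.0
print(y.grad)          # dz/dy = x = 2.0
```

Because the graph is built as Python executes the forward pass, control flow (loops, branches) can change the network on every iteration; ChainerX keeps this model while moving the bookkeeping out of the Python interpreter.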


The main features of Chainer v6 and CuPy v6 are:

  • Integration of ChainerX
    • Fast, more portable multi-dimensional arrays and an automatic differentiation backend have been added.
    • A compatibility layer allows ChainerX arrays to be used in the same manner as NumPy and CuPy arrays, enabling low-overhead automatic differentiation implemented in C++.
    • An integrated device API has been introduced. The unified interface handles device specification and inter-device transfer for a wide variety of backends, including NumPy, CuPy, iDeep, and ChainerX.
  • Enhanced support for training in mixed precision
    • A new data type, mixed16, has been added. It enables a mixed precision mode that transparently combines single- and half-precision operations during training.
    • Dynamic loss scaling, which detects overflow and adjusts the scaling factor automatically, has been implemented to avoid gradient underflow in mixed precision training.
  • Addition of a function and link test tool
    • A test tool that generates unit tests for forward and backward propagation, as well as second-order differentials, with minimal code has been added.
  • Support for NumPy functions on CuPy arrays
    • NumPy’s experimental __array_function__ protocol is now supported, so CuPy arrays can be passed directly to many NumPy functions that implement it.
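The dynamic loss scaling mentioned above can be sketched in plain Python. The class name, constants, and policy below (halve on overflow, double after a run of stable steps) are a generic illustration of the technique, not Chainer's actual implementation:

```python
import math

# Generic dynamic loss-scaling sketch (not Chainer's actual implementation).
# Gradients are computed against a scaled-up loss so small values survive
# half precision; on overflow the step is skipped and the scale is reduced.

class DynamicLossScaler:
    def __init__(self, scale=2.0 ** 15, growth_interval=2000):
        self.scale = scale
        self.growth_interval = growth_interval  # steps between scale increases
        self._good_steps = 0

    def update(self, grads):
        """Return unscaled grads, or None if overflow forces a skipped step."""
        if any(math.isinf(g) or math.isnan(g) for g in grads):
            self.scale /= 2.0        # overflow detected: back off
            self._good_steps = 0
            return None
        self._good_steps += 1
        if self._good_steps >= self.growth_interval:
            self.scale *= 2.0        # stable for a while: try a larger scale
            self._good_steps = 0
        return [g / self.scale for g in grads]

scaler = DynamicLossScaler(scale=4.0, growth_interval=2)
print(scaler.update([float('inf')]))   # None: overflow, scale drops to 2.0
print(scaler.update([8.0]))            # [4.0]: gradient unscaled by 2.0
```

Keeping the scale as large as overflow allows is what avoids underflow: gradients that would round to zero in half precision stay representable until they are divided back down in single precision.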
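Test tools of this kind typically validate an analytically computed backward pass against a numerical gradient. A minimal sketch of that core check in plain Python follows; the function names and tolerances are hypothetical, and this is the generic technique rather than Chainer's actual testing API:

```python
# Minimal numerical gradient check, the core idea behind function test
# tools like the one described above (generic sketch, not Chainer's API).

def numerical_grad(f, x, eps=1e-6):
    # Central difference: (f(x + eps) - f(x - eps)) / (2 * eps)
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

def check_backward(f, grad_f, x, atol=1e-4):
    """Compare an analytic gradient against a numerical one."""
    expected = numerical_grad(f, x)
    actual = grad_f(x)
    assert abs(expected - actual) < atol, (expected, actual)

# Forward: f(x) = x**2, analytic backward: f'(x) = 2*x
check_backward(lambda x: x * x, lambda x: 2.0 * x, 3.0)
print("gradient check passed")
```

Generating such checks automatically for forward, backward, and second-order differentials is what lets a test tool cover a new function or link with minimal hand-written code.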


PFN will continue improving Chainer’s performance and expanding its backends. By making ChainerX easier to use and supporting more arithmetic operations, it will contribute to improved performance in a wide range of use cases.

Chainer has incorporated a number of contributions from external developers. PFN will continue to quickly adopt the results of the latest deep learning research and to promote the development and popularization of Chainer in collaboration with supporting companies and the OSS community.


  • About the Chainer(TM) Open Source Deep Learning Framework

Chainer is a Python-based deep learning framework developed and provided by PFN. Its unique features and powerful performance, enabled by its “Define-by-Run” approach, allow complex neural networks to be designed easily and intuitively. Since it was open-sourced in June 2015, Chainer has become one of the most popular frameworks, attracting not only the academic community but also many industrial users who need a flexible framework to harness the power of deep learning in their research and real-world applications.

Chainer quickly incorporates the results of the latest deep learning research. With additional packages such as ChainerRL (reinforcement learning), ChainerCV (computer vision), and Chainer Chemistry (a deep learning library for chemistry and biology), and through the support of Chainer development partner companies, PFN aims to promote the most advanced research and development activities of researchers and practitioners in each field. (http://chainer.org/)

Preferred Networks appoints Professor Yarin Gal of the University of Oxford as a Technical Advisor

Preferred Networks, Inc. (PFN, HQ: Chiyoda-ku, Tokyo, President & CEO: Toru Nishikawa) has appointed Yarin Gal, Associate Professor at the University of Oxford, as a technical advisor, effective March 1, 2019.

Professor Gal is a pioneering researcher in Bayesian deep learning and its industrial applications. He has made notable research contributions in Bayesian statistics, approximate Bayesian inference, and Bayesian deep learning, and has also worked on applications such as computer vision, AI safety, and machine learning interpretability.

As a technical advisor, Professor Gal will provide technical advice and supervision to PFN researchers, jointly advancing research on generative models for vision tasks and on uncertainty in real-world modeling.

Associate Prof. Yarin Gal

  • Biography

Yarin Gal is an Associate Professor of Machine Learning in the Department of Computer Science at the University of Oxford, a Tutorial Fellow in Computer Science at Christ Church, Oxford, and a Turing Fellow at the Alan Turing Institute. Prior to taking his position at Oxford, he was a Research Fellow in Computer Science at St Catharine’s College, Cambridge. He obtained his PhD from the Cambridge machine learning group, working with Zoubin Ghahramani and funded by the Google Europe Doctoral Fellowship.