Workshop Scope

Following their success in industrial applications such as massive-scale image classification, a variety of stochastic inference and statistical learning methods have gained momentum in many fields of experimental and computational science. The large data sets collected during scientific experiments and computational simulations continue to overwhelm even the largest supercomputing installations, both in their size and in the processing capacity they demand. Modern data analytics methods provide a wide range of possibilities to tackle these issues effectively and, with the help of HPC, at large system scales.

The SDASC workshop aims to bring together new developments and applications of large-scale data analytics in modern scientific applications and libraries. It will offer a broad view of the current state of the art in deep learning, hyper-parameter search, and data representation, combined with parallelization and HPC techniques built on modern supercomputing software and hardware stacks, in service of computational science. In other words, the workshop will feature HPC-enabled statistical learning applied to scientific data sets and large computational simulations, along with the related integration efforts across a number of scientific disciplines.