Contributing

We encourage contributions that present original, unpublished research. Submissions (up to 12 pages) should adhere to the single-column LNCS style (see Springer’s website). Topics of interest include, but are not limited to, the following areas:

  • Incorporating real-time and ad hoc data analytics into applications and their deployment on supercomputing and cluster platforms
  • Scientific data set creation, ingest, curation, and analysis with stochastic approaches
  • Computational steering through machine learning models
  • Metadata and data metrics collection and generation for large data collections and output data sets
  • Multi-precision inference methods and their use on modern hardware for simulation data
  • Novel use of discriminative and generative machine learning approaches for scientific data sets including reinforcement learning
  • Modern HPC storage issues when dealing with integration of computational simulation outputs with data analytics software
  • Synchronous and asynchronous learning approaches for methods related to neural network training, stochastic gradient descent, loss-function engineering, etc.
  • Model derivation and training for scalable simulations and data sets
  • Deployment of statistical models and their implementations (e.g., TensorFlow, (Py)Torch, Caffe 1/2, Keras), and their integration with large-scale simulations through containers (Kubernetes, Docker, Singularity, OpenShift), virtualization, co-location, etc.

The authors of accepted contributions will receive a 15-minute time slot at the workshop to present their work. Revised versions of accepted papers will be published as post-conference proceedings. This year, the ISC workshop chairs are organizing joint workshop proceedings, published with Springer in a manner similar to the ISC 2019 research paper proceedings. Because the proceedings appear after the conference, we will collect preliminary versions of the papers and make them available to workshop attendees during the workshop.