Call for Papers: IISWC 2021

IISWC invites manuscripts that present original unpublished research in all areas related to characterization and analysis of computing system workloads, including translational research related to production-oriented commercial systems.

Deadlines

  • Submission deadline: July 9, 2021 (11:59 pm EDT)
  • Decision notification: September 3, 2021
  • Camera-ready deadline: October 8, 2021

New in 2021: Artifact Evaluation

This year, IISWC will include an artifact evaluation process to promote the reproducibility of experimental results. Authors of accepted IISWC papers will be invited to submit their supporting materials to the artifact evaluation process, which will assess how well the artifacts support the work described in the papers. Submission is voluntary and will not influence the final decision regarding acceptance of the paper. The description of the artifact will not count toward the page limit. The artifact submission deadline will be shortly after the notification of acceptance, so authors should prepare in advance to ensure sufficient time for artifact assembly and documentation. More details on artifact evaluation will be made available to the authors of accepted papers.

Submission Guidelines

Submissions to IISWC can be made in one of two categories: (1) regular papers, and (2) tool and benchmark papers. The primary focus of “regular papers” (submission length: 10 pages, excluding references) should be to describe new research ideas supported by experimental implementation and evaluation. The primary focus of “tool and benchmark papers” should be to describe the design, development, and evaluation of new open-source tools or benchmark suites. Authors submitting in the “regular papers” category are also encouraged to open-source their software or hardware artifacts.

Authors are required to indicate the category of the paper as part of the submitted manuscript’s title. The last line of the title should indicate the paper type using one of the following two phrases: (1) Paper Type: Regular, or (2) Paper Type: Tool / Benchmark.

Shorter papers (6 pages) in the tool and benchmark category are welcome, provided the contributions are well articulated and substantiated. All submissions in this category, however, may use the full 10 pages (excluding references).

Submissions in both categories will be evaluated to the same standards of novelty, scientific value, demonstrated usefulness, and potential impact on the field. The nature of the contribution differs between the two categories (a new research idea vs. a new open-source benchmark suite or tool), and papers will be evaluated according to the intended nature of the contribution, as declared by the category chosen at submission time. The chosen category cannot be changed after the submission deadline.

Double-blind submission guidelines apply to submissions in both categories.

Open-source benchmarks and tools that have not been previously published (but may already have been open-sourced) are eligible for submission in the “tool and benchmark papers” category. Even when the benchmark suite or tool is already in use in the community, the authors should make a good-faith effort to adhere to the double-blind submission guidelines. Authors should have obtained legal permission (if applicable) to open-source the benchmark suite or tool at the time of submission.

Topics of Interest

Characterization of applications in domains including

  • Life sciences, bioinformatics, scientific computing, finance, forecasting
  • Machine learning, data analytics, data mining
  • Cyber-physical systems, pervasive computing, and Internet of Things (IoT)
  • Security and privacy-preserving computing
  • Quantum computing
  • High performance computing
  • Cloud and edge computing
  • Mobile computing
  • User behavior and system-user interaction
  • Search engines, e-commerce, web services, and databases
  • Embedded, multimedia, real-time, 3D-graphics, gaming
  • Blockchain services

Emerging workloads and architectures, such as

  • Quantum computation and communication
  • Serverless computing
  • Near-threshold computing
  • Non-volatile memory
  • Near-data processing architectures
  • Neuromorphic and brain-inspired computing
  • Artificial intelligence and transactional memory workloads

Characterization of OS, Virtual Machine, middleware and library behavior, including

  • Virtual machines, .NET, Java VM, databases
  • Graphics libraries, scientific libraries
  • Operating system and hypervisor effects and overheads

Implications of workloads in system design, such as

  • Power management, reliability, security, privacy, performance
  • Processors, memory hierarchy, I/O, and networks
  • Design of accelerators, FPGAs, GPUs, CGRAs, etc.
  • Large-scale computing infrastructures and facilities

Benchmark methodologies and suites, including

  • Representative benchmarks for emerging workloads
  • Benchmark cloning methods
  • Profiling, trace collection, synthetic traces
  • Validation of benchmarks

Measurement tools and techniques, including

  • Instrumentation methodologies for workload verification and characterization
  • Techniques for accurate analysis/measurement of production systems
  • Analytical and abstract modeling of program behavior and systems