Computer Architecture Today

Informing the broad computing community about current activities, advances and future directions in computer architecture.

TL;DR

The ASPLOS Steering Committee (SC) is soliciting feedback on the new review process introduced with ASPLOS 2023. The SC welcomes input on the impact of the new process and on changes being contemplated for ASPLOS 2026. Please read this whole post, then complete this online survey by Nov 8, 2024.

Growing Pains

In 2021, the ASPLOS SC solicited input on a proposal to introduce two changes to the review process: multiple review cycles per year and a major revision process.  At the time the community was overwhelmingly in favor and the proposal was adopted starting with ASPLOS 2023.  Since then, the number of submissions to ASPLOS has more than doubled, from an average of 408 per year between 2019 and 2022 to 597 submissions in 2023, 922 in 2024, and 919 in 2025.  While submissions have also increased in some related conferences (HPCA, MICRO), other related conferences (ISCA, OSDI, SOSP, PLDI) have seen relatively stable submission rates over the same period.

The growth in submissions has led to challenges. One is recruiting a sufficiently large committee to keep reviewer load manageable while ensuring each submission receives sufficient review.  This can be better appreciated by comparing the number of PC invitations sent with the number accepted: for ASPLOS 2019, out of 89 invitations sent, 52 (or 58%) were accepted, a rate that was similar for ASPLOS 2021 (60%).  For ASPLOS 2023, the first year of the changes, the rate stayed similar (59%). However, for ASPLOS 2024 only 224 out of 460 invitations (49%) were accepted, and for ASPLOS 2025 only 204 of 451 invitations (45%) were accepted.  Historically, PC chairs employ some combination of existing service track records and professional social networks when identifying candidate committee members.  As noted in the ASPLOS 2024 volume 1 PC chairs’ message regarding sending 460 PC invitations, “It will be an understatement to say that we do not know so many potentially trustworthy reviewers”.

Quality is hard to define, let alone measure precisely, but another concern raised by recent ASPLOS PC chairs is that, anecdotally, review and discussion quality may have declined.  Thus, one goal of the survey is to provide a better understanding of the impact of the changes on the quality of the review process, and to do this from both author and reviewer perspectives.

Refine, Revert, Reinvent?

Many approaches to addressing the challenges have been suggested by ASPLOS SC members and current/incoming Chairs.  Proposals range from refinements to the multi-cycle review process, to reverting to a single deadline, to introducing an editorial system involving early rejection of 50% of submissions based upon an evaluation of an extended abstract by one or two “seasoned reviewers”.  Thus, a second goal of the survey is to solicit the views of the ASPLOS community to help inform the SC’s decision-making process going forward, including how ASPLOS 2026 will be run.

Potential refinements to the review process could include: (i) restricting submissions to those that are interdisciplinary between two or more of architecture, programming languages and operating systems; (ii) finding more systematic ways of expanding the pool of reviewers, such as requiring well-qualified authors submitting to a given cycle to volunteer to serve as reviewers; (iii) introducing a cap on the number of submissions per author; and (iv) improving the quality of reviews by better incentivizing good behavior and/or discouraging bad behavior.  For example, the ASPLOS 2024 PC chairs’ data across all three submission cycles suggests around 59% of submissions to ASPLOS 2024 could be categorized as ‘non-interdisciplinary’.  Thus, one approach would be to strictly enforce the requirement that papers be interdisciplinary.

Reverting to a single deadline is viewed as feasible by some, as the number of submissions to the Summer cycle is similar to that of single-deadline conferences such as ISCA.  One hypothesis supporting this approach is that the increase in submissions is mainly a result of opportunistic resubmissions of rejected papers from other systems conferences to the Spring and Fall cycles.  Returning to the traditional deadline would force authors to decide earlier whether their submission is a good fit for ASPLOS.  With a single submission deadline, perhaps it would be possible to have a small enough PC to enable a return to a synchronous PC meeting (online or in person), which may help review quality.

Finally, viewing the transition to multiple review cycles as a first step towards a journal-like review process, some SC members favor adopting scaling strategies employed by prestigious journals such as Nature and Science, where an editor reviews a cover letter to decide whether a submission should be sent for detailed peer review.

Initial Expectations, Revisited

Back in 2021, the ASPLOS SC put forward seven hypotheses about the advantages of the changes to multiple cycles and major revisions, but allowed that they “need to be measured and evaluated”.  Below we revisit each of the hypotheses in turn, along with recent data, to provide additional context to those filling out the survey.

The first hypothesis was that “The workload of the PC will not increase significantly, although the duration of its work will extend over more of the year”.  While the number of papers assigned per reviewer has remained roughly in line with prior years, the perception of the workload is different. For example, the ASPLOS 2023 program chairs surveyed their committee and found that “78% of our PC members felt that service on the ASPLOS’23 was either ‘A little more work’ or ‘Far more work’ than previous ASPLOS PCs”. To address this, ASPLOS 2024 and 2025 recruited larger program committees (224 PC members in 2024 versus 57 in 2020).  The perception that workload is higher may result from the longer time period, the variability in workload versus expectations, or some other factor.  There is concern that the increased PC size required to manage the load is leading to lower review quality.

The second hypothesis was that “Three smaller PC meetings per year will be possible.”  While some communities moved to online PC meetings earlier out of concern over environmental impact, with COVID-19 and improved video-conferencing technology online meetings became standard.  While conferences have returned to in-person gatherings, PC meetings have remained online for many conferences.  For ASPLOS 2023, PC meetings were held online.  As the PC grew in size, coordinating meeting times became more difficult, and the ASPLOS 2024 program chairs moved paper decision making entirely to asynchronous online discussions, which is also the process for ASPLOS 2025.  We achieved three review cycles per year, but the current practice seems to have diverged from what was originally envisioned.

The third hypothesis was that “Spreading the reviewing over the year permits higher quality reviews and discussion.”  As noted, quality is hard to define or quantify.  All else being equal, the average number of comments per paper may give some indication of how engaged reviewers are. This metric grew from an average of 8.3 across ASPLOS 2019, 2020 and 2021 to 13.8 in 2023, 14.5 in 2024, and 13.0 for the Spring review cycle of 2025.  Given that decision making moved entirely to asynchronous online discussion in 2024, one might have expected a larger increase.

The fourth hypothesis was that “The submissions will be spread among the three deadlines without too great of an imbalance. Authors will not all rush to submit at the first deadline.”  On average, the Spring cycle has received 14% of submissions, the Summer cycle 49%, and the Fall cycle the remaining 37%.  While the relative popularity of the different cycles was similar in all three years, the imbalance between cycles has not been stable year to year: the load imbalance was 30% in 2023, 26% in 2024, and 47% in 2025.  Here load imbalance is defined as the maximum number of submissions to a cycle minus the minimum number of submissions to a cycle in the same year, divided by the total submissions that year.
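For readers who want the metric spelled out, the following is a minimal sketch of the calculation in Python; the per-cycle counts used here are purely illustrative and are not the actual ASPLOS figures.

```python
def load_imbalance(cycle_submissions):
    """Load imbalance for one year: (largest cycle - smallest cycle) / total submissions."""
    total = sum(cycle_submissions)
    return (max(cycle_submissions) - min(cycle_submissions)) / total

# Illustrative Spring, Summer, Fall counts (hypothetical, not actual ASPLOS data)
example_year = [130, 450, 340]
print(f"{load_imbalance(example_year):.0%}")  # prints 35%
```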

The fifth hypothesis was that “Multiple deadlines will encourage authors to submit finished work that is easier to evaluate and more likely to be accepted.” As noted above, the first submission cycle has been the least popular, but this may only reflect the relative timing of other conference deadlines.  As part of our survey we ask authors to rate their own behavior, and reviewers to rate their perceptions of others’ behavior, with regard to this hypothesis.

The sixth hypothesis was that “The Revise option will improve the conference and reduce reviewing randomness by providing authors of papers close to acceptance with an opportunity to address PC concerns directly by providing additional work and experiments.”  The data from 2023 shows that 77% of major revisions for that year were ultimately accepted, while for 2024 the figure was 78%.  Reviewers from ASPLOS 2023 “felt that papers were significantly improved through the major revision process”.

The seventh and final hypothesis was that “The additional work of evaluating the revisions will not be a significant burden on the PC because it starts from the context of its previous effort and its specific questions and requests.”  Survey data collected from reviewers for ASPLOS 2023 suggested otherwise, and the PC chairs for ASPLOS 2024 estimated the overhead of revisions to be about 20%.  We hope to collect more input from reviewers on their personal experience of the overhead of major revisions via the survey.

About the Authors: The ASPLOS Steering Committee currently includes Tor Aamodt, Nael Abu-Ghazaleh, Babak Falsafi, Michael Ferdman, Rajiv Gupta, Natalie Enright Jerger, Shan Lu, Madan Musuvathi, Andrew Myers, KyoungSoo Park, Michael Swift, Dan Tsafrir, Thomas Wenisch.

Disclaimer: These posts are written by individual contributors to share their thoughts on the Computer Architecture Today blog for the benefit of the community. Any views or opinions represented in this blog are personal, belong solely to the blog author and do not represent those of ACM SIGARCH or its parent organization, ACM.