Evolutionary Computation and Explainable AI
Webpage: https://ecxai.github.io/ecxai/
Description
In recent years there has been increasing interest in the intersection between evolutionary computation and explainable AI. Starting from a discussion group at GECCO 2021, activity in the EC community around this topic has grown. The proposers of this ECXAI workshop have organised previous iterations at GECCO 2022-2025. Each of these sessions was exceptionally well-attended, with 50-100 delegates present physically and 10-20 digitally at each iteration. At each, additional informal meet-ups have attracted 15-20 attendees. Papers accepted for presentation at each workshop numbered 7 (2022), 3 (2023), 5 (2024), and 6 (2025).
This workshop aims to continue to raise the profile of this important and novel area at GECCO 2026 through discussion and demonstration. In line with previous workshops, we intend to invite a keynote speaker. The ECXAI workshop is part of a wider effort by the organisers to raise awareness in the wider EC community, most recently including a TELO special issue devoted to the topic. More information on this effort, including details of prior ECXAI workshops, can be found here: https://ecxai.github.io/ecxai/
‘Explainable AI’ (XAI) is an umbrella term covering research on methods designed to provide human-understandable explanations of the decisions made and knowledge captured by AI models; it is currently a very active research area within the AI field. Evolutionary Computation (EC) draws on concepts found in nature to drive development in evolution-based systems such as genetic algorithms and evolution strategies. In these, as in other nature-inspired metaheuristics such as swarm intelligence, the path to a solution is driven by stochastic processes. This creates barriers to explainability: algorithms may return different solutions when re-run from the same input, and technical descriptions of these processes often hinder end-user understanding and acceptance. Conversely, XAI methods very often require the fitting of some kind of model, and hence EC methods have the potential to play a role in this area. This workshop will focus on the bidirectional interplay between XAI and EC: that is, it will discuss how XAI can help EC research and how EC can be used within XAI methods.
Recent growth in the adoption of black-box solutions, including EC-based methods, in domains such as medical diagnosis, manufacturing, and transport & logistics has led to greater attention being paid to generating explanations and to their accessibility for end-users. This increased attention has helped create a fertile environment for applying XAI techniques in the EC domain, for both end-user- and researcher-focused explanation generation. Furthermore, many approaches to XAI in machine learning are based on search algorithms (e.g., Local Interpretable Model-Agnostic Explanations / LIME) that have the potential to draw on the expertise of the EC community. Finally, many of the broader questions (such as what kinds of explanations are most appealing or useful to end users) are faced by XAI researchers in general.
From an application perspective, important questions have arisen for which XAI may be crucial: Is the system biased? Has the problem been formulated correctly? Is the solution trustworthy and fair? The goal of XAI and related research is to develop methods to interrogate AI processes with the aim of answering these questions. This can support decision-makers while also building trust in AI decision-support through more readily understandable explanations.
Submission format
Full papers and extended abstracts:
- Full papers (8 pages + references): Must cover the ACM Open APC (see below for more information)
- Extended Abstracts (up to 4 pages): Not subject to an APC; no fee is paid by the authors for ACM Open Access. An Extended Abstract provides a summary of a work-in-progress, typically just enough for readers to understand the idea, scope, and potential impact. It often lacks full methodology, detailed results, or extensive references.
Important dates
- Submission opening: February 2, 2026
- Submission deadline: April 3, 2026 (extended from March 27, 2026)
- Notification: April 24, 2026
- Camera-ready: May 5, 2026
- Authors' mandatory registration: May 11, 2026
ACM's new Open Access publishing model for 2026 ACM Conferences
Starting January 1, 2026, ACM will fully transition to Open Access. All ACM publications, including those from ACM-sponsored conferences, will be 100% Open Access. Authors will have two primary options for publishing Open Access articles with ACM: the ACM Open institutional model or by paying Article Processing Charges (APCs). With over 2,600 institutions already part of ACM Open, the majority of ACM-sponsored conference papers will not require APCs from authors or conferences (currently, around 76%).
Authors from institutions not participating in ACM Open will need to pay an APC to publish their papers, unless they qualify for a financial waiver. To find out whether an APC applies to your article, please consult the list of participating institutions in ACM Open and review the APC Waivers and Discounts Policy. Keep in mind that waivers are rare and are granted based on specific criteria set by ACM.
Understanding that this change could present financial challenges, ACM has approved a temporary subsidy for 2026 to ease the transition and allow more time for institutions to join ACM Open. The subsidy will offer:
- $250 APC for ACM/SIG members
- $350 APC for non-members
This represents a 65% discount, funded directly by ACM. Authors are encouraged to help advocate for their institutions to join ACM Open during this transition period.
This temporary subsidized pricing will apply to all conferences scheduled for 2026.
Additionally, SIGEVO will provide a further subsidy of $125 for papers accepted to GECCO 2026 (and only for 2026) that are subject to APCs, bringing the final amounts to be paid to:
- $125 (USD) for SIGEVO members
- $225 (USD) for non-members
It is IMPORTANT to note that both forms of subsidy (by ACM and by SIGEVO) apply only to GECCO 2026. Moreover, it is still to be determined how the SIGEVO subsidy will be implemented, whether applied directly to the APC or in another form.
Finally, we note that APCs apply only to accepted Full Papers; Abstracts (1-2 pages), Extended Abstracts (1-4 pages), and Tutorials are NOT APC-eligible, i.e., no APC has to be paid for these types of contributions.
ACM Authorship and Peer Review Policies on Generative AI
GECCO follows the official ACM policies on authorship and peer review, including the use of generative AI tools.
Under ACM's Authorship policy, generative AI tools and technologies cannot be listed as authors of an ACM published Work. The use of generative AI tools and technologies for assistance must be fully disclosed in the manuscript's Acknowledgments section. Authors are fully accountable for the originality, accuracy, and integrity of all submitted material.
In accordance with ACM's Peer Review policy, reviewers must not upload or share submitted manuscripts or review materials with generative AI systems. Reviewers may use generative AI tools solely for the purpose of improving the quality and readability of their review reports for the authors.
ACM is actively developing tools to help identify improper AI use in submissions, and GECCO may employ available detection methods. Submissions found to violate ACM policies may be rejected.
Organizers