Open Optimization Competition 2021
Competition and Benchmarking of Sampling-Based Optimization Algorithms


November 18: Alea iacta est! Our nine jury members have selected the three winning entries. Congratulations to the three winners!
A short summary of these entries is available in the SIGEVOlution newsletter.
We thank all participants for their diverse contributions.


In order to promote research in black-box optimization, we organize a competition around Nevergrad and IOHprofiler.
The competition has two tracks:
  • Track 1: Performance-Oriented Track: Contributors submit an optimization algorithm as a pull request to Nevergrad, as detailed below. Several subtracks (“benchmark suites”) are available, covering a broad range of black-box optimization scenarios: from discrete via mixed-integer to continuous optimization, from “artificial” academic functions to real-world problems, and from one-shot settings via sequential optimization to parallel settings (see the sketch after this list).

  • Track 2: General Benchmarking Practices: Contributions are made by pull request to Nevergrad or to IOHprofiler, together with documentation (which can be very short) posted somewhere accessible (a homepage, arXiv, a pointer to a paper, etc.). This track invites contributions to all aspects of benchmarking black-box optimization algorithms: suggesting new benchmark problems, performance measures, or statistics; visualizing benchmark data; extending or improving the functionality of the benchmarking environment; and so on.
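To illustrate the kind of problems the suites cover, here is a minimal sketch of a mixed-integer problem expressed with Nevergrad's public parametrization API. The objective function is a toy example of ours, not part of any official suite:

    # A minimal sketch; requires the nevergrad package.
    import nevergrad as ng

    def toy_objective(x: float, n: int, algo: str) -> float:
        # Mixed-integer toy loss: continuous x, integer n, categorical algo.
        penalty = 0.0 if algo == "fast" else 0.5
        return (x - 1.0) ** 2 + abs(n - 3) + penalty

    # Parametrization mixing continuous, integer, and categorical variables.
    parametrization = ng.p.Instrumentation(
        x=ng.p.Scalar(lower=-5.0, upper=5.0),
        n=ng.p.Scalar(lower=0, upper=10).set_integer_casting(),
        algo=ng.p.Choice(["fast", "accurate"]),
    )

    optimizer = ng.optimizers.NGOpt(parametrization=parametrization, budget=100)
    recommendation = optimizer.minimize(toy_objective)
    print(recommendation.kwargs)  # best (x, n, algo) found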

Jury Members

We thank our nine award committee members, who had the difficult task of selecting the winning entries of the Open Optimization Competition 2021.

Deadlines (for both tracks)

Submission deadline: all submissions made between January 1 and September 30, 2021, that were not co-authored by members of the organizing team automatically participated in the competition.

Hosting Events

The Open Optimization Competition 2021 was part of the following two events:
  • The IEEE Congress on Evolutionary Computation, CEC 2021, June 28-July 1 (online)
  • The ACM/SIGEVO Genetic and Evolutionary Computation Conference, GECCO 2021, July 10-14 (online)

How to participate

How to participate in Track 1: Performance-Oriented Track

  • Participants submit as many pull requests as they want.
  • Maintainers of the Nevergrad platform decide what is merged and what is not. Arguments justifying why the code is worth merging are welcome in the pull request text. Sending dozens of pull requests with variants of the same or similar algorithms is not necessarily welcome. (A sketch of the optimizer interface that submitted algorithms plug into follows this list.)
  • The Nevergrad maintainers also maintain the dashboard. They do their best to run the algorithms frequently enough to keep the dashboard up to date. However, this can introduce noise into the competition; life is sometimes unfair, and we do our best. Please note that pull requests do not have to mention author names. People can run their own experiments; however, only the runs by the Nevergrad maintainers, shown on the dashboard, are taken into account at the end. The maintainers can also decide to modify a benchmark suite if they feel that something must be changed.
  • A committee decides who has won. For each track, the committee might simply use the dashboard, but it can decide to exclude a competitor for any reason. The committee is independent of the Nevergrad team and has access to the pull requests and the full history of the platform.
  • Some tracks might not have a winner if the best algorithms were integrated before the present competition.
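As referenced above, here is a minimal sketch of the public ask/tell loop that every Nevergrad optimizer exposes and that benchmark runs exercise. "OnePlusOne" stands in for any registered algorithm, and the sphere function is a toy objective of ours, not an official benchmark:

    # A minimal sketch of Nevergrad's public ask/tell interface.
    import nevergrad as ng

    def sphere(x):
        # Toy black-box objective.
        return float(sum(xi ** 2 for xi in x))

    # Look up an optimizer by name in the registry, as the benchmark
    # machinery does, and optimize a 2-dimensional array parameter.
    opt = ng.optimizers.registry["OnePlusOne"](parametrization=2, budget=50)
    for _ in range(opt.budget):
        candidate = opt.ask()            # optimizer proposes a point
        loss = sphere(candidate.value)   # black-box evaluation
        opt.tell(candidate, loss)        # feed the observed loss back
    print(opt.provide_recommendation().value)

Any algorithm merged into Nevergrad becomes available through this same registry, which is how the dashboard runs pick it up.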

How to participate in Track 2: General Benchmarking Practices Track

Suggested topics: There is absolutely no restriction on what to submit. All submissions that contribute to building and establishing open-source, user-friendly, and community-driven platforms for comparing different optimization techniques are welcome.

How to participate:
  • Write documentation for your contribution and make it available online (there are no formal requirements on where or how to publish it, nor on its length; arXiv and/or a homepage is perfectly fine, and one page can suffice).
  • Open a pull request, provide the link to the documentation, and mention that you are participating in the paper track (Track 2) of the present competition.
Evaluation Criteria: A dedicated committee chooses the winning entries, similar to how a best paper award is chosen at a conference.
Our policy regarding possible conflicts of interest: award committee members cannot nominate as winner a person with whom they have worked directly during the previous 12 months. There is no restriction concerning coworkers of other committee members. Submissions made by members of the organizing committee or by employees of Facebook cannot be awarded; their coworkers can be awarded, but only for work that involves neither organizers nor Facebook employees.


The best way to reach us is through the Facebook group.
Alternatively, you can reach us by email: please send all inquiries to Carola and to Olivier, who will coordinate your request.


The competition is organized by the Nevergrad and IOHprofiler teams. The main contact persons are:
  • Olivier Teytaud, Facebook Artificial Intelligence Research, Paris, France
  • Jeremy Rapin, Facebook Artificial Intelligence Research, Paris, France
  • Carola Doerr, CNRS researcher at LIP6, Sorbonne University, Paris, France
  • Thomas Bäck, LIACS, Leiden University, The Netherlands

Previous Editions

2020 Open Optimization Competition

Page last modified: January 31, 2022.