Open Optimization Competition 2021
Competition and Benchmarking of Sampling-Based Optimization Algorithms


News


  • January 23: We are thrilled to announce the award committee members for Track 2 submissions (see the Committee Members section below).
  • January 1: All pull requests made between today and September 30 participate in the competition. Good luck to all participants. We look forward to your submissions!

Motivation


In order to promote research in black-box optimization, we organize a competition around Nevergrad and IOHprofiler.
The competition has two tracks:
  • Track 1: Performance-Oriented Track: Contributors submit an optimization algorithm as a pull request in Nevergrad, as detailed below. Several subtracks (“benchmark suites”) are available, covering a broad range of black-box optimization scenarios: from discrete through mixed-integer to continuous optimization, from “artificial” academic functions to real-world problems, and from one-shot settings through sequential optimization to parallel settings.

  • Track 2: General Benchmarking Practices: Contributions are made by pull request to Nevergrad or to IOHprofiler, accompanied by a short piece of documentation (it can be very short) posted somewhere online (a homepage, arXiv, a pointer to a paper, etc.). This track invites contributions to all aspects of benchmarking black-box optimization algorithms: for example, new benchmark problems, performance measures or statistics, visualizations of benchmark data, or extensions and improvements of the functionalities of the benchmarking environment.

Deadlines (for both tracks)


Submission deadline: submissions must be made before September 30, 2021.
The dashboard might still move after this deadline, as some experiments might still be running. The organizers also reserve the right to perform additional evaluations in case of ties or generalizability questions.
An intermediate hall of fame, listing the winners at that point in time, will be published:
  • At CEC 2021, June 28-July 1, for contributions received before May 31
  • At GECCO 2021, July 10, for contributions received before June 30
  • On the web pages of Nevergrad and IOHprofiler on November 1, 2021, for contributions received before September 30.

How to participate


How to participate in Track 1: Performance-Oriented Track

  • Participants submit as many pull requests as they want.
  • Maintainers of the Nevergrad platform decide what is merged and what is not. Arguments justifying why the code is worth merging are welcome in the pull request text. Sending dozens of pull requests with variants of the same or similar algorithms is not necessarily welcome.
  • The Nevergrad maintainers also maintain the dashboard. They do their best to run algorithms often enough to keep the dashboard up to date. However, this can introduce noise into the competition -- life is unfair sometimes, and we do our best. Please note that pull requests do not have to mention the names of their authors. Participants can run their own experiments (a minimal local-run sketch follows this list); however, only the runs performed by the Nevergrad maintainers, on the dashboard, are taken into account at the end. The maintainers can also decide to modify a benchmark suite if they feel that a change is needed.
  • A committee decides who has won. The committee might just use the dashboard for each subtrack, but it can also decide to exclude a competitor, for whatever reason it sees fit. The committee is independent of the Nevergrad team and has access to the pull requests and the full history of the platform.
  • Some subtracks might not have a winner if the best algorithms were already integrated before the present competition.
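Before opening a pull request, you can prototype and test an algorithm locally through Nevergrad's public ask-and-tell interface. Below is a minimal sketch in Python, assuming nevergrad is installed (pip install nevergrad); the sphere objective, the OnePlusOne baseline, and the budget of 200 evaluations are illustrative choices only, and such local runs do not replace the maintainers' official dashboard runs.

    # Local sanity check using Nevergrad's ask-and-tell loop.
    import nevergrad as ng

    def sphere(x):
        # Toy objective: squared Euclidean norm, minimized at the origin.
        return sum(xi ** 2 for xi in x)

    # 10-dimensional continuous search space, budget of 200 evaluations.
    optimizer = ng.optimizers.OnePlusOne(parametrization=10, budget=200)

    for _ in range(optimizer.budget):
        candidate = optimizer.ask()      # the optimizer proposes a point
        loss = sphere(candidate.value)   # we evaluate it
        optimizer.tell(candidate, loss)  # and report the loss back

    recommendation = optimizer.provide_recommendation()
    print("Best point found:", recommendation.value)

The same loop works with any optimizer registered in ng.optimizers, so a new algorithm can be compared against existing baselines before it is submitted.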

How to participate in Track 2: General Benchmarking Practices Track

Suggested topics: There is absolutely no restriction on what to submit. All submissions that contribute to building and establishing open-source, user-friendly, and community-driven platforms for comparing different optimization techniques are welcome.

How to participate:
  • Write documentation for your contribution and make it available online. There are no formal requirements on where or how to publish the documentation, nor on its length; arXiv and/or a homepage is perfectly fine, and a short one-page document can suffice.
  • Open a pull request, provide the link to the documentation, and mention that you are participating in Track 2 of the present competition.
Evaluation Criteria: A dedicated committee chooses the winning entries, much like a best paper award is chosen at a conference.
Our policy regarding possible conflicts of interest: Award committee members cannot propose as winner a person with whom they have worked directly during the previous 12 months; this restriction applies only to one's own coworkers, not to those of other committee members. Submissions made by members of the organizing committee or by employees of Facebook cannot be awarded. Their coworkers can be awarded (but only for work that involves neither organizers nor Facebook employees).

Committee Members:

Questions?


The best way to reach us is through the Facebook group.
Alternatively, you can reach us via email. Please send all inquiries to Carola and to Olivier, who will coordinate your request.

Organizers


The organizers of the competition are the Nevergrad and IOHprofiler teams. The main contact persons are:
  • Olivier Teytaud, Facebook Artificial Intelligence Research, Paris, France
  • Jeremy Rapin, Facebook Artificial Intelligence Research, Paris, France
  • Carola Doerr, CNRS researcher at LIP6, Sorbonne University, Paris, France
  • Thomas Bäck, LIACS, Leiden University, The Netherlands

Link to Previous Editions


2020 Open Optimization Competition

Page last modified: January 23, 2021.