<h1>COCO 2018 Keypoint Detection Task</h1>
<p><a href="images/keypoints-splash-big.png"><img src="images/keypoints-splash.png" class="wide"/></a></p>

<h1>1. Overview</h1>
<p>The COCO Keypoint Detection Task requires localization of person keypoints in challenging, uncontrolled conditions. The keypoint task involves simultaneously detecting people <i>and</i> localizing their keypoints (person locations are <i>not</i> given at test time). For full details of this task please see the <a href="#keypoints-eval">keypoint evaluation</a> page.</p>
<p>This task is part of the <a href="workshop/coco-mapillary-eccv-2018.html">Joint COCO and Mapillary Recognition Challenge Workshop</a> at ECCV 2018. For further details about the joint workshop please visit the workshop page. Please also see the related COCO <a href="#detection-2018">detection</a>, <a href="#stuff-2018">stuff</a>, and <a href="#panoptic-2018">panoptic</a> tasks.</p>
<p>The COCO train, validation, and test sets, together containing more than 200,000 images and 250,000 person instances labeled with keypoints (the majority of people in COCO at medium and large scales), are available for <a href="#download">download</a>. Annotations for train and val (with over 150,000 people and 1.7 million labeled keypoints) are publicly available.</p>
<p>This is the third iteration of the keypoint task, and it exactly follows the <a href="#keypoints-2017">COCO 2017 Keypoint Detection Task</a>. In particular, the same data, metrics, and guidelines are being used for this year's task.</p>
<p><b>Note: evaluation servers for the 2018 task will open soon. In the meantime, the 2017 servers may be used for testing.</b></p>

<h1>2. Dates</h1>
<div class="json">
  <div class="jsonktxt fontBlue">August 10, 2018</div><div class="jsonvtxt">Submission deadline (11:59 PST)</div>
  <div class="jsonktxt">August 26, 2018</div><div class="jsonvtxt">Challenge winners notified</div>
  <div class="jsonktxt">September 9, 2018</div><div class="jsonvtxt">Winners present at ECCV 2018 Workshop</div>
</div>

<h1>3. Organizers</h1>
<div>Tsung-Yi Lin (Google Brain)</div>
<div>Genevieve Patterson (MSR, Trash TV)</div>
<div>Matteo Ruggero Ronchi (Caltech)</div>
<div>Yin Cui (Cornell Tech)</div>
<div>Michael Maire (TTI-Chicago)</div>
<div>Piotr Dollár (Facebook AI Research)</div>

<h1>4. Award Committee</h1>
<div>Genevieve Patterson (MSR, Trash TV)</div>
<div>Matteo Ruggero Ronchi (Caltech)</div>
<div>Yin Cui (Cornell Tech)</div>
<div>Michael Maire (TTI-Chicago)</div>
<div>Serge Belongie (Cornell Tech)</div>
<div>Lubomir Bourdev (WaveOne, Inc.)</div>
<div>James Hays (Georgia Tech)</div>
<div>Pietro Perona (Caltech)</div>
<div>Deva Ramanan (CMU)</div>

<h1>5. Task Guidelines</h1>
<p>Participants are encouraged, but not required, to train their algorithms on the COCO 2017 train and val sets. The <a href="#download">download</a> page has links to all COCO 2017 data. The COCO test set is divided into two splits: test-dev and test-challenge. Test-dev is the default test set for testing under general circumstances and is used to maintain a public <a href="#keypoints-leaderboard">leaderboard</a>. Test-challenge is used for the workshop competition; results will be revealed at the workshop. When participating in this task, please specify any and all external data used for training in the "method description" when uploading results to the evaluation server. A more thorough explanation of all these details is available on the <a href="#guidelines">guidelines</a> page; please be sure to review it carefully prior to participating. Results in the correct <a href="#format-results">format</a> must be <a href="#upload">uploaded</a> to the <a href="https://competitions.codalab.org/competitions/12061" target="_blank">evaluation server</a>. The <a href="#keypoints-eval">evaluation</a> page lists detailed information regarding how results will be evaluated. Challenge participants with the most successful and innovative methods will be invited to present at the workshop.</p>
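<p>As a rough illustration of the expected results format, the snippet below sketches one per-person entry: each detected person carries 17 keypoints flattened into a 51-element <code>[x1, y1, v1, ..., x17, y17, v17]</code> list, plus a confidence score. The specific <code>image_id</code> and <code>score</code> values here are hypothetical placeholders; consult the <a href="#format-results">results format</a> page for the authoritative specification.</p>

```python
import json

# One detected person in the COCO keypoint results format.
# 17 keypoints, each an (x, y, v) triplet, flattened to 51 numbers.
# The v slot must be present but is ignored during evaluation;
# category_id 1 denotes "person".
result = {
    "image_id": 42,           # hypothetical image id
    "category_id": 1,
    "keypoints": [0.0] * 51,  # placeholder coordinates for illustration
    "score": 0.97,            # hypothetical detector confidence
}

# A full submission is a single JSON array of such entries.
submission = json.dumps([result])
```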

<h1>6. Tools and Instructions</h1>
<p>We provide extensive API support for the COCO images, annotations, and evaluation code. To download the COCO API, please visit our <a href="https://github.com/cocodataset/cocoapi">GitHub repository</a>. For an overview of how to use the API, please visit the <a href="#download">download</a> page. Due to the large size of COCO and the complexity of this task, the process of participating may not seem simple. To help, we provide explanations and instructions for each step of the process on the <a href="#download">download</a>, <a href="#format-data">data format</a>, <a href="#format-results">results format</a>, <a href="#guidelines">guidelines</a>, <a href="#upload">upload</a>, and <a href="#keypoints-eval">evaluation</a> pages. For additional questions, please contact <a href="mailto:info@cocodataset.org">info@cocodataset.org</a>.</p>
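<p>The keypoint metric at the heart of the evaluation code is Object Keypoint Similarity (OKS), which plays a role analogous to IoU in detection. The sketch below follows the formula used in the COCO evaluation code: each labeled keypoint contributes <code>exp(-d²/(2·s²·k²))</code>, where <code>d</code> is the prediction error, <code>s²</code> is the object's area, and <code>k</code> is a per-keypoint constant. This is an illustrative reimplementation, not a replacement for the official <code>pycocotools</code> evaluation.</p>

```python
import math

# Per-keypoint falloff constants (sigmas) used by the COCO keypoint
# evaluation; order follows the 17 COCO keypoints (nose, eyes, ears,
# shoulders, elbows, wrists, hips, knees, ankles).
KPT_SIGMAS = [0.026, 0.025, 0.025, 0.035, 0.035, 0.079, 0.079, 0.072,
              0.072, 0.062, 0.062, 0.107, 0.107, 0.087, 0.087, 0.089, 0.089]

def compute_oks(gt_kpts, dt_kpts, gt_area):
    """Object Keypoint Similarity between one ground-truth person and
    one detection.  Both inputs are flat [x1, y1, v1, ...] lists of
    length 51; only ground-truth keypoints with v > 0 contribute."""
    total, count = 0.0, 0
    for i, sigma in enumerate(KPT_SIGMAS):
        gx, gy, gv = gt_kpts[3 * i], gt_kpts[3 * i + 1], gt_kpts[3 * i + 2]
        if gv == 0:  # unlabeled keypoint: skipped
            continue
        dx = dt_kpts[3 * i] - gx
        dy = dt_kpts[3 * i + 1] - gy
        # variance term: (2 * sigma)^2, scaled by the object's area
        var = (2.0 * sigma) ** 2
        total += math.exp(-(dx * dx + dy * dy) / (2.0 * gt_area * var))
        count += 1
    return total / count if count else 0.0
```

<p>A perfect prediction scores an OKS of 1.0, and the score decays toward 0 as keypoints drift from the ground truth, with the per-keypoint sigmas making the metric more forgiving for inherently ambiguous keypoints such as hips than for precise ones such as eyes.</p>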
