{ "paper_id": "M93-1034", "header": { "generated_with": "S2ORC 1.0.0", "date_generated": "2023-01-19T03:14:20.130217Z" }, "title": "APPENDIX B : MUC-5 TEST SCORES", "authors": [], "year": "", "venue": null, "identifiers": {}, "abstract": "This appendix contains the summary score reports for each system on the MUC-5 evaluation. The reports are ordered by language/domain (EJV, JJV, EME, JME) and secondarily by site (in alphabetical order). Deeper investigation into system performance may be done on the basis of template-by-template scor e reports, which detail the performance of the system on each article in the test set. The template-by-template score reports are not included in this appendix. A brief introduction to reading the score reports is presented here. Some information on the performance metrics may also be found in the paper, \"Tipster/MUC-5 Information Extraction System Evaluation\" by B. Sundheim in this volume. Further description of the score report and further definition and explanation of the scoring categories and performance metrics may be found in this volume in the paper titled \"MUC-5 Evaluatio n Metrics\" by N. Chinchor and B. Sundheim. The first column in the report, SLOT, contains one of the following three types of label: A slot name, e.g ., conten t An object name, e .g .,