News
- Oct 31st, 2017: The results are now available.
- Oct 30th, 2017: The solutions are now available.
- Sept 8th, 2017: Update for Challenge 15 is available, but will not count in the evaluation.
- Sept 4th, 2017: Updated mailing list and submission information.
- Aug 23rd, 2017: The preliminary results have been sent out to participants and are now available.
- July 9th, 2017: We fixed the intensities in the TSV archive for challenges 046-243.
- June 22nd, 2017: We added Category 4 on a subset of the data files.
- May 22nd, 2017: We have improved challenges 29, 42, 71, 89, 105, 106 and 144.
- April 26th, 2017: The rules and challenges of CASMI 2017 are now public!
- Jan 20th, 2017: Organisation of CASMI 2017 is underway, stay tuned!
Results in Category 3
Summary of participant performance
Participant | F1 score | Mean rank | Median rank | Top | Top3 | Top10 | Misses | TopPos | TopNeg | Mean RRP | Median RRP | N
---|---|---|---|---|---|---|---|---|---|---|---|---
tkind1 | 3707 | 4.33 | 2.0 | 91 | 148 | 193 | 34 | 52 | 39 | 0.644 | 0.750 | 238
tkind2 | 1704 | 15.20 | 6.0 | 29 | 63 | 107 | 72 | 15 | 14 | 0.755 | 0.846 | 240
TsugawaYamamoto | 1559 | 19.99 | 7.0 | 28 | 56 | 104 | 64 | 16 | 12 | 0.841 | 0.900 | 243
Table legend:
- F1 score: The Formula 1 score awards points for each challenge based on the rank of the correct solution, following a scheme similar to the one used in Formula 1 racing. In the participant table these points are summed over all challenges. Note that the F1 score is therefore not necessarily comparable across categories. (A small illustrative sketch follows this legend.)
- Mean/Median rank: Mean and median rank of the correct solution. When the correct solution is tied with other candidates, the average rank of the tied candidates is used.
- Top, Top3, Top10: Number of challenges in which the correct solution is ranked first, within the top 3, and within the top 10, respectively.
- Misses: Number of challenges in which the correct solution is missing from the submission.
- TopPos, TopNeg: Number of top-ranked (rank 1) solutions in positive and negative ionization mode, respectively.
- Mean/Median RRP: The relative ranking position, which also takes the length of the candidate list into account.
- N: Number of submissions that passed the evaluation scripts.
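To make these metrics concrete, the following Python sketch computes the tied rank of the correct candidate, an RRP value, and F1-style points for a single challenge. It is a minimal illustration, not the official CASMI evaluation script: the points table (25, 18, 15, 12, 10, 8, 6, 4, 2, 1 for ranks 1-10), the tie handling, and the RRP convention (1 = first place, 0 = last place) are assumptions based on earlier CASMI contests.

```python
# Minimal illustration of the legend metrics for a single challenge.
# Not the official CASMI evaluation script: the points table, the tie
# handling and the RRP formula are assumptions based on earlier CASMI contests.

def rank_of_correct(scores, correct_idx):
    """Rank of the correct candidate; tied scores receive the average rank."""
    correct_score = scores[correct_idx]
    better = sum(s > correct_score for s in scores)
    tied = sum(s == correct_score for s in scores)  # includes the correct candidate
    return better + (tied + 1) / 2.0

def rrp(rank, n_candidates):
    """Relative ranking position (assumed convention: 1.0 = first, 0.0 = last)."""
    better = rank - 1                     # candidates ranked above the correct one
    worse = n_candidates - rank           # candidates ranked below
    return 0.5 * (1 - (better - worse) / (n_candidates - 1))

F1_POINTS = [25, 18, 15, 12, 10, 8, 6, 4, 2, 1]   # assumed Formula 1 points table

def f1_points(rank):
    """Points for one challenge; 0 beyond rank 10 (tie handling simplified)."""
    r = int(round(rank))
    return F1_POINTS[r - 1] if 1 <= r <= len(F1_POINTS) else 0

# Hypothetical example: the correct candidate (index 1) ties with one other candidate
scores = [0.9, 0.8, 0.8, 0.1]
r = rank_of_correct(scores, correct_idx=1)          # 2.5 (average of ranks 2 and 3)
print(r, rrp(r, len(scores)), f1_points(r))         # 2.5 0.5 18
```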
Summary of Rank by Challenge
For each challenge, the lowest rank among the participants is highlighted in bold. If a submission did not contain the correct candidate, this is denoted by "-". If a participant did not take part in a challenge, the table cell is empty. The tables can be sorted by clicking on a column header.

Challenge | tkind1 | tkind2 | TsugawaYamamoto
---|---|---|---
challenge-001 | 11.0 | 11.0 | 28.0 |
challenge-002 | 6.0 | 6.0 | 5.0 |
challenge-003 | 1.0 | 1.0 | 1.0 |
challenge-004 | 60.5 | 60.5 | 14.0 |
challenge-005 | 20.0 | 18.0 | 32.0 |
challenge-006 | 5.0 | 23.0 | 3.5 |
challenge-007 | 1.0 | 1.0 | 2.0 |
challenge-008 | 10.0 | 10.0 | 2.0 |
challenge-009 | 72.5 | 78.5 | 4.5 |
challenge-010 | - | - | - |
challenge-011 | - | - | - |
challenge-012 | 15.0 | 16.0 | 80.0 |
challenge-013 | - | 94.5 | 11.5 |
challenge-014 | 22.0 | 20.0 | 24.0 |
challenge-015 | 43.5 | 42.5 | 31.0 |
challenge-016 | 4.5 | 3.5 | 30.0 |
challenge-017 | 2.0 | 2.0 | 5.0 |
challenge-018 | 3.5 | 3.5 | 3.0 |
challenge-019 | - | - | 2.0 |
challenge-020 | 6.0 | 6.0 | 9.0 |
challenge-021 | - | - | - |
challenge-022 | 3.0 | ||
challenge-023 | - | - | - |
challenge-024 | 3.0 | 4.0 | 15.5 |
challenge-025 | - | - | - |
challenge-026 | 81.0 | 83.0 | 8.0 |
challenge-027 | - | - | - |
challenge-028 | - | - | - |
challenge-029 | 5.0 | 5.0 | 2.0 |
challenge-030 | - | - | 21.0 |
challenge-031 | 2.0 | 2.0 | 1.5 |
challenge-032 | 10.0 | 7.0 | 6.0 |
challenge-033 | - | - | - |
challenge-034 | 68.0 | 73.0 | 67.0 |
challenge-035 | 8.5 | 7.5 | 101.0 |
challenge-036 | 4.0 | 4.0 | 11.0 |
challenge-037 | 13.0 | 13.0 | 2.0 |
challenge-038 | 23.0 | 24.0 | 6.0 |
challenge-039 | 3.0 | 2.0 | 16.0 |
challenge-040 | 2.5 | 2.5 | 1.5 |
challenge-041 | - | - | - |
challenge-042 | - | ||
challenge-043 | - | - | - |
challenge-044 | - | - | - |
challenge-045 | - | - | - |
challenge-046 | 1.0 | 3.5 | 6.0 |
challenge-047 | - | 5.5 | 7.0 |
challenge-048 | 3.0 | 16.0 | 48.0 |
challenge-049 | 5.0 | - | 88.0 |
challenge-050 | - | 10.5 | 3.5 |
challenge-051 | 1.0 | - | - |
challenge-052 | 1.0 | - | - |
challenge-053 | 1.0 | 1.0 | 1.0 |
challenge-054 | 2.0 | 2.0 | 25.0 |
challenge-055 | 1.0 | 11.0 | |
challenge-056 | 7.0 | - | - |
challenge-057 | 1.0 | 1.0 | 1.0 |
challenge-058 | 5.0 | - | 11.0 |
challenge-059 | 1.0 | 1.5 | 5.0 |
challenge-060 | 4.0 | 55.0 | 61.5 |
challenge-061 | 4.0 | 16.5 | 19.0 |
challenge-062 | 2.0 | 9.0 | 9.0 |
challenge-063 | 2.0 | - | - |
challenge-064 | 1.0 | 2.0 | 3.0 |
challenge-065 | 1.0 | 1.0 | 5.0 |
challenge-066 | 4.0 | - | - |
challenge-067 | 2.0 | 40.0 | 44.5 |
challenge-068 | 2.0 | 2.0 | 2.0 |
challenge-069 | 1.0 | 1.5 | 1.0 |
challenge-070 | 3.0 | - | - |
challenge-071 | 1.0 | 2.0 | 25.0 |
challenge-072 | 3.0 | 3.0 | 12.0 |
challenge-073 | 1.0 | 1.0 | 1.0 |
challenge-074 | - | 1.0 | 1.0 |
challenge-075 | 3.5 | - | - |
challenge-076 | 4.0 | 10.0 | 37.5 |
challenge-077 | 7.0 | - | 88.5 |
challenge-078 | 4.0 | 33.5 | 54.0 |
challenge-079 | 2.0 | 32.5 | 23.0 |
challenge-080 | 1.0 | 15.5 | 5.0 |
challenge-081 | 1.0 | - | - |
challenge-082 | 2.5 | 30.0 | 59.0 |
challenge-083 | 3.5 | 6.5 | 4.0 |
challenge-084 | 1.0 | 20.0 | 48.0 |
challenge-085 | 1.0 | - | - |
challenge-086 | 1.0 | - | - |
challenge-087 | 3.0 | 32.5 | 22.0 |
challenge-088 | 3.5 | 7.0 | 5.0 |
challenge-089 | 1.0 | 1.0 | 11.0 |
challenge-090 | 1.0 | 1.5 | 1.0 |
challenge-091 | - | - | - |
challenge-092 | 1.5 | - | - |
challenge-093 | 1.0 | 2.0 | 13.0 |
challenge-094 | 4.5 | - | - |
challenge-095 | 3.0 | - | - |
challenge-096 | - | - | - |
challenge-097 | 1.0 | - | - |
challenge-098 | 1.0 | 1.0 | 1.0 |
challenge-099 | 1.0 | 2.5 | 9.0 |
challenge-100 | 3.0 | - | - |
challenge-101 | 1.0 | 17.5 | 10.0 |
challenge-102 | 2.5 | 13.0 | 16.5 |
challenge-103 | 1.5 | 5.0 | 1.5 |
challenge-104 | 5.0 | - | - |
challenge-105 | 2.0 | 28.5 | 47.5 |
challenge-106 | 1.0 | 3.5 | 6.5 |
challenge-107 | 1.5 | - | - |
challenge-108 | 1.0 | 7.0 | 33.0 |
challenge-109 | 1.0 | 20.5 | 15.5 |
challenge-110 | 1.0 | 1.0 | 1.0 |
challenge-111 | - | - | - |
challenge-112 | 1.0 | 11.0 | 9.0 |
challenge-113 | 3.0 | 5.0 | 2.0 |
challenge-114 | 2.0 | 9.0 | 4.0 |
challenge-115 | 1.0 | 1.0 | - |
challenge-116 | 1.0 | 1.0 | 1.0 |
challenge-117 | 1.0 | 1.0 | 7.0 |
challenge-118 | - | - | - |
challenge-119 | 1.0 | 1.0 | 10.0 |
challenge-120 | 1.5 | - | - |
challenge-121 | 1.0 | 1.5 | 3.0 |
challenge-122 | 1.0 | 2.0 | 16.0 |
challenge-123 | 5.0 | - | 96.0 |
challenge-124 | 1.0 | 8.5 | 2.0 |
challenge-125 | 2.0 | 23.0 | 23.0 |
challenge-126 | 1.0 | 1.5 | 1.0 |
challenge-127 | 1.0 | 1.0 | 1.0 |
challenge-128 | 1.0 | - | - |
challenge-129 | 1.0 | 1.0 | |
challenge-130 | - | - | |
challenge-131 | - | 9.0 | 8.0 |
challenge-132 | 1.0 | 2.5 | 2.5 |
challenge-133 | - | 6.0 | 4.0 |
challenge-134 | 3.0 | 78.0 | 56.0 |
challenge-135 | 6.0 | - | 221.0 |
challenge-136 | 1.0 | 4.0 | 2.5 |
challenge-137 | 1.0 | 4.0 | 4.0 |
challenge-138 | 3.0 | 42.0 | 147.0 |
challenge-139 | 1.0 | 1.5 | 3.0 |
challenge-140 | 1.0 | 3.0 | 48.5 |
challenge-141 | 1.0 | - | - |
challenge-142 | 2.0 | 9.0 | 9.0 |
challenge-143 | 5.0 | 89.0 | 75.5 |
challenge-144 | - | - | - |
challenge-145 | 2.0 | 10.0 | 29.0 |
challenge-146 | - | 12.5 | 4.0 |
challenge-147 | 2.0 | 23.0 | 17.5 |
challenge-148 | 1.0 | 1.0 | 7.5 |
challenge-149 | 1.0 | - | - |
challenge-150 | 1.0 | 2.0 | 2.0 |
challenge-151 | 1.0 | 2.0 | 12.0 |
challenge-152 | 1.0 | 3.0 | 3.0 |
challenge-153 | - | - | - |
challenge-154 | 1.0 | 1.0 | 1.0 |
challenge-155 | 1.0 | 5.0 | 3.0 |
challenge-156 | 6.5 | 56.5 | 77.0 |
challenge-157 | 6.0 | 84.5 | 156.0 |
challenge-158 | 3.0 | 14.0 | 16.0 |
challenge-159 | 7.0 | 41.0 | 27.0 |
challenge-160 | 1.0 | 2.0 | 2.0 |
challenge-161 | 5.0 | 36.0 | 71.0 |
challenge-162 | 2.0 | 7.0 | 66.0 |
challenge-163 | 1.0 | 1.0 | 3.0 |
challenge-164 | 4.0 | 47.5 | 36.0 |
challenge-165 | 1.0 | 1.0 | 2.0 |
challenge-166 | 2.0 | - | - |
challenge-167 | 1.0 | 3.5 | 13.0 |
challenge-168 | 4.0 | - | 96.0 |
challenge-169 | 1.0 | 1.5 | 1.0 |
challenge-170 | 1.0 | 54.5 | 2.5 |
challenge-171 | 3.0 | - | - |
challenge-172 | - | - | - |
challenge-173 | 1.0 | - | - |
challenge-174 | 1.0 | 7.0 | 9.0 |
challenge-175 | 4.0 | 7.0 | 1.0 |
challenge-176 | 4.0 | - | 114.0 |
challenge-177 | 1.0 | 2.0 | 1.0 |
challenge-178 | 2.0 | 41.0 | 58.0 |
challenge-179 | - | - | 2.5 |
challenge-180 | - | 2.0 | 1.0 |
challenge-181 | 1.0 | 4.0 | 33.0 |
challenge-182 | 6.5 | 65.0 | 52.0 |
challenge-183 | 1.0 | 1.0 | 2.0 |
challenge-184 | 2.0 | 28.0 | 19.0 |
challenge-185 | 2.0 | 9.0 | 5.0 |
challenge-186 | 4.0 | 52.0 | 42.0 |
challenge-187 | 1.0 | - | - |
challenge-188 | 4.0 | 43.5 | 17.5 |
challenge-189 | 3.0 | - | - |
challenge-190 | 9.0 | - | 35.0 |
challenge-191 | 2.0 | 12.0 | 7.0 |
challenge-192 | 2.0 | - | - |
challenge-193 | 3.0 | 30.5 | 25.0 |
challenge-194 | 1.0 | - | - |
challenge-195 | 3.5 | 8.0 | 4.0 |
challenge-196 | 1.0 | 1.0 | 15.5 |
challenge-197 | - | - | - |
challenge-198 | 2.0 | 19.0 | 13.0 |
challenge-199 | 2.0 | 6.0 | 1.0 |
challenge-200 | 2.0 | 53.0 | 6.0 |
challenge-201 | 5.0 | - | - |
challenge-202 | 1.0 | - | - |
challenge-203 | - | - | - |
challenge-204 | 2.0 | - | - |
challenge-205 | 1.0 | 23.0 | 6.5 |
challenge-206 | 1.0 | 1.5 | 1.0 |
challenge-207 | 1.0 | 2.0 | 7.0 |
challenge-208 | 3.0 | - | - |
challenge-209 | 1.0 | 23.0 | 6.0 |
challenge-210 | 1.5 | 2.0 | 9.0 |
challenge-211 | 4.0 | 56.5 | 15.0 |
challenge-212 | 1.5 | 5.0 | 14.0 |
challenge-213 | 1.0 | - | - |
challenge-214 | 2.0 | 10.5 | 10.0 |
challenge-215 | 3.0 | 6.5 | 1.0 |
challenge-216 | 1.0 | 22.5 | 25.5 |
challenge-217 | 2.5 | 3.5 | 4.0 |
challenge-218 | 1.0 | 2.5 | 3.5 |
challenge-219 | 1.0 | - | - |
challenge-220 | 4.5 | 4.5 | 3.5 |
challenge-221 | 1.0 | 6.5 | 5.0 |
challenge-222 | 1.0 | 5.5 | 6.0 |
challenge-223 | 1.0 | 1.0 | 1.0 |
challenge-224 | 2.0 | - | - |
challenge-225 | 2.0 | 12.0 | 5.0 |
challenge-226 | 1.0 | 1.0 | - |
challenge-227 | 1.0 | 1.0 | 20.0 |
challenge-228 | - | - | - |
challenge-229 | 3.0 | 33.0 | 11.0 |
challenge-230 | 1.0 | - | - |
challenge-231 | 1.0 | 1.0 | 1.0 |
challenge-232 | 1.0 | 1.0 | 2.0 |
challenge-233 | 1.0 | 3.0 | 1.0 |
challenge-234 | 4.0 | 65.0 | 70.0 |
challenge-235 | 4.0 | 30.5 | 14.5 |
challenge-236 | 1.0 | 2.0 | 5.5 |
challenge-237 | 1.0 | 1.5 | 1.0 |
challenge-238 | 1.0 | 1.0 | 1.0 |
challenge-239 | 2.0 | 6.0 | 1.0 |
challenge-240 | 1.0 | - | - |
challenge-241 | - | - | |
challenge-242 | 1.0 | 1.0 | 1.0 |
challenge-243 | - | 11.5 | 5.5 |
Participant information and abstracts
ParticipantID: TsugawaYamamoto
Category: category3
Authors: Tsugawa, Hiroshi (1,2) and Yamamoto, Hiroyuki (3)
Affiliations: (1) RIKEN Center for Sustainable Resource Science, 1-7-22 Suehiro-cho, Tsurumi-ku, Yokohama, Kanagawa 230-0045, Japan; (2) RIKEN Center for Integrative Medical Sciences, 1-7-22 Suehiro-cho, Tsurumi-ku, Yokohama, Kanagawa 230-0045, Japan; (3) Human Metabolome Technologies, Inc., 246-2 Mizukami, Kakuganji, Tsuruoka, Yamagata 997-0052, Japan
Automatic methods: yes

Abstract:
Our method is an integrated approach using MS-FINDER (internal version, i.e. under development) and Sirius 3.5 (build 3)/CSI:FingerID GUI. Hiroyuki Yamamoto manually predicted the structure candidates for the positive ion mode data with Sirius 3.5 (build 3) under the following conditions.

Sirius/CSI:FingerID parameters:
- Ion/adduct type: unknown positive
- Collision energy: 0 (not set)
- Parent mass: entered manually from the PEPMASS value recorded in the MGF file
- Search in: PubChem, search all

Hiroshi Tsugawa then used the MS-FINDER internal version to rank structure candidates for the positive and negative ion mode data with the following settings.

Search option: formula prediction and structure elucidation by the in silico fragmenter

Mass spectrum tab:
- Mass tolerance type: ppm
- Mass tolerance (MS1): 10 ppm (challenges 1-45) and 6 ppm (challenges 46-243)
- Mass tolerance (MS2): 20 ppm
- Relative abundance cut-off: 1%
- Mass range: 50-1000

Formula finder tab:
- LEWIS and SENIOR check: checked
- Isotopic ratio tolerance: 20%
- Elemental ratio check: 99.7% range
- Element probability check: unchecked
- Element selection: O, N, P, S
- Maximum report number: 5

Structure finder tab:
- Tree depth: 2
- Maximum report number: 1000

Data sources:
- Local databases: ChEBI, PlantCyc, KNApSAcK, FooDB, PubChem (biomolecules), UNPD, NANPDB
- MINEs: unchecked
- PubChem: unchecked

For the negative ion mode data, the MS-FINDER results were submitted, because CSI:FingerID did not yet support negative ion mode as far as Hiroyuki Yamamoto could determine. For the positive ion mode data, the results of Hiroyuki Yamamoto (CSI:FingerID) and Hiroshi Tsugawa (MS-FINDER) were combined by a 'voting' method: both programs ranked the structure candidates, the rank positions from the two programs were summed, and the candidates were re-ordered by the summed value, a smaller sum indicating a better candidate (see the sketch below).
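As an illustration of the 'voting' step, the following Python sketch combines two ranked candidate lists by summing rank positions. It is an assumed reconstruction, not the participants' actual code: the candidate identifiers are hypothetical, and the handling of candidates missing from one list and the tie-breaking are not specified in the abstract and are chosen here only to make the example runnable.

```python
# Assumed reconstruction of the rank-sum 'voting' described above, not the
# participants' actual code. Candidate identifiers, the penalty for candidates
# missing from one list, and the alphabetical tie-breaking are assumptions.

def vote(ranking_a, ranking_b):
    """Combine two ranked candidate lists (best candidate first) by summing rank positions."""
    rank_a = {cand: i + 1 for i, cand in enumerate(ranking_a)}
    rank_b = {cand: i + 1 for i, cand in enumerate(ranking_b)}
    # Candidates missing from one list get a rank one past the end of that list (assumption)
    penalty_a, penalty_b = len(ranking_a) + 1, len(ranking_b) + 1
    candidates = set(ranking_a) | set(ranking_b)
    summed = {c: rank_a.get(c, penalty_a) + rank_b.get(c, penalty_b) for c in candidates}
    # Smaller summed rank = better candidate; ties broken alphabetically for determinism
    return sorted(candidates, key=lambda c: (summed[c], c))

# Hypothetical candidate lists for one positive ion mode challenge
csi_fingerid = ["cand_A", "cand_B", "cand_C"]   # ranking from CSI:FingerID
ms_finder = ["cand_A", "cand_C", "cand_B"]      # ranking from MS-FINDER
print(vote(csi_fingerid, ms_finder))            # ['cand_A', 'cand_B', 'cand_C']
```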
ParticipantID: tkind1
Category: category3
Authors: Tobias Kind
Affiliations: UC Davis Genome Center, USA
Automatic pipeline: yes
Spectral libraries: no

Abstract:
We imported all 243 MS/MS challenge files into MS-Finder v2.14 (June 6, 2017). Other data import/export was handled with Excel VBA. Minor prior adjustments to adduct types and the number of peaks were made in Microsoft Excel. We applied the following settings in MS-Finder:
1) Methods: formula prediction and structure elucidation with the in-silico fragmenter: yes; no spectral database search; precursor option on
2) Mass spectrum: mass tolerance (MS1) 6 ppm and mass tolerance (MS2) 20 ppm; relative abundance cut-off 1%; mass range 50-1000 Da
3) Formula finder: LEWIS and SENIOR check on; isotope ratio tolerance 20%; element ratio 99.7%; element probability off; elements CHNSOPCl; minimum reports 100
4) Structure finder: tree depth 2; use the fragment library for EI (on); use the fragmentation library for low-energy CID (off); maximum report number 100
5) Data sources: for challenges 1-45, the UNPD (plant metabolite library) and the plant metabolite target library; for challenges 46-243, only the plant metabolite target library. All other libraries were off.
Compound annotation was performed in batch mode; processing time was 5:17 minutes (dual Xeon, 3.1 GHz). The 100 best-ranked compounds were exported as SMILES together with their scores. No additional MS/MS library search was performed. No additional post-curation was performed. MS-Finder can be obtained from http://prime.psc.riken.jp/Metabolomics_Software/MS-FINDER/index.html
ParticipantID: tkind2
Category: category3
Authors: Tobias Kind
Affiliations: UC Davis Genome Center, USA
Automatic pipeline: yes
Spectral libraries: no

Abstract:
Minor prior adjustments of precursor types, the number of peaks and other data were made in Microsoft Excel. Data import/export for MS-Finder was handled with Excel VBA. All 243 MS/MS spectral files were imported into MS-Finder v2.14 (June 6, 2017; http://prime.psc.riken.jp/Metabolomics_Software/MS-FINDER/index.html). We applied the following settings in MS-Finder:
1) Methods: formula prediction and structure elucidation with the in-silico fragmenter: yes; no spectral database search; precursor option on
2) Mass spectrum: mass tolerance (MS1) 6 ppm and mass tolerance (MS2) 20 ppm; relative abundance cut-off 1%; mass range 50-1000 Da
3) Formula finder: LEWIS and SENIOR check on; isotope ratio tolerance 20%; element ratio 99.7%; element probability off; elements CHNSOPCl; minimum reports 10
4) Structure finder: tree depth 2; use the fragment library for EI (on); use the fragmentation library for low-energy CID (off); maximum report number 100
5) Data sources: as target compound databases we searched the PlantCyc database, the UNPD plant metabolite library and the KNApSAcK plant DB.
Compound annotation was performed in batch mode; processing time was two days on a 16-core dual Xeon (3.1 GHz). The 100 best-ranked compounds were exported as SMILES together with their scores. No additional MS/MS library search or manual curation was performed.
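Both tkind pipelines end by exporting the 100 best-ranked candidates per challenge as SMILES with their scores. The sketch below shows how such an export could be turned into ranked candidate lists; the tab-separated input layout, the column names, and the file name msfinder_export.tsv are hypothetical and do not reflect the actual MS-Finder export format or the participants' Excel VBA workflow.

```python
import csv
from collections import defaultdict

# Hypothetical post-processing sketch: turn a per-candidate score export into
# ranked candidate lists, one per challenge. The tab-separated input layout
# (columns: challenge, SMILES, score) and the file name are assumptions, not
# the actual MS-Finder export format or the participants' Excel VBA workflow.

def read_scores(path):
    by_challenge = defaultdict(list)
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh, delimiter="\t"):
            by_challenge[row["challenge"]].append((row["SMILES"], float(row["score"])))
    return by_challenge

def rank_candidates(by_challenge, top_n=100):
    """Keep the top_n candidates per challenge, sorted by descending score."""
    return {
        challenge: sorted(cands, key=lambda c: c[1], reverse=True)[:top_n]
        for challenge, cands in by_challenge.items()
    }

if __name__ == "__main__":
    ranked = rank_candidates(read_scores("msfinder_export.tsv"))  # hypothetical file name
    for challenge, cands in sorted(ranked.items()):
        for rank, (smiles, score) in enumerate(cands, start=1):
            print(f"{challenge}\t{rank}\t{smiles}\t{score:.4f}")
```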