Thank you very much for joining us at U-RISC. Based on feedback from participants and other experts, the organizers have decided to modify the evaluation metric. The new metric adopts two major changes:
1) the new evaluation metric no longer bolds (dilates) the edges of the ground truth and submissions, which increases the difficulty of the tasks;
2) in the Simple Track, the binarization order in the ground-truth evaluation has been changed, which lowers the probability of erroneous results.
After 02:30 Dec 25 (UTC), all new submissions will be evaluated with the new metric. We will also re-score all existing submissions within the next five working days.
The competition will end on Jan 15, 2020. The top 30 teams in each track will be required to submit their models for testing on our servers. We will run the models directly on the test set, which will not be released, on Jan 19.
The evaluation instructions for the Simple Track have been updated. Please check them for details.
In the Simple Track, the ground truth contains values other than 0 and 255.
This is because, when the dataset was made, image binarization was performed first and sampling via bilinear interpolation was conducted afterwards; in fact, nearest-neighbor interpolation should have been adopted as the sampling method. During evaluation, the ground truth is now binarized first (threshold value: 122) before scoring. We sincerely apologize for the mistake.
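As a minimal sketch of the corrected evaluation order, the ground truth can be thresholded before any comparison with a submission. The threshold value 122 comes from the announcement above; the NumPy array layout and the choice of 0 for membrane / 255 for background are assumptions for illustration.

```python
import numpy as np

def binarize(mask, threshold=122):
    """Binarize a grayscale ground-truth mask before evaluation.

    Pixels above the threshold become 255 (white background);
    the rest become 0 (black membrane). This resolves intermediate
    values introduced by bilinear interpolation.
    """
    mask = np.asarray(mask)
    return np.where(mask > threshold, 255, 0).astype(np.uint8)

# Interpolation artifacts such as 60 or 130 snap back to 0 or 255:
gt = np.array([[0, 60, 130, 255]], dtype=np.uint8)
print(binarize(gt).tolist())  # [[0, 0, 255, 255]]
```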
Regarding the Submission Format for biendata “Ultra-high Resolution EM Images Segmentation Challenge”
Thank you very much for joining the competition. To improve the accuracy of the evaluation process, we have updated the submission format requirements as follows:
1. Submitted files must have a white background, with black pixels indicating the cell membrane;
2. Contestants need to choose a proper threshold value for image binarization and upload files in “.tiff” format. Reference compression code is available at https://www.biendata.com/models/category/2953/L_notebook/;
3. Submitted files must have the same filenames as the original dataset files, e.g., “0164_1_1565791505_58.tiff” for “0164_1_1565791505_58.png”.
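The three requirements above can be sketched in one helper: binarize a prediction, keep white as background and black as membrane, and save a compressed TIFF under the original filename. This is not the official reference code linked above; the threshold default, the assumption that higher values mean "membrane" in the input map, and the use of Pillow's deflate compression are all illustrative choices.

```python
import numpy as np
from pathlib import Path
from PIL import Image

def save_submission(prob_map, dataset_name, out_dir=".", threshold=0.5):
    """Write one submission file in the required format.

    prob_map      : 2-D array where larger values mean "membrane"
                    (an assumption for this sketch).
    dataset_name  : original dataset filename, e.g.
                    '0164_1_1565791505_58.png' -> '0164_1_1565791505_58.tiff'.
    """
    # Membrane pixels become black (0), everything else white (255).
    binary = np.where(np.asarray(prob_map) > threshold, 0, 255).astype(np.uint8)
    out_path = Path(out_dir) / (Path(dataset_name).stem + ".tiff")
    # Deflate compression keeps the upload small; the official
    # reference notebook may use a different scheme.
    Image.fromarray(binary).save(out_path, format="TIFF",
                                 compression="tiff_deflate")
    return out_path
```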
Once again, thank you for your participation and cooperation. Good Luck!
For each given SEM image, participants need to return an image depicting the boundaries of all neurons. The image can be compressed with the code provided by the organizers, and its file name must match the corresponding unlabeled image. Before submission, all result images must be zipped into a single file named result.zip.
The evaluation metric is F-score.
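For reference, a pixel-wise F-score between a binary prediction and the binarized ground truth can be computed as below. Treating the black membrane pixels as the positive class and using beta = 1 (the standard F1) are assumptions about the metric's exact definition; the organizers' scoring code is authoritative.

```python
import numpy as np

def f_score(pred, gt, beta=1.0):
    """Pixel-wise F-score between two 0/255 uint8 masks.

    Black (0) marks the membrane and is treated as the positive
    class (an assumption for this sketch).
    """
    pred_pos = np.asarray(pred) == 0
    gt_pos = np.asarray(gt) == 0
    tp = np.logical_and(pred_pos, gt_pos).sum()
    precision = tp / max(pred_pos.sum(), 1)
    recall = tp / max(gt_pos.sum(), 1)
    if precision + recall == 0:
        return 0.0
    return (1 + beta**2) * precision * recall / (beta**2 * precision + recall)
```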