The measure of proximity between labeled polygons and proposed polygons is the Jaccard similarity, or "Intersection over Union (IoU)", defined as:

    IoU(A, B) = area(A ∩ B) / area(A ∪ B)

The value of IoU is between 0 and 1, where closer polygons have higher IoU values.

For example, suppose there are N polygon labels for building footprints that are considered ground truth, and suppose there are M polygons proposed by an entry in the SpaceNet competition. Let tp denote the number of true positives among the M proposed polygons. There is at most one "true positive" per labeled polygon.

The F1 score is the harmonic mean of precision and recall, combining the accuracy of the precision measure (precision = tp / M) with the completeness of the recall measure (recall = tp / N). For this competition, the numbers of true positives and false positives are aggregated over all of the test imagery, and the F1 score is computed from the aggregated counts. The F1 score is between 0 and 1, where larger numbers are better scores.

A few notes:

- Use the metric implementation code to self-evaluate.
- All proposed polygons should be legitimate: each should have a nonzero area, with points that form at least a triangle rather than a point or a line.
- The images provided can contain anywhere from zero to multiple buildings.

To run the metric you can use the following command:

```
python python/evaluateScene.py /path/to/SpaceNetTruthFile.csv \
```
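The scoring described above can be sketched in a few lines of Python. This is a minimal illustration, not the official implementation (`evaluateScene.py` is authoritative): the helper names are hypothetical, and axis-aligned boxes stand in for general polygons to keep the example dependency-free.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (xmin, ymin, xmax, ymax).

    The real metric operates on arbitrary polygons; boxes are used here
    only to keep this sketch free of geometry dependencies.
    """
    # Width and height of the intersection rectangle (0 if disjoint).
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    # area(A ∪ B) = area(A) + area(B) - area(A ∩ B)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1])
             - inter)
    return inter / union if union > 0 else 0.0

def f1_from_counts(tp, m_proposed, n_truth):
    """F1 from counts aggregated over all test imagery.

    precision = tp / M (proposed polygons), recall = tp / N (ground truth).
    """
    if tp == 0:
        return 0.0
    precision = tp / m_proposed
    recall = tp / n_truth
    return 2 * precision * recall / (precision + recall)

# Two overlapping unit-offset 2x2 boxes share a 1x1 intersection,
# so IoU = 1 / (4 + 4 - 1) = 1/7.
print(iou((0, 0, 2, 2), (1, 1, 3, 3)))

# 60 true positives out of 100 proposals against 80 truth polygons:
# precision = 0.6, recall = 0.75, F1 = 2/3.
print(f1_from_counts(60, 100, 80))
```

Note that because the counts are aggregated before F1 is computed, a single image with many missed buildings can outweigh many images scored perfectly, which is why per-image accuracy alone is not a good proxy for the final score.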