danieldux committed on
Commit
78950cf
1 Parent(s): 19a950c

Add ISCO-08 Hierarchical Accuracy Measure metric


This commit adds a new metric called ISCO-08 Hierarchical Accuracy Measure. The metric calculates hierarchical precision, recall, and F1 given a list of reference codes and a list of predicted codes from the ISCO-08 taxonomy. It also includes the functions needed to download and prepare the ISCO-08 CSV file from the ILO website.
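
Hierarchical precision, recall, and F1 are commonly defined over the predicted and reference codes extended with all of their ancestors in the taxonomy. The commit message does not spell out the exact formulas this metric uses, so the sketch below only illustrates that common set-based definition; the `extend_with_ancestors` helper (deriving ancestors from ISCO-08 code prefixes) is hypothetical and not part of the commit.

```python
# Sketch of hierarchical precision/recall/F1 over ancestor-extended code sets.
# Assumes ISCO-08 codes nest by prefix (major > sub-major > minor > unit group).

def extend_with_ancestors(code: str) -> set[str]:
    """Return the code together with all of its ISCO-08 ancestor prefixes."""
    return {code[:i] for i in range(1, len(code) + 1)}

def hierarchical_scores(references: list[str], predictions: list[str]) -> dict[str, float]:
    """Micro-averaged hierarchical precision, recall, and F1."""
    overlap = pred_total = ref_total = 0
    for ref, pred in zip(references, predictions):
        ref_set = extend_with_ancestors(ref)
        pred_set = extend_with_ancestors(pred)
        overlap += len(ref_set & pred_set)
        pred_total += len(pred_set)
        ref_total += len(ref_set)
    precision = overlap / pred_total if pred_total else 0.0
    recall = overlap / ref_total if ref_total else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {
        "hierarchical_precision": precision,
        "hierarchical_recall": recall,
        "hierarchical_f1": f1,
    }
```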

The metric is implemented as a class called ISCO_Hierarchical_Accuracy, which inherits from the evaluate.Metric class. It provides a _compute method to calculate the accuracy scores and a _download_and_prepare method to download the ISCO-08 CSV and build the hierarchy dictionary.
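
A minimal sketch of what such a subclass might look like, assuming the standard evaluate.Metric hooks (_info, _download_and_prepare, _compute); the feature names, description text, and implementation details below are placeholders, not the actual contents of isco_hierarchical_accuracy.py.

```python
import datasets
import evaluate

class ISCO_Hierarchical_Accuracy(evaluate.Metric):
    def _info(self):
        # Declare the input columns that compute() expects.
        # Feature names here are assumptions, not taken from the commit.
        return evaluate.MetricInfo(
            description="Hierarchical precision, recall, and F1 for ISCO-08 codes.",
            citation="",
            features=datasets.Features(
                {"predictions": datasets.Value("string"), "references": datasets.Value("string")}
            ),
        )

    def _download_and_prepare(self, dl_manager):
        # Placeholder: download the ISCO-08 CSV from the ILO website and
        # build a hierarchy dictionary mapping each code to its ancestors.
        self.isco_hierarchy = {}

    def _compute(self, predictions, references):
        # Placeholder: aggregate hierarchical precision/recall/F1 over all
        # (prediction, reference) pairs, e.g. with the ancestor-set approach
        # sketched above.
        return {
            "hierarchical_precision": 0.0,
            "hierarchical_recall": 0.0,
            "hierarchical_f1": 0.0,
        }
```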

The commit also includes the necessary imports, descriptions, and examples for the metric.

The ISCO-08 Hierarchical Accuracy Measure metric can be used to evaluate the accuracy of predictions in the ISCO-08 classification scheme.

The changes are contained in the isco_hierarchical_accuracy.py file.

metric_template_1.py → isco_hierarchical_accuracy.py RENAMED
@@ -53,7 +53,7 @@ Returns:
 Examples:
     Example 1
 
-    >>> hierarchical_accuracy_metric = evaluate.load("ham")
+    >>> ham = evaluate.load("danieldux/isco_hierarchical_accuracy")
     >>> results = ham.compute(reference=["1111", "1112", "1113", "1114"], predictions=["1111", "1113", "1120", "1211"])
     >>> print(results)
     {