evaluate.py

import dask
import dask.distributed
import sklearn.metrics as metrics
from sklearn.metrics import precision_recall_curve
import pickle
import conf

# Connect to a running Dask scheduler on the local machine.
client = dask.distributed.Client('localhost:8786')

MODEL_FILE = conf.model
TEST_MATRIX_FILE = conf.test_matrix
METRICS_FILE = conf.metrics_file


@dask.delayed
def workflow(model_file, test_matrix_file):
    # Load the trained model and the pickled test matrix.
    with open(model_file, 'rb') as fd:
        model = pickle.load(fd)
    with open(test_matrix_file, 'rb') as fd:
        matrix = pickle.load(fd)

    # Column 1 of the sparse test matrix holds the labels;
    # the remaining columns hold the features.
    labels = matrix[:, 1].toarray()
    x = matrix[:, 2:]

    # Score the positive class and compute the area under the
    # precision-recall curve.
    predictions_by_class = model.predict_proba(x)
    predictions = predictions_by_class[:, 1]
    precision, recall, thresholds = precision_recall_curve(labels, predictions)
    auc = metrics.auc(recall, precision)
    return auc


auc = workflow(MODEL_FILE, TEST_MATRIX_FILE).compute()
print('AUC={}'.format(auc))
with open(METRICS_FILE, 'w') as fd:
    fd.write('AUC: {:.4f}\n'.format(auc))
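Because workflow() is decorated with dask.delayed, calling it only builds a lazy task; the evaluation runs on the connected scheduler when .compute() is called. The file paths come from a local conf module that is not shown here; a minimal sketch of what it might contain (the path values below are hypothetical, only the attribute names are taken from the script) would be:

# conf.py -- hypothetical sketch; the real module defines the project's actual paths.
model = 'model.pkl'
test_matrix = 'test_matrix.pkl'
metrics_file = 'metrics.txt'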