When using this EvalConfig in tfma.run_model_analysis, everything runs successfully. Inspecting the result, there is plot data for the confusion matrix, yet when trying to plot the result using tfma.view.render_slicing_metrics, the confusion matrix does not show.
This happens in a Kubeflow pipeline.
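The original EvalConfig is not shown in the issue, but a minimal sketch of a multiclass config that produces confusion-matrix and calibration plot data might look like the following (the label key, file paths, and model setup are assumptions, not from the report):

```python
import tensorflow_model_analysis as tfma
from google.protobuf import text_format

# Hypothetical EvalConfig: label key and metric choices are assumed.
eval_config = text_format.Parse("""
  model_specs { label_key: "label" }
  metrics_specs {
    metrics { class_name: "MultiClassConfusionMatrixPlot" }
    metrics { class_name: "CalibrationPlot" }
  }
  slicing_specs {}
""", tfma.EvalConfig())

# Paths below are placeholders for illustration only.
eval_shared_model = tfma.default_eval_shared_model(
    eval_saved_model_path='/path/to/saved_model',
    eval_config=eval_config)

eval_result = tfma.run_model_analysis(
    eval_shared_model=eval_shared_model,
    eval_config=eval_config,
    data_location='/path/to/eval_data.tfrecord')
```

With a config like this, the returned eval result contains plot data alongside the sliced metrics.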
schmidt-jake changed the title from "Multiclass metrics and plots don't work / lack class names" to "Confusion matrices and calibration plots don't visualize" on Jun 30, 2020
schmidt-jake changed the title to "Confusion matrices and calibration plots don't visualize in kubeflow pipeline" on Jun 30, 2020
"The plots are visualized using tfma.view.render_plot."
Ahh, I didn't know this. It seems to work now. However, the multiclass confusion matrix is virtually useless when it just displays integer class IDs (#78). Is there a way to add a list of class name strings in my EvalConfig or something?
Added a comment in #78. The short answer is that class labels are not yet supported. We are trying to avoid making this part of the config and instead infer it automatically.