Add metric argument to confusion_matrix() #84
base: main
Conversation
Reviewer's Guide

The change introduces a new metric argument to audplot.confusion_matrix, updating its signature and documentation, and uses the provided metric callable instead of hardcoding audmetric.confusion_matrix.

Sequence diagram for metric argument usage in confusion_matrix():

    sequenceDiagram
        participant User
        participant confusion_matrix
        participant MetricFunction
        User->>confusion_matrix: Call confusion_matrix(truth, prediction, metric=custom_metric)
        confusion_matrix->>MetricFunction: metric(truth, prediction, labels, normalize)
        MetricFunction-->>confusion_matrix: Return confusion matrix
        confusion_matrix-->>User: Return plot
Class diagram for updated confusion_matrix() function:

    classDiagram
        class confusion_matrix {
            +labels: Sequence = None
            +label_aliases: dict = None
            +metric: Callable = audmetric.confusion_matrix
            +percentage: bool = False
            +show_both: bool = False
            +ax: matplotlib.axes.Axes = None
            +figsize: tuple = None
            +font_size: int = None
            +cmap: str = None
            +vmin: float = None
            +vmax: float = None
            +title: str = None
            +xlabel: str = None
            +ylabel: str = None
            +xtick_rotation: int = None
            +ytick_rotation: int = None
            +colorbar: bool = True
            +kwargs
            +confusion_matrix(truth, prediction, labels, label_aliases, metric, percentage, show_both, ax, ...)
        }
        confusion_matrix --> "metric: Callable" MetricFunction
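The diagrams above describe a dependency-injection pattern: the plotting function accepts any callable with the signature metric(truth, prediction, labels, normalize) and invokes it in place of the previously hardcoded audmetric.confusion_matrix call. A minimal self-contained sketch of this pattern (the function bodies and the default implementation here are illustrative stand-ins, not audplot's actual code):

```python
def count_confusion_matrix(truth, prediction, labels, normalize=False):
    # Illustrative stand-in for audmetric.confusion_matrix:
    # rows are truth labels, columns are predicted labels.
    index = {label: i for i, label in enumerate(labels)}
    cm = [[0] * len(labels) for _ in labels]
    for t, p in zip(truth, prediction):
        cm[index[t]][index[p]] += 1
    if normalize:
        cm = [
            [c / sum(row) if sum(row) else 0.0 for c in row]
            for row in cm
        ]
    return cm


def confusion_matrix(
    truth,
    prediction,
    *,
    labels=None,
    metric=count_confusion_matrix,  # callable is injected, not hardcoded
    normalize=False,
):
    # Any callable matching metric(truth, prediction, labels, normalize)
    # can be plugged in, e.g. an event-based confusion matrix.
    if labels is None:
        labels = sorted(set(truth) | set(prediction))
    return metric(truth, prediction, labels, normalize)
```

Swapping in a different algorithm then only requires passing it as metric=..., with no change to the plotting code itself.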
Hey there - I've reviewed your changes and they look great!
Please address the comments from this code review:
## Individual Comments
### Comment 1
<location> `audplot/core/api.py:178` </location>
<code_context>
labels = audmetric.utils.infer_labels(truth, prediction)
- cm = audmetric.confusion_matrix(
+ cm = metric(
truth,
prediction,
</code_context>
<issue_to_address>
**issue:** Passing None for normalize may not be supported by all metric callables.
Custom metric functions may not accept None for the normalize argument. Please standardize this value or document the requirement for custom metrics to handle None appropriately.
</issue_to_address>
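One way to address the comment above (a sketch of a possible fix, not audplot's actual code) is an adapter that coerces normalize=None to False before forwarding the call, so custom metrics that expect a strict bool keep working. The helper name adapt_metric and the toy strict_metric are hypothetical:

```python
def adapt_metric(metric):
    # Hypothetical adapter: coerce normalize=None to False so that
    # metric callables expecting a strict bool still work when the
    # caller passes None.
    def wrapped(truth, prediction, labels, normalize):
        return metric(truth, prediction, labels, normalize=bool(normalize))
    return wrapped


def strict_metric(truth, prediction, labels, *, normalize):
    # Toy metric that, like some custom callables, rejects None.
    if not isinstance(normalize, bool):
        raise TypeError("normalize must be a bool")
    # Return a 1x1 "matrix" counting exact matches, just for illustration.
    return [[sum(t == p for t, p in zip(truth, prediction))]]
```

The alternative raised in the comment, documenting that custom metrics must accept None, avoids the wrapper but pushes the burden onto every metric author.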
Codecov Report: ✅ All modified and coverable lines are covered by tests.
Add support to select a different metric for calculating the confusion matrix in audplot.confusion_matrix(). This will allow us to easily plot results of other algorithms, e.g. audmetric.event_confusion_matrix() as introduced in audeering/audmetric#77.

Summary by Sourcery
Add a metric argument to confusion_matrix() for using alternative metrics instead of the default audmetric.confusion_matrix