
Fixed: #12233 #12249

Closed

Conversation

vedprakash226

I have added logic that restricts the evaluation of the log function if y_true is 0.

  • Add an algorithm?
  • Fix a bug or typo in an existing algorithm?
  • Add or change doctests? -- Note: Please avoid changing both code and tests in a single pull request.
  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms include at least one URL that points to Wikipedia or another similar explanation.
  • If this pull request resolves one or more open issues then the description above includes the issue number(s) with a closing keyword: "Fixes #12233" (Avoid log(0) in KL divergence).

@algorithms-keeper bot added the "tests are failing" (Do not merge until tests pass) label on Oct 23, 2024
@kevin1kevin1k

  • Note that y_true is an array, so comparing it to 0 might not be a good idea here, and there are nonzero entries that should be calculated instead of just returning 0 (see the sketch after this list).
  • I think this is a duplicate of #12237 (Avoid log(0) in KL divergence).
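For illustration, here is a minimal sketch, not the PR's code, of the element-wise handling described above: entries where y_true is 0 contribute 0 to the sum by convention, and only the nonzero entries go through np.log. The helper name kl_divergence_nonzero_only is hypothetical.

```python
import numpy as np


def kl_divergence_nonzero_only(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Hypothetical sketch: sum y_true * log(y_true / y_pred) over nonzero y_true."""
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")
    mask = y_true != 0  # boolean mask; zero entries contribute 0 by convention
    kl_loss = y_true[mask] * np.log(y_true[mask] / y_pred[mask])
    return float(np.sum(kl_loss))


print(kl_divergence_nonzero_only(np.array([0.0, 0.3, 0.7]), np.array([0.2, 0.3, 0.5])))
# ~0.2355, computed only from the two nonzero entries of y_true
```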

@imSanko (Contributor) left a comment


[Invalid] There is already a PR for this.

@kevin1kevin1k left a comment


Please see my comment in the conversation

Note that y_true is an array so comparing it to 0 might not be a good idea here, and there are nonzero entries that should be calculated instead of just returning 0.

This causes
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
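For context, a minimal reproduction (assumed, not taken from the CI logs) of why `if y_true != 0:` fails: the element-wise comparison yields a boolean array, and `if` cannot decide the truth value of an array with more than one element.

```python
import numpy as np

y_true = np.array([0.0, 0.3, 0.7])
print(y_true != 0)  # [False  True  True] -- an array, not a single bool

try:
    if y_true != 0:  # ambiguous: which element should decide the branch?
        pass
except ValueError as err:
    print(err)  # The truth value of an array with more than one element is ambiguous...
```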

@kevin1kevin1k

@anandfresh I do not think the PR should be approved, as it is not correct and does not resolve what the issue intends.

@anandfresh

@anandfresh I do not think the PR should be approved, as it is not correct and does not resolve what the issue intends.

@kevin1kevin1k - Yes, my bad! I didn't see that it is an array variable. Ideally he has to check whether the array length is greater than 0.

@@ -659,7 +659,10 @@ def kullback_leibler_divergence(y_true: np.ndarray, y_pred: np.ndarray) -> float
     if len(y_true) != len(y_pred):
         raise ValueError("Input arrays must have the same length.")

-    kl_loss = y_true * np.log(y_true / y_pred)
+    kl_loss = 0
+    if y_true != 0:


Shouldn't you check whether the length of the y_true array is greater than 0?

@cclauss (Member) commented Nov 1, 2024

Closing tests_are_failing PRs to prepare for Hacktoberfest

@cclauss closed this Nov 1, 2024
Labels: tests are failing (Do not merge until tests pass)
Projects: None yet
Development: Successfully merging this pull request may close these issues: Avoid log(0) in KL divergence
5 participants