Implement log_loss Function and Add Comprehensive Test Coverage #28800

Draft

muzakkirhussain011 wants to merge 6 commits into ivy-llc:main
Conversation
`func_wrapper.py` is a Python module designed to streamline the integration of Hugging Face Transformers into your natural language processing (NLP) projects. It provides a set of input and output conversion wrappers to simplify passing data between your custom functions and Transformers' data structures.

Input Conversion Wrappers:
- `inputs_to_transformers_tensors`: converts input data (text, tensors, etc.) into Transformers-compatible data structures. It is particularly useful when your custom functions expect diverse input types.

Output Conversion Wrappers:
- `outputs_to_pytorch_tensors`: after your custom function returns data, this wrapper ensures that the output data is converted into PyTorch tensors or other appropriate formats.

Usage:
1. Import `func_wrapper.py` into your project.
2. Initialize a Hugging Face Transformers model and tokenizer.
3. Wrap your custom function with `to_transformers_tensors_and_back`. The wrapped function can now accept and return Transformers-compatible data.

Here's a simple example of how to use `func_wrapper.py`:

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer
from ivy.functional.frontends.transformers.func_wrapper import to_transformers_tensors_and_back

# Initialize the model and tokenizer
model_name = "bert-base-uncased"
model = BertForSequenceClassification.from_pretrained(model_name)
tokenizer = BertTokenizer.from_pretrained(model_name)

# Wrap your custom function using the conversion wrappers
wrapped_function = to_transformers_tensors_and_back(your_function, model, tokenizer)

# Prepare sample input data
sample_input_text = "This is a sample input text."
sample_input_tensor = torch.rand((3, 3))

# Call your wrapped function with the sample input data
output = wrapped_function(sample_input_text, sample_input_tensor)

# The output is automatically converted to PyTorch tensors
print(output)
```

Please note that `func_wrapper.py` is still in development, and further enhancements and refinements are expected. Your feedback and contributions to improve its functionality are welcome.
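The conversion-wrapper pattern described above can be sketched in a few lines. The following is a hypothetical, dependency-free illustration — the function names `to_tensors_and_back`, `toy_tokenizer`, and `toy_to_tensor` are stand-ins invented for this sketch, and the real `to_transformers_tensors_and_back` in `func_wrapper.py` may behave differently:

```python
import functools


def to_tensors_and_back(fn, tokenizer, to_tensor):
    """Hypothetical sketch of the conversion-wrapper pattern: tokenize
    string inputs, pass other inputs through unchanged, and convert the
    wrapped function's output with `to_tensor`."""
    @functools.wraps(fn)
    def wrapped(*args):
        converted = [tokenizer(a) if isinstance(a, str) else a for a in args]
        return to_tensor(fn(*converted))
    return wrapped


# Toy stand-ins so the sketch runs without torch/transformers installed;
# in practice these would be a Hugging Face tokenizer and torch.as_tensor.
def toy_tokenizer(text):
    return [len(word) for word in text.split()]


def toy_to_tensor(values):
    return tuple(values)


def add_offset(token_ids, offsets):
    # A "custom function" operating on already-converted inputs.
    return [t + sum(offsets) for t in token_ids]


wrapped = to_tensors_and_back(add_offset, toy_tokenizer, toy_to_tensor)
print(wrapped("a bb ccc", [1, 2]))  # -> (4, 5, 6)
```

The design point the sketch captures is that conversion happens at the boundary: the custom function never sees raw strings, and callers never see the function's internal output format.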
PR Description
This PR introduces the following changes:
Feature Implementation:
- `log_loss` function: adds a `log_loss` function to the metrics module.
  - Accepts an optional `sample_weight` input to handle weighted samples.
  - Checks that the `y_true` and `y_pred` inputs have the same shape, and includes validation for edge cases such as improper input shapes or invalid values.

Test Implementation:
- `test_sklearn_log_loss`: tests the `log_loss` function across various scenarios.
  - Generates random ground-truth labels (`y_true`) and corresponding predicted probabilities (`y_pred`).
  - Exercises the optional `sample_weight` argument and validates the function's behaviour across different backends (e.g., TensorFlow, PyTorch).
  - Compares the `log_loss` implementation with the reference implementation from `sklearn.metrics.log_loss` to ensure correctness.

Additional Notes:
- All Ivy functions used in the implementation (`ivy.clip`, `ivy.log`, `ivy.mean`, etc.) have been cross-checked and confirmed to be available within the Ivy framework.

Testing:
- Verified that the `log_loss` function performs as expected under various conditions.

Checklist
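As a rough illustration of the computation this PR describes, here is a pure-Python sketch of a weighted binary log loss following the clip → log → mean pattern mentioned in the notes. This is illustrative only — it is not the PR's actual Ivy implementation, and the clipping value `eps` is an assumption:

```python
import math


def log_loss(y_true, y_pred, sample_weight=None, eps=1e-15):
    """Illustrative weighted binary log loss (not the PR's actual code).

    Mirrors the described behaviour: shape validation, clipping
    probabilities, per-sample log loss, then a weighted mean.
    """
    if len(y_true) != len(y_pred):
        raise ValueError("y_true and y_pred must have the same shape")
    # Clip probabilities away from 0 and 1 so the logarithm stays finite
    clipped = [min(max(p, eps), 1.0 - eps) for p in y_pred]
    losses = [
        -(t * math.log(p) + (1 - t) * math.log(1 - p))
        for t, p in zip(y_true, clipped)
    ]
    # Weighted mean over samples; uniform weights when sample_weight is None
    weights = sample_weight if sample_weight is not None else [1.0] * len(losses)
    return sum(l * w for l, w in zip(losses, weights)) / sum(weights)


print(round(log_loss([1, 0, 1, 1], [0.9, 0.1, 0.8, 0.65]), 6))
```

In a test analogous to `test_sklearn_log_loss`, the same `y_true`, `y_pred`, and `sample_weight` inputs would be passed to `sklearn.metrics.log_loss` and the two results compared within a numerical tolerance.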