Dropout (Channel) on HIP and HOST #634
base: develop
Conversation
r-abishek commented on Nov 4, 2025:
- Adds Channel Dropout augmentation on HIP and HOST (a conceptual sketch of the operation follows below)
- Adds support for U8/F32/F16/I8 bit depths and NCHW/NHWC layout variants with toggle support
- Adds the relevant unit / QA / performance tests
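For reference, channel dropout zeroes out entire channels of the input tensor according to a per-channel mask. A minimal host-side sketch of the idea for the two layouts named above is given here; the function and mask names are illustrative only, not the RPP API.

```cpp
#include <cstdint>

// Illustrative sketch: zero whole channels of a single image, NCHW layout.
void channelDropoutNCHW(float *img, const uint8_t *mask, int c, int h, int w)
{
    for (int ch = 0; ch < c; ch++)
        if (!mask[ch])                          // 0 means drop this channel
            for (int i = 0; i < h * w; i++)
                img[ch * h * w + i] = 0.0f;     // channel plane is contiguous
}

// Same operation for NHWC, where channel values are interleaved per pixel.
void channelDropoutNHWC(float *img, const uint8_t *mask, int c, int h, int w)
{
    for (int i = 0; i < h * w; i++)
        for (int ch = 0; ch < c; ch++)
            if (!mask[ch])
                img[i * c + ch] = 0.0f;
}
```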
Commits:
- …all the bitdepths
- indentation modified
- … as constant for better understanding, added required comments
- …d the HOST backend double free fault
Codecov Report ❌
Patch coverage is …
Additional details and impacted files:

@@            Coverage Diff             @@
##           develop     #634     +/-   ##
===========================================
+ Coverage    88.16%   88.36%   +0.20%
===========================================
  Files          195      197       +2
  Lines        82723    83047     +324
===========================================
+ Hits         72932    73383     +451
+ Misses        9791     9664     -127
|
| rpp::Handle &handle) | ||
| { | ||
| if (roiType == RpptRoiType::LTRB) | ||
| hip_exec_roi_converison_ltrb_to_xywh(roiTensorPtrSrc, handle); |
Please fix the typo in the function name _converison_ -> _conversion_
This typo is prevalent across files; will raise a common fix PR for it. Thanks
Resolved
Why do we need a separate PR for fixing the typo?
@rrawther The same function call is used in many instances - all are fixed in #645 now.
Thanks @AryanSalmanpour
    dstPtrImage = dstPtr + batchCount * dstDescPtr->strides.nStride;

    uint8_t *maskPtr = scratchBuffer + batchCount * srcDescPtr->c;
    int seed = randomSeed ? std::random_device{}() : DROPOUT_FIXED_SEED; // Use a true random seed if requested, otherwise use the fixed seed for deterministic QA
All the randomization has to happen outside the kernel, in the calling function.
dropoutProbability and randomSeed have now been removed from arguments.
RPP won't be responsible for the randomization part.
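For illustration, the caller-side setup this implies might look like the following sketch; the helper name, the mask layout, and the idea of passing a precomputed mask buffer are assumptions here, not confirmed RPP API details.

```cpp
#include <cstdint>
#include <random>
#include <vector>

// Hypothetical caller-side helper: the application draws the per-channel drop
// decisions itself and hands the library only the resulting mask buffer.
std::vector<uint8_t> makeChannelMask(int batchSize, int channels,
                                     float dropoutProbability, unsigned seed)
{
    std::mt19937 gen(seed);                         // RNG lives in the caller, not in RPP
    std::bernoulli_distribution drop(dropoutProbability);
    std::vector<uint8_t> mask(batchSize * channels);
    for (auto &m : mask)
        m = drop(gen) ? 0 : 1;                      // 0 = drop the channel, 1 = keep it
    return mask;
}
```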
    #pragma omp parallel for num_threads(numThreads)
    for (int batchCount = 0; batchCount < dstDescPtr->n; batchCount++)
    {
        std::mt19937 gen(seed + batchCount);
As mentioned above, this needs to happen outside the kernel, in the caller.
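A sketch of what the batch loop could look like once the random draws are removed from the parallel region: each OpenMP iteration only reads a caller-supplied mask slice and makes no RNG calls of its own. The names mirror the snippet above but are illustrative, not the actual RPP kernel signature.

```cpp
#include <algorithm>
#include <cstdint>

// Illustrative loop body: consume a precomputed mask, no randomization inside.
void dropoutBatch(float **imagePtrs, const uint8_t *maskTensor, int batchSize,
                  int channels, int channelStride, int numThreads)
{
    #pragma omp parallel for num_threads(numThreads)
    for (int batchCount = 0; batchCount < batchSize; batchCount++)
    {
        const uint8_t *maskPtr = maskTensor + batchCount * channels;  // precomputed by the caller
        for (int c = 0; c < channels; c++)
            if (!maskPtr[c])                                          // 0 means drop this channel
                std::fill(imagePtrs[batchCount] + c * channelStride,
                          imagePtrs[batchCount] + (c + 1) * channelStride, 0.0f);
    }
}
```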
rrawther left a comment:
Added some comments
rrawther left a comment:
Waiting for the random dropout change before merge.
Channel Dropout: resolved review comments
@rrawther, the random dropout changes were made as discussed. Thanks