
Conversation

@r-abishek
Member

  • Adds Channel Dropout augmentation on HIP and HOST (see the conceptual sketch below)
  • Adds support for U8/F32/F16/I8 bit depths and NCHW/NHWC variants with toggle support
  • Adds relevant unit / qa / performance tests
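
For readers unfamiliar with the augmentation, here is a minimal conceptual sketch of channel dropout on a single packed NCHW float image. It is illustrative only and does not reflect the actual RPP API or kernel code; the function name, buffer layout, and mask handling are assumptions.

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustrative only: zero out entire channels of one packed NCHW float image.
// dropMask[c] == 1 means channel c is dropped; how the mask is generated
// (probability, seeding) is left to the caller.
void channelDropoutNCHW(float *img, size_t channels, size_t height, size_t width,
                        const std::vector<uint8_t> &dropMask)
{
    size_t plane = height * width;
    for (size_t c = 0; c < channels; c++)
        if (dropMask[c])
            std::fill(img + c * plane, img + (c + 1) * plane, 0.0f);
}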

snehaa8 and others added 30 commits August 22, 2025 07:39
… as constant for better understanding, added required comments
@codecov

codecov bot commented Nov 14, 2025

Codecov Report

❌ Patch coverage is 99.68354% with 1 line in your changes missing coverage. Please review.

Files with missing lines                              Patch %    Lines
src/modules/tensor/hip/kernel/channel_dropout.cpp     98.46%     1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #634      +/-   ##
===========================================
+ Coverage    88.16%   88.36%   +0.20%     
===========================================
  Files          195      197       +2     
  Lines        82723    83047     +324     
===========================================
+ Hits         72932    73383     +451     
+ Misses        9791     9664     -127     
Files with missing lines                                   Coverage            Δ
api/rppdefs.h                                              71.72% <ø>          (ø)
src/include/common/hip/rpp_hip_load_store.hpp              100.00% <ø>         (ø)
src/modules/tensor/cpu/kernel/channel_dropout.cpp          100.00% <100.00%>   (ø)
...dules/tensor/rppt_tensor_effects_augmentations.cpp      96.18% <100.00%>    (+1.51%) ⬆️
src/modules/tensor/hip/kernel/channel_dropout.cpp          98.46% <98.46%>     (ø)

... and 3 files with indirect coverage changes


rpp::Handle &handle)
{
    if (roiType == RpptRoiType::LTRB)
        hip_exec_roi_converison_ltrb_to_xywh(roiTensorPtrSrc, handle);
Member

Please fix the typo in the function name _converison_ -> _conversion_

Contributor

This issue is prevalent across files. Will issue a common fix PR for the same. Thanks

Contributor

Resolved

Contributor

Why do we need a separate PR for fixing the typo?

Member Author

@r-abishek Dec 3, 2025

@rrawther The same function call is used in many instances - all are fixed in #645 now.
Thanks @AryanSalmanpour
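
For context on the call itself, the LTRB-to-XYWH ROI conversion that the (misspelled) helper performs amounts to the following. This is a host-side illustration assuming an inclusive right/bottom convention; the actual HIP kernel and struct names may differ.

// Illustrative host-side equivalent of converting an ROI from
// (left, top, right, bottom) to (x, y, width, height).
// Assumes right/bottom are inclusive; the real kernel's convention may differ.
struct RoiLtrb { int l, t, r, b; };
struct RoiXywh { int x, y, w, h; };

RoiXywh ltrbToXywh(const RoiLtrb &roi)
{
    return { roi.l, roi.t, roi.r - roi.l + 1, roi.b - roi.t + 1 };
}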

dstPtrImage = dstPtr + batchCount * dstDescPtr->strides.nStride;

uint8_t *maskPtr = scratchBuffer + batchCount * srcDescPtr->c;
int seed = randomSeed ? std::random_device{}() : DROPOUT_FIXED_SEED; // Use a true random seed if requested, otherwise use the fixed seed for deterministic QA
Contributor

All the randomization has to happen outside the kernel, in the calling function.

Member Author

dropoutProbability and randomSeed have now been removed from arguments.
RPP won't be responsible for the randomization part.
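
A minimal sketch of how the caller might now own the randomization, generating a per-image channel mask up front and handing it to the library; the function name and mask layout here are assumptions for illustration, not the final RPP API.

#include <cstdint>
#include <random>
#include <vector>

// Illustrative caller-side mask generation: one byte per channel per image,
// 1 = drop that channel. The seed and drop probability now live with the caller,
// so the library side stays deterministic.
std::vector<uint8_t> makeChannelMask(int batchSize, int channels,
                                     float dropProbability, uint32_t seed)
{
    std::vector<uint8_t> mask(static_cast<size_t>(batchSize) * channels);
    std::mt19937 gen(seed);
    std::bernoulli_distribution drop(dropProbability);
    for (auto &m : mask)
        m = drop(gen) ? 1 : 0;
    return mask;
}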

#pragma omp parallel for num_threads(numThreads)
for (int batchCount = 0; batchCount < dstDescPtr->n; batchCount++)
{
    std::mt19937 gen(seed + batchCount);
Contributor

As mentioned above, this needs to happen outside the kernel, in the caller.

Contributor

@rrawther left a comment

Added some comments

Contributor

@rrawther left a comment

waiting for the random dropout change before merge

@Srihari-mcw
Contributor

@rrawther, the random dropout changes were made as discussed. Thanks
