[application] namespace duplication in llama application @open sesame 03/07 15:57 #2989
Conversation
CodeQL found more than 20 potential problems in the proposed changes. Check the Files changed tab for more details.
Just a thought, but instead of using qualified names, i.e.

```cpp
namespace custom {
using namespace nntrainer;
/// ... rest mostly without changes
}
```

leaving this code with unqualified names should be sufficient, with little to none of the code churn.
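For illustration, a minimal sketch of what the suggestion amounts to (the member and type names here are placeholders, not the actual layer code):

```cpp
// Qualified version: every core symbol is spelled out, producing a large diff.
namespace custom {
class MultiHeadAttentionLayer : public nntrainer::LayerImpl {
  nntrainer::Tensor projection_weight;
  void forwarding(nntrainer::RunLayerContext &context, bool training);
};
} // namespace custom

// Suggested version: one using-directive, and the body stays unqualified
// exactly as it was before the code moved into the custom namespace.
namespace custom {
using namespace nntrainer;

class MultiHeadAttentionLayer : public LayerImpl {
  Tensor projection_weight;
  void forwarding(RunLayerContext &context, bool training);
};
} // namespace custom
```

The trade-off is that the using-directive pulls every nntrainer name into `custom`, so a symbol later added to `custom` with a clashing name would become ambiguous; for a thin wrapper namespace like this one, that is usually an acceptable cost for avoiding churn.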
force-pushed from 08b37d1 to 6d7f896
I applied it, thanks :)
Previously, `multi_head_attention_layer` was separated into a custom layer called `custom_multi_head_attention_layer` for refactoring purposes. At that time, the custom layer's header macro and namespace should have been changed, but they were copied over as-is, causing duplication. I've removed the header macro and renamed the namespace.

Even after resolving this problem, the llama application still does not work properly. It seems that several issues have arisen from the application refactoring and modifications to the nntrainer core. I will resolve these issues as soon as possible.

**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped

Signed-off-by: Seungbaek Hong <[email protected]>
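For context, a hypothetical sketch of the kind of duplication being described (the guard macro and class names are illustrative, not the exact file contents):

```cpp
// custom_multi_head_attention_layer.h, copied as-is from the core header:
#ifndef __MULTI_HEAD_ATTENTION_LAYER_H__     // same include guard as the
#define __MULTI_HEAD_ATTENTION_LAYER_H__     // nntrainer header, so whichever
                                             // file is included second is
namespace nntrainer {                        // silently skipped; same namespace,
class MultiHeadAttentionLayer { /* ... */ }; // so the definition collides with
} // namespace nntrainer                     // the core layer's
#endif

// After the fix: the copied guard macro is removed and the custom layer
// lives in its own namespace, so the two headers no longer shadow each other.
namespace custom {
class MultiHeadAttentionLayer { /* ... */ };
} // namespace custom
```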
force-pushed from 6d7f896 to 6aca1f8
LGTM
LGTM :)
LGTM
Previously, `multi_head_attention_layer` was separated into a custom layer called `custom_multi_head_attention_layer` for refactoring purposes. At that time, the namespace of the custom layer should have been changed, but it was copied over as-is, causing duplication. I've renamed the namespace.

This issue was discovered and reported by @gkisalapl. I always appreciate your efforts, thanks a lot!

Even after resolving this problem, the llama application still does not work properly.
I'll try to resolve this issue as soon as possible.
**Self evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [ ]Passed [ ]Failed [X]Skipped