[WIP] KAFKA-19080 The constraint on segment.ms is not enforced at topic level #19371
Open

m1a2st wants to merge 47 commits into apache:trunk from m1a2st:KAFKA-19080.
Commits (47, all by m1a2st):

- `ef72eee` fix the topic level segment.bytes error
- `36e59c8` use spotlessApply
- `7a29f51` update test default value
- `1d66fb0` wip
- `60c01e9` Merge branch 'trunk' into KAFKA-19080
- `9398bae` revert test change
- `0f766c5` add new constructor for test
- `a0342ed` add escape annotation
- `f707465` fix some test
- `768c8ed` Revert "fix some test"
- `640ca6b` Revert "add escape annotation"
- `2bfd478` Revert "add new constructor for test"
- `51da598` fix all test
- `8c39276` fix some test
- `eefa646` fix MetadataLog Test
- `e2adfbe` fix MetadataLog Test
- `3d8e122` fix some test
- `303ef39` fix fail tests
- `3f1b757` Merge remote-tracking branch 'origin/KAFKA-19080' into KAFKA-19080
- `71dbb89` fix config def error
- `4c62f11` fix fail test
- `c847bbc` Merge branch 'trunk' into KAFKA-19080
- `ad36e75` Merge branch 'trunk' into KAFKA-19080
- `5e4fe46` Update LogCleaner file
- `9b9f469` change segmentSize modifier
- `4e7a080` Merge branch 'trunk' into KAFKA-19080
- `8b7d366` addressed by comments
- `5bbdf1f` addressed by comments
- `2a3db85` update the test
- `919365a` Merge branch 'trunk' into KAFKA-19080
- `a332072` remove unused import
- `b774d92` Merge branch 'trunk' into KAFKA-19080
- `6876739` move other internal config
- `34a794f` update KafkaMetadataLog apply flow
- `c10c9cb` fix compile error
- `ea981e3` fix fail test
- `934ac37` fix fail test
- `77898ed` revert unused change
- `1756fee` revert unused change
- `0bfec85` Merge branch 'trunk' into KAFKA-19080
- `1a3f737` temp
- `5761c62` Merge branch 'trunk' into KAFKA-19080
- `06131bf` Merge branch 'trunk' into KAFKA-19080
- `7483a5d` completed the feature
- `048e052` fix fail test
- `76bd287` fix fail test
- `ce644ae` addressed by comments
```diff
@@ -33,6 +33,7 @@ import org.apache.kafka.security.authorizer.AclEntry.{WILDCARD_HOST, WILDCARD_PR
 import org.apache.kafka.server.config.{DelegationTokenManagerConfigs, ServerConfigs, ServerLogConfigs}
 import org.apache.kafka.metadata.authorizer.StandardAuthorizer
 import org.apache.kafka.server.authorizer.{Authorizer => JAuthorizer}
+import org.apache.kafka.storage.internals.log.LogConfig
 import org.apache.kafka.test.TestUtils.assertFutureThrows
 import org.junit.jupiter.api.Assertions._
 import org.junit.jupiter.api.{AfterEach, BeforeEach, Test, TestInfo, Timeout}

@@ -566,7 +567,7 @@ class SaslSslAdminIntegrationTest extends BaseAdminIntegrationTest with SaslSetu
     client.createAcls(List(denyAcl).asJava, new CreateAclsOptions()).all().get()

     val topics = Seq(topic1, topic2)
-    val configsOverride = Map(TopicConfig.SEGMENT_BYTES_CONFIG -> "100000").asJava
+    val configsOverride = Map(LogConfig.INTERNAL_SEGMENT_BYTES_CONFIG -> "100000").asJava

     val newTopics = Seq(
       new NewTopic(topic1, 2, 3.toShort).configs(configsOverride),
       new NewTopic(topic2, Option.empty[Integer].toJava, Option.empty[java.lang.Short].toJava).configs(configsOverride))

@@ -579,7 +580,7 @@ class SaslSslAdminIntegrationTest extends BaseAdminIntegrationTest with SaslSetu
     assertEquals(3, result.replicationFactor(topic1).get())
     val topicConfigs = result.config(topic1).get().entries.asScala
     assertTrue(topicConfigs.nonEmpty)
-    val segmentBytesConfig = topicConfigs.find(_.name == TopicConfig.SEGMENT_BYTES_CONFIG).get
+    val segmentBytesConfig = topicConfigs.find(_.name == LogConfig.INTERNAL_SEGMENT_BYTES_CONFIG).get
     assertEquals(100000, segmentBytesConfig.value.toLong)
     assertEquals(ConfigEntry.ConfigSource.DYNAMIC_TOPIC_CONFIG, segmentBytesConfig.source)
     val compressionConfig = topicConfigs.find(_.name == TopicConfig.COMPRESSION_TYPE_CONFIG).get
```

Review comment on this hunk: this test case is used to verify the custom topic-level config, so we can increase the value to make the test pass.
Review comment: Instead of having an `internalApply`, could we just use the existing `apply` and add `INTERNAL_SEGMENT_BYTES_CONFIG` to `MetadataLogConfig`?
Review comment: +1 to move the internal config to `MetadataLogConfig`. It would be better to wait for #19465, which extracts the metadata-related configs from other classes into `MetadataLogConfig`.
Review comment: I just realized that the metadata log uses a different approach to allow tests to use smaller segment bytes than allowed in production. That approach defines the original segment bytes config with a small minimum requirement, but adds `METADATA_LOG_SEGMENT_MIN_BYTES_CONFIG` to enforce the actual minimum requirement in production; tests can change this new config to allow a smaller minimum. The benefit of this approach is that the existing config can be used directly to set a smaller value for tests. The downside is that the documented min value is inaccurate and the validation is done through customized logic.

It would be useful to pick the same strategy for the metadata log and the regular log. The metadata log approach seems slightly better since it is less intrusive. We could fix the inaccurate min value description for production somehow.
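The metadata-log strategy described above can be sketched as a minimal, self-contained illustration. This is not Kafka's actual code; the class, method, and constant names (`SegmentConfigSketch`, `validateSegmentBytes`, the byte values) are all hypothetical stand-ins for the real config definitions:

```java
// Sketch of the metadata-log strategy: the public config declares only a tiny
// minimum so tests can set small segments, while a separate min-bytes config
// (lowered only in tests) enforces the real production floor at validation time.
public class SegmentConfigSketch {
    // Hypothetical constants standing in for the real config definitions.
    static final int ABSOLUTE_MIN_SEGMENT_BYTES = 12;                 // min declared on the public config
    static final int PRODUCTION_MIN_SEGMENT_BYTES = 8 * 1024 * 1024; // floor enforced via custom logic

    // Custom validation: segmentBytes comes from the public config,
    // minSegmentBytes from the min-bytes config (defaults to the production floor).
    static int validateSegmentBytes(int segmentBytes, int minSegmentBytes) {
        if (segmentBytes < ABSOLUTE_MIN_SEGMENT_BYTES || segmentBytes < minSegmentBytes) {
            throw new IllegalArgumentException(
                "segment.bytes must be at least " + Math.max(ABSOLUTE_MIN_SEGMENT_BYTES, minSegmentBytes));
        }
        return segmentBytes;
    }

    public static void main(String[] args) {
        // Production: the default floor rejects a small segment size.
        try {
            validateSegmentBytes(100_000, PRODUCTION_MIN_SEGMENT_BYTES);
            System.out.println("production small segment: accepted");
        } catch (IllegalArgumentException e) {
            System.out.println("production small segment: rejected");
        }
        // Test: the min-bytes config is lowered, so the same value passes.
        System.out.println("test small segment: " + validateSegmentBytes(100_000, 1024));
    }
}
```

This captures the downside mentioned above: the public config's declared minimum (12 bytes here) no longer matches the effective production minimum, which lives in the custom check.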
Review comment: Excuse me, the strategy used by the metadata log is to add an "internal" config (`METADATA_LOG_SEGMENT_MIN_BYTES_CONFIG`) to change the (metadata) segment size in testing, and that is what we want to address in this PR: we add an "internal" config for the regular log so that tests can use a smaller segment size.
Review comment: I am just saying that we now have two different ways to achieve the same goal. In the metadata log approach, you set the desired value through the original config (`segment.bytes`) and then set an internal config to change the min constraint. The approach in this PR is to set the desired value through a separate internal config. It would be useful to choose the same approach for both the metadata log and the regular log.
Review comment: Personally, I prefer the approach of adding an `internal.xxx` config, as it provides a better user experience for public configs, allowing users to see the "correct" min value. Additionally, we can remove the customized validation logic. In short, I suggest adding the following changes to this PR:

- Remove `METADATA_LOG_SEGMENT_MIN_BYTES_CONFIG` and `MetadataLogConfig#logSegmentMinBytes`.
- Add `internal.metadata.log.segment.bytes` and `MetadataLogConfig#logSegmentBytes`, as in the following code.
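A hedged sketch of the internal-override approach being advocated here: the public `segment.bytes` keeps its accurate production minimum, and a test-only internal config bypasses it when present. This is illustrative only, not the PR's actual code; `resolveSegmentBytes` and the constants are hypothetical:

```java
import java.util.Optional;

// Sketch of the internal-override strategy: the public segment.bytes config
// keeps its true production minimum (so its documentation stays accurate),
// and an internal config, settable only by tests, wins when present.
public class InternalOverrideSketch {
    static final int PRODUCTION_MIN_SEGMENT_BYTES = 1024 * 1024; // hypothetical floor

    // Hypothetical resolution helper: the internal override, when set, is
    // trusted and skips the minimum check; otherwise the public value is validated.
    static int resolveSegmentBytes(int publicSegmentBytes, Optional<Integer> internalOverride) {
        if (internalOverride.isPresent()) {
            return internalOverride.get(); // e.g. a tiny segment size set by a test
        }
        if (publicSegmentBytes < PRODUCTION_MIN_SEGMENT_BYTES) {
            throw new IllegalArgumentException(
                "segment.bytes must be at least " + PRODUCTION_MIN_SEGMENT_BYTES);
        }
        return publicSegmentBytes;
    }

    public static void main(String[] args) {
        // A test can pass a tiny segment size through the internal config...
        System.out.println(resolveSegmentBytes(PRODUCTION_MIN_SEGMENT_BYTES, Optional.of(100_000)));
        // ...while production users see and hit the accurate public minimum.
        System.out.println(resolveSegmentBytes(8 * 1024 * 1024, Optional.empty()));
    }
}
```

Compared with the min-bytes strategy sketched earlier in the thread, the declared minimum on the public config here matches what production actually enforces, which is the user-experience benefit this comment is arguing for.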
Review comment: Thanks for the comments, @chia7712 and @junrao. Addressed :)
Review comment: Hmm, why do we need `internalApply()`? `MetadataLogConfig` has both the production and internal segment bytes configs, and we could just pass both into `LogConfig` in `apply()`, right?
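The shape this last comment suggests, a single `apply()`-style factory that reads both keys instead of a separate `internalApply()`, might look roughly like this. All names here are illustrative stand-ins, not Kafka's actual `LogConfig` API:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a single apply()-style factory that accepts both the public and
// the internal segment-bytes keys, so no separate internalApply() is needed.
public class ApplySketch {
    static final String SEGMENT_BYTES = "segment.bytes";                   // public config (hypothetical key)
    static final String INTERNAL_SEGMENT_BYTES = "internal.segment.bytes"; // test-only override (hypothetical key)

    final int segmentBytes;

    ApplySketch(int segmentBytes) {
        this.segmentBytes = segmentBytes;
    }

    // One code path for production and tests: the internal key, when present,
    // takes precedence over the public key.
    static ApplySketch apply(Map<String, Object> props) {
        Object internal = props.get(INTERNAL_SEGMENT_BYTES);
        Object pub = props.getOrDefault(SEGMENT_BYTES, 1024 * 1024 * 1024);
        int value = (internal != null) ? (Integer) internal : (Integer) pub;
        return new ApplySketch(value);
    }

    public static void main(String[] args) {
        Map<String, Object> testProps = new HashMap<>();
        testProps.put(SEGMENT_BYTES, 1024 * 1024 * 1024);
        testProps.put(INTERNAL_SEGMENT_BYTES, 100_000);
        // The override flows through the ordinary apply() call.
        System.out.println(ApplySketch.apply(testProps).segmentBytes);
    }
}
```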