HADOOP-19864. Cut WritableRPCEngine #8433

Open
steveloughran wants to merge 4 commits into apache:trunk from steveloughran:pr/HADOOP-19864-WritableRPCEngine
Conversation

@steveloughran
Contributor

Description of PR

Purge WritableRPCEngine from the code.

There are hints of its existence in the code (RpcWritable and ClientCache), but it's not an RPC mechanism any more.
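With WritableRPCEngine gone, every protocol needs an engine registered explicitly (in Hadoop this is done with something like RPC.setProtocolEngine and ProtobufRpcEngine2). A minimal, Hadoop-free sketch of that registry pattern — class and method names here are illustrative stand-ins, not the real implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for Hadoop's per-protocol RPC engine registry.
// Mirrors the shape of RPC.setProtocolEngine / RPC.getProtocolEngine only.
public class EngineRegistry {
    private final Map<Class<?>, String> engines = new HashMap<>();

    /** Register an engine for a protocol, as callers must now do explicitly. */
    public void setProtocolEngine(Class<?> protocol, String engineName) {
        engines.put(protocol, engineName);
    }

    /** Look up the engine; fails fast if none was configured. */
    public String getProtocolEngine(Class<?> protocol) {
        String engine = engines.get(protocol);
        if (engine == null) {
            throw new IllegalStateException(
                "No RPC engine configured for " + protocol.getName());
        }
        return engine;
    }

    public static void main(String[] args) {
        EngineRegistry registry = new EngineRegistry();
        // Runnable stands in for a real protocol interface here.
        registry.setProtocolEngine(Runnable.class, "ProtobufRpcEngine2");
        System.out.println(registry.getProtocolEngine(Runnable.class));
    }
}
```

The point of the fail-fast lookup is that, with the legacy engine removed, there is no silent default to fall back on.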

How was this patch tested?

Tests of the immediately affected methods; the remaining verification is left to Yetus.

For code changes:

  • Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
  • Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
  • If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under ASF 2.0?
  • If applicable, have you updated the LICENSE, LICENSE-binary, NOTICE-binary files?

AI Tooling

If an AI tool was used:

RpcWritable still exists, and ClientCache uses it,
but those are vestigial uses.

Contains content generated by GitHub Copilot
@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 8m 39s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 0s codespell was not available.
+0 🆗 detsecrets 0m 0s detect-secrets was not available.
+0 🆗 buf 0m 0s buf was not available.
+0 🆗 buf 0m 0s buf was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
-1 ❌ test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
_ trunk Compile Tests _
+1 💚 mvninstall 27m 20s trunk passed
+1 💚 compile 8m 31s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 compile 8m 44s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 checkstyle 0m 51s trunk passed
+1 💚 mvnsite 1m 8s trunk passed
+1 💚 javadoc 0m 51s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 0m 52s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 1m 45s trunk passed
+1 💚 shadedclient 16m 48s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 44s the patch passed
+1 💚 compile 8m 7s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 cc 8m 7s the patch passed
+1 💚 javac 8m 7s the patch passed
+1 💚 compile 8m 48s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 cc 8m 48s the patch passed
+1 💚 javac 8m 48s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
-0 ⚠️ checkstyle 0m 49s /results-checkstyle-hadoop-common-project_hadoop-common.txt hadoop-common-project/hadoop-common: The patch generated 2 new + 323 unchanged - 28 fixed = 325 total (was 351)
+1 💚 mvnsite 1m 10s the patch passed
+1 💚 javadoc 0m 50s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 0m 54s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 1m 52s the patch passed
+1 💚 shadedclient 16m 52s patch has no errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 18m 21s /patch-unit-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
+1 💚 asflicense 0m 43s The patch does not generate ASF License warnings.
136m 0s
Reason Tests
Failed junit tests hadoop.ipc.TestMiniRPCBenchmark
Subsystem Report/Notes
Docker ClientAPI=1.54 ServerAPI=1.54 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/1/artifact/out/Dockerfile
GITHUB PR #8433
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets cc buflint bufcompat
uname Linux 4bf5c3a3303b 5.15.0-171-generic #181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 498f4dd
Default Java Ubuntu-17.0.18+8-Ubuntu-124.04.1
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.10+7-Ubuntu-124.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.18+8-Ubuntu-124.04.1
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/1/testReport/
Max. process+thread count 1306 (vs. ulimit of 10000)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/1/console
versions git=2.43.0 maven=3.9.11 spotbugs=4.9.7
Powered by Apache Yetus 0.14.1 https://yetus.apache.org

This message was automatically generated.

@hadoop-yetus

💔 -1 overall

Vote Subsystem Runtime Logfile Comment
+0 🆗 reexec 0m 23s Docker mode activated.
_ Prechecks _
+1 💚 dupname 0m 0s No case conflicting files found.
+0 🆗 codespell 0m 1s codespell was not available.
+0 🆗 detsecrets 0m 1s detect-secrets was not available.
+0 🆗 buf 0m 1s buf was not available.
+0 🆗 buf 0m 1s buf was not available.
+1 💚 @author 0m 0s The patch does not contain any @author tags.
-1 ❌ test4tests 0m 0s The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch.
_ trunk Compile Tests _
+1 💚 mvninstall 32m 25s trunk passed
+1 💚 compile 10m 37s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 compile 11m 15s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 checkstyle 0m 52s trunk passed
+1 💚 mvnsite 1m 12s trunk passed
+1 💚 javadoc 0m 48s trunk passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 0m 46s trunk passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 1m 48s trunk passed
+1 💚 shadedclient 22m 38s branch has no errors when building and testing our client artifacts.
_ Patch Compile Tests _
+1 💚 mvninstall 0m 46s the patch passed
+1 💚 compile 11m 0s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 cc 11m 0s the patch passed
+1 💚 javac 11m 0s the patch passed
+1 💚 compile 11m 43s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 cc 11m 43s the patch passed
+1 💚 javac 11m 43s the patch passed
+1 💚 blanks 0m 0s The patch has no blanks issues.
+1 💚 checkstyle 0m 53s hadoop-common-project/hadoop-common: The patch generated 0 new + 323 unchanged - 28 fixed = 323 total (was 351)
+1 💚 mvnsite 1m 16s the patch passed
+1 💚 javadoc 0m 46s the patch passed with JDK Ubuntu-21.0.10+7-Ubuntu-124.04
+1 💚 javadoc 0m 45s the patch passed with JDK Ubuntu-17.0.18+8-Ubuntu-124.04.1
+1 💚 spotbugs 2m 0s the patch passed
-1 ❌ shadedclient 7m 14s patch has errors when building and testing our client artifacts.
_ Other Tests _
-1 ❌ unit 0m 32s /patch-unit-hadoop-common-project_hadoop-common.txt hadoop-common in the patch failed.
+0 🆗 asflicense 0m 33s ASF License check generated no output?
121m 16s
Subsystem Report/Notes
Docker ClientAPI=1.54 ServerAPI=1.54 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/2/artifact/out/Dockerfile
GITHUB PR #8433
Optional Tests dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets cc buflint bufcompat
uname Linux 4cd6d5ed90de 5.15.0-171-generic #181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux
Build tool maven
Personality dev-support/bin/hadoop.sh
git revision trunk / 8603e30
Default Java Ubuntu-17.0.18+8-Ubuntu-124.04.1
Multi-JDK versions /usr/lib/jvm/java-21-openjdk-amd64:Ubuntu-21.0.10+7-Ubuntu-124.04 /usr/lib/jvm/java-17-openjdk-amd64:Ubuntu-17.0.18+8-Ubuntu-124.04.1
Test Results https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/2/testReport/
Max. process+thread count 610 (vs. ulimit of 10000)
modules C: hadoop-common-project/hadoop-common U: hadoop-common-project/hadoop-common
Console output https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/2/console
versions git=2.43.0 maven=3.9.11 spotbugs=4.9.7
Powered by Apache Yetus 0.14.1 https://yetus.apache.org

This message was automatically generated.

@steveloughran
Contributor Author

The shaded build failure was a VM OOM.

I'm going to do a slightly modified PR which touches HDFS, MR and YARN out of diligence.

// note anything marked public is solely for access by SaslRpcClient
/**
 * Marshalling support, for hadoop shaded protobuf and legacy
 * protobuf 2.5.
 */
Member

@pan3793 pan3793 Apr 19, 2026


do you have a plan to cut protobuf 2.5 thoroughly?

The protobuf 3-based ProtobufRpcEngine2 was introduced in HADOOP-17046 (3.3.0, 6 years ago), and Hadoop 3.2 has reached EOL. I think we have already given downstream projects enough time to migrate from protobuf 2.5 to the Hadoop shaded protobuf 3.x seamlessly.

Contributor Author


Good point. The issue was always HBase 1. Now that we are cleaning up, getting rid of obsolete RPC stuff is good. Someone should ping the HBase dev list, though.

Member


https://issues.apache.org/jira/browse/HBASE-27286

The 1.x releases are now fully EOLed.

@steveloughran
Contributor Author

The failures are legitimate:

org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeakLegacy


java.lang.IllegalStateException: No RPC engine configured for org.apache.hadoop.hdfs.protocol.ClientDatanodeProtocol
	at org.apache.hadoop.util.Preconditions.checkState(Preconditions.java:298)
	at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:227)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:677)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:642)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:544)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:520)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak(TestBlockToken.java:400)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeakLegacy(TestBlockToken.java:435)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)

org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeakProtobuf


java.lang.IllegalStateException: No RPC engine configured for org.apache.hadoop.hdfs.protocol.ClientDatanodeProtocol
	at org.apache.hadoop.util.Preconditions.checkState(Preconditions.java:298)
	at org.apache.hadoop.ipc.RPC.getProtocolEngine(RPC.java:227)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:677)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:642)
	at org.apache.hadoop.ipc.RPC.getProtocolProxy(RPC.java:544)
	at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:520)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak(TestBlockToken.java:400)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeakProtobuf(TestBlockToken.java:440)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
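The stack traces above end at a Preconditions.checkState call in RPC.getProtocolEngine: with WritableRpcEngine purged, a protocol with no configured engine now fails fast instead of falling back. A minimal, Hadoop-free reconstruction of that check — the helper and message format are assumptions modelled on the exception text, not the real Hadoop code:

```java
import java.util.Map;

// Hypothetical mirror of the checkState guard at the top of the stack trace.
public class EngineLookup {
    // Simplified stand-in for org.apache.hadoop.util.Preconditions.checkState.
    static void checkState(boolean ok, String msgTemplate, Object arg) {
        if (!ok) {
            throw new IllegalStateException(String.format(msgTemplate, arg));
        }
    }

    // Lookup fails fast when no engine is mapped for the protocol name.
    static String getProtocolEngine(Map<String, String> engines, String protocol) {
        String engine = engines.get(protocol);
        checkState(engine != null, "No RPC engine configured for %s", protocol);
        return engine;
    }

    public static void main(String[] args) {
        try {
            // An empty map reproduces the failure mode seen in the tests.
            getProtocolEngine(Map.of(), "ExampleProtocol");
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the fix for the two TestBlockToken failures is for the test (or its setup) to register an engine for ClientDatanodeProtocol explicitly before building the proxy.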

The final failure is less obvious:

Failed: org.apache.hadoop.hdfs.server.datanode.TestDataNodeLifeline.testSendLifelineIfHeartbeatBlocked

Failing for the past 1 build (since https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-8433/3/); took 5.5 sec.
Error Message
Could not change property dfs.datanode.data.dir from '' to '/home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-8433/ubuntu-noble/src/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data/data-new'
Stacktrace
org.apache.hadoop.conf.ReconfigurationException: Could not change property dfs.datanode.data.dir from '' to '/home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-8433/ubuntu-noble/src/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data/data-new'
	at org.apache.hadoop.hdfs.server.datanode.DataNode.reconfigurePropertyImpl(DataNode.java:665)
	at org.apache.hadoop.hdfs.server.datanode.TestDataNodeLifeline.testSendLifelineIfHeartbeatBlocked(TestDataNodeLifeline.java:205)
	at java.base/java.lang.reflect.Method.invoke(Method.java:569)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
	at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
Caused by: java.io.IOException: FAILED to ADD: [DISK]file:/home/jenkins/jenkins-home/workspace/hadoop-multibranch_PR-8433/ubuntu-noble/src/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data/data-new: java.util.concurrent.ExecutionException: java.lang.IndexOutOfBoundsException: Index 0 out of bounds for length 0

	at org.apache.hadoop.hdfs.server.datanode.DataNode.refreshVolumes(DataNode.java:1362)
	at org.apache.hadoop.hdfs.server.datanode.DataNode.reconfigurePropertyImpl(DataNode.java:647)
	... 4 more

But if heartbeats were failing, that could be the trigger.
