
Commit 69d8ece

cnauroth authored and LuciferYang committed
[SPARK-51408][YARN][TESTS] AmIpFilterSuite#testProxyUpdate fails in some networks
### What changes were proposed in this pull request?

While verifying Spark 4.0.0 RC2, I consistently saw YARN test `AmIpFilterSuite#testProxyUpdate` failing in my environment. The test is written to eventually expect a `ServletException` from `getProxyAddresses` after 5 seconds of retries, but I never received this exception.

This test and the corresponding `AmIpFilter` were introduced in [SPARK-48238](https://issues.apache.org/jira/browse/SPARK-48238) as a fork of the Hadoop implementation to resolve a dependency conflict. However, it seems this test had a small bug in the way it was adapted into the Spark codebase. The `AmIpFilter#getProxyAddresses()` logic may either return an empty set or throw a `ServletException` if it can't find any valid configured proxies. The Hadoop test's assertion allows for either of these conditions:

https://github.com/apache/hadoop/blob/rel/release-3.4.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-web-proxy/src/test/java/org/apache/hadoop/yarn/server/webproxy/amfilter/TestAmFilter.java#L212-L222

```
// waiting for configuration update
GenericTestUtils.waitFor(new Supplier<Boolean>() {
  @Override
  public Boolean get() {
    try {
      return filter.getProxyAddresses().isEmpty();
    } catch (ServletException e) {
      return true;
    }
  }
}, 500, updateInterval);
```

The version in Spark strictly requires an exception to be thrown:

https://github.com/apache/spark/blob/v4.0.0-rc2/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/AmIpFilterSuite.scala#L163-L168

```
// waiting for configuration update
eventually(timeout(5.seconds), interval(500.millis)) {
  assertThrows[ServletException] {
    filter.getProxyAddresses.isEmpty
  }
}
```

The test involves updating the proxy configuration to use "unknownhost" as an invalid proxy. In my network, there is actually a host named "unknownhost", but it only has an IPv6 address, and I only have an IPv4 address. This causes a "network unreachable" error instead of "unknown host", resulting in an empty set instead of an exception. This pull request changes the Spark test to be consistent with the Hadoop test, allowing either condition to succeed.

### Why are the changes needed?

Maintain consistency with the intent of the original Hadoop test and ensure it can pass in any network setup.

### Does this PR introduce _any_ user-facing change?

No. The changes are in tests only.

### How was this patch tested?

Existing tests pass in my environment after this change.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #50173 from cnauroth/SPARK-51408.

Authored-by: Chris Nauroth <[email protected]>
Signed-off-by: yangjie01 <[email protected]>
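To see why the strict assertion can hang rather than fail fast, here is a minimal sketch (not Spark code) of the name-resolution step, assuming the forked filter resolves proxy hosts roughly the way the Hadoop original does, via `InetAddress.getAllByName`:

```scala
import java.net.{InetAddress, UnknownHostException}

// Sketch of the lookup behavior described above. On some networks a host
// named "unknownhost" actually resolves (for example, to an IPv6-only
// record), so the UnknownHostException branch, assumed here to be the only
// path that ends in a ServletException, is never taken.
def hostResolves(host: String): Boolean =
  try {
    InetAddress.getAllByName(host).nonEmpty
  } catch {
    case _: UnknownHostException => false
  }
```

Whether the expected exception is reachable therefore depends on local DNS, which is why the dual-outcome check (empty set or exception) is the portable assertion.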
1 parent db06293 · commit 69d8ece

File tree

1 file changed: +5 -3 lines changed


resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/AmIpFilterSuite.scala

+5 -3
```
@@ -153,7 +153,7 @@ class AmIpFilterSuite extends SparkFunSuite {
 
     // change proxy configurations
     params = new util.HashMap[String, String]
-    params.put(AmIpFilter.PROXY_HOSTS, "unknownhost")
+    params.put(AmIpFilter.PROXY_HOSTS, "unknownhostaf79d34c")
     params.put(AmIpFilter.PROXY_URI_BASES, proxyUri)
     conf = new DummyFilterConfig(params)
     filter.init(conf)
@@ -162,8 +162,10 @@ class AmIpFilterSuite extends SparkFunSuite {
     assert(!filter.getProxyAddresses.isEmpty)
     // waiting for configuration update
     eventually(timeout(5.seconds), interval(500.millis)) {
-      assertThrows[ServletException] {
-        filter.getProxyAddresses.isEmpty
+      try {
+        assert(filter.getProxyAddresses.isEmpty)
+      } catch {
+        case e: ServletException => // do nothing
       }
     }
   }
```
