
Commit fec5e2a

[SPARK-51408][YARN][TESTS] AmIpFilterSuite#testProxyUpdate fails in some networks
### What changes were proposed in this pull request?

While verifying Spark 4.0.0 RC2, I consistently saw the YARN test `AmIpFilterSuite#testProxyUpdate` failing in my environment. The test is written to eventually expect a `ServletException` from `getProxyAddresses` after 5 seconds of retries, but I never received this exception.

This test and the corresponding `AmIpFilter` were introduced in [SPARK-48238](https://issues.apache.org/jira/browse/SPARK-48238) as a fork of the Hadoop implementation to resolve a dependency conflict. However, it seems this test picked up a small bug in the way it was adapted into the Spark codebase. The `AmIpFilter#getProxyAddresses()` logic may either return an empty set or throw a `ServletException` if it can't find any valid configured proxies. The Hadoop test's assertion allows for either of these conditions: https://github.com/apache/hadoop/blob/rel/release-3.4.0/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-web-proxy/src/test/java/org/apache/hadoop/yarn/server/webproxy/amfilter/TestAmFilter.java#L212-L222

```java
// waiting for configuration update
GenericTestUtils.waitFor(new Supplier<Boolean>() {
  @Override
  public Boolean get() {
    try {
      return filter.getProxyAddresses().isEmpty();
    } catch (ServletException e) {
      return true;
    }
  }
}, 500, updateInterval);
```

The version in Spark strictly requires an exception to be thrown: https://github.com/apache/spark/blob/v4.0.0-rc2/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/AmIpFilterSuite.scala#L163-L168

```scala
// waiting for configuration update
eventually(timeout(5.seconds), interval(500.millis)) {
  assertThrows[ServletException] {
    filter.getProxyAddresses.isEmpty
  }
}
```

The test updates the proxy configuration to use "unknownhost" as an invalid proxy. In my network, there is actually a host named "unknownhost", but it only has an IPv6 address, while I only have an IPv4 address. This causes a "network unreachable" error instead of "unknown host", resulting in an empty set instead of an exception. This pull request changes the Spark test to be consistent with the Hadoop test, allowing either condition to succeed.

### Why are the changes needed?

To maintain consistency with the intent of the original Hadoop test and to ensure it can pass in any network setup.

### Does this PR introduce _any_ user-facing change?

No. The changes are in tests only.

### How was this patch tested?

Existing tests pass in my environment after this change.

### Was this patch authored or co-authored using generative AI tooling?

No.
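For readers reproducing the failure mode, here is a minimal probe (not part of the patch, assuming only the JDK; `unknownhost` is the same placeholder name the test uses) illustrating why the exception path can be skipped: if the local resolver actually answers for that name, `InetAddress.getAllByName` returns addresses instead of throwing `UnknownHostException`, even when the records are IPv6-only and unreachable from an IPv4-only client.

```scala
import java.net.{InetAddress, UnknownHostException}

// Probe how the local resolver treats the test's placeholder host name.
object ResolveProbe {
  def main(args: Array[String]): Unit = {
    val host = args.headOption.getOrElse("unknownhost")
    try {
      // Succeeds whenever the resolver answers, even with IPv6-only records
      // that an IPv4-only client cannot actually reach later.
      InetAddress.getAllByName(host).foreach { addr =>
        println(s"$host resolved to ${addr.getHostAddress}")
      }
    } catch {
      case _: UnknownHostException =>
        // The resolution failure the original Spark assertion relied on.
        println(s"$host does not resolve")
    }
  }
}
```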
1 parent a30bdc3 · commit fec5e2a

File tree

1 file changed (+3, −1 lines changed)

resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/AmIpFilterSuite.scala

+3 −1

```diff
@@ -162,8 +162,10 @@ class AmIpFilterSuite extends SparkFunSuite {
     assert(!filter.getProxyAddresses.isEmpty)
     // waiting for configuration update
     eventually(timeout(5.seconds), interval(500.millis)) {
-      assertThrows[ServletException] {
+      try {
         filter.getProxyAddresses.isEmpty
+      } catch {
+        case e: ServletException => true
       }
     }
   }
```
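For context, a self-contained sketch of the "either condition" wait that the patch mirrors from Hadoop's `GenericTestUtils.waitFor`. The `ServletException` class and `getProxyAddresses` stub below are stand-ins for illustration, not the real `AmIpFilter` API:

```scala
object ProxyUpdateWaitSketch {
  // Stand-in exception type, so the sketch needs no servlet dependency.
  final class ServletException(msg: String) extends Exception(msg)

  // Stand-in: throws once the configured proxies become unresolvable,
  // or returns an empty set when no valid proxy is left.
  def getProxyAddresses(): Set[String] =
    throw new ServletException("Could not locate any of the proxy hosts")

  // Poll `condition` every `intervalMs` until it holds or `timeoutMs` passes,
  // the same shape as Hadoop's GenericTestUtils.waitFor.
  def waitFor(condition: () => Boolean, intervalMs: Long, timeoutMs: Long): Boolean = {
    val deadline = System.nanoTime() + timeoutMs * 1000000L
    while (System.nanoTime() < deadline) {
      if (condition()) return true
      Thread.sleep(intervalMs)
    }
    false
  }

  def main(args: Array[String]): Unit = {
    // Succeed on EITHER an empty proxy set OR a ServletException,
    // matching the Hadoop test and the patched Spark test.
    val updated = waitFor(
      () => try getProxyAddresses().isEmpty
            catch { case _: ServletException => true },
      intervalMs = 500,
      timeoutMs = 5000)
    assert(updated, "proxy configuration was never invalidated")
  }
}
```

The key design point is that name-resolution failure is treated as just one of two acceptable outcomes, so the test no longer depends on how a particular network resolves the placeholder host.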
