Remove SparkSession from class to prevent it being included in the serialization process #1460
Changes from all commits: b5ef154, b5b296b, b91527c, c7d36da, a74bb8d, 220405e, 24c4ddf, 4151a6b, ae12ec5, 1aa61c2, e2060e0
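The fix follows a common Spark pattern: a class whose instances may be shipped to executors should not hold the SparkSession as a field; the session is only used in a factory method to set up serializable state such as accumulators. Below is a minimal sketch of that pattern, with hypothetical class and field names, not the connector's actual implementation:

```java
import java.io.Serializable;

import org.apache.spark.sql.SparkSession;
import org.apache.spark.util.LongAccumulator;

// Hypothetical sketch of the pattern: the factory method receives the
// SparkSession, uses it to register an accumulator, and the resulting
// object keeps only serializable fields.
public class ReadMetricsSketch implements Serializable {

  private final String sessionName;          // plain serializable field
  private final LongAccumulator bytesRead;   // Spark accumulators are serializable

  private ReadMetricsSketch(String sessionName, LongAccumulator bytesRead) {
    this.sessionName = sessionName;
    this.bytesRead = bytesRead;
  }

  // The SparkSession is used only here and never stored as a field, so
  // serializing ReadMetricsSketch does not try to serialize the session.
  public static ReadMetricsSketch from(SparkSession spark, String sessionName) {
    LongAccumulator bytesRead = spark.sparkContext().longAccumulator("bytesRead");
    return new ReadMetricsSketch(sessionName, bytesRead);
  }

  public long getBytesRead() {
    return bytesRead.value();
  }
}
```

The test added in this PR exercises exactly that property: the metrics object created via `from(spark, …)` is serialized and deserialized without the SparkSession coming along.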
The PR adds a serialization round-trip test for SparkBigQueryReadSessionMetrics:

```diff
@@ -63,4 +63,30 @@ public void testReadSessionMetricsAccumulator() {
     metrics.incrementScanTimeAccumulator(5000);
     assertThat(metrics.getScanTime()).isEqualTo(6000);
   }
+
+  @Test
+  public void testSerialization() throws Exception {
+    String sessionName = "projects/test-project/locations/us/sessions/testSession";
+    SparkBigQueryReadSessionMetrics metrics =
+        SparkBigQueryReadSessionMetrics.from(
+            spark,
+            ReadSession.newBuilder().setName(sessionName).build(),
+            10L,
+            DataFormat.ARROW,
+            DataOrigin.QUERY,
+            10L);
+
+    java.io.ByteArrayOutputStream bos = new java.io.ByteArrayOutputStream();
+    java.io.ObjectOutputStream out = new java.io.ObjectOutputStream(bos);
+    out.writeObject(metrics);
+    out.close();
+
+    java.io.ByteArrayInputStream bis = new java.io.ByteArrayInputStream(bos.toByteArray());
+    java.io.ObjectInputStream in = new java.io.ObjectInputStream(bis);
+    SparkBigQueryReadSessionMetrics deserializedMetrics =
+        (SparkBigQueryReadSessionMetrics) in.readObject();
```
Comment on lines +84 to +87 (Contributor):
The …
Suggested change: …
```diff
+
+    assertThat(deserializedMetrics.getNumReadStreams()).isEqualTo(10L);
+    assertThat(deserializedMetrics.getBytesRead()).isEqualTo(0);
+  }
 }
```

Comment on the new assertions (Contributor):
To make the test more thorough, it would be good to also assert that the other metric accumulators (getRowsRead, getParseTime, getScanTime) hold their expected values after deserialization:

```java
assertThat(deserializedMetrics.getBytesRead()).isEqualTo(0);
assertThat(deserializedMetrics.getRowsRead()).isEqualTo(0);
assertThat(deserializedMetrics.getParseTime()).isEqualTo(0);
assertThat(deserializedMetrics.getScanTime()).isEqualTo(0);
```
Comment:
To ensure resources are properly managed and to make the code more robust, it's recommended to use a try-with-resources statement for the ObjectOutputStream. This guarantees that the stream is closed even if an exception is thrown.
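A minimal sketch of what that cleanup could look like, using a generic round-trip helper with try-with-resources on both object streams; the helper class and method names here are hypothetical and not part of the PR:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

// Hypothetical test helper: serializes an object and reads it back, with
// try-with-resources so the streams are closed even if writeObject or
// readObject throws.
final class SerializationRoundTrip {

  @SuppressWarnings("unchecked")
  static <T> T roundTrip(T object) throws IOException, ClassNotFoundException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
      out.writeObject(object);
    }
    try (ObjectInputStream in =
        new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
      return (T) in.readObject();
    }
  }
}
```

In the test, this would collapse the two stream blocks into a single call, for example: `SparkBigQueryReadSessionMetrics deserializedMetrics = SerializationRoundTrip.roundTrip(metrics);`.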