Support Record types in XStream2 #26354
subwaycookiecrunch wants to merge 7 commits into jenkinsci:master from
Conversation
Yay, your first pull request towards Jenkins core was created successfully! Thank you so much!
Hey folks! Pushing this up to address the record deserialization issue. I spent some time looking into how XStream 1.4.x handles records natively versus how Jenkins forces its own RobustReflectionConverter on everything. Rather than ripping out the existing Robust converter and trying to cleanly delegate back to XStream's native record converter (which gets messy fast), I decided to inject a targeted bypass purely for record types right inside RobustReflectionConverter. A few key implementation details to make review easier:
I ran a dedicated test harness locally against Java 17 records to verify the reflection mapping reconstitutes perfectly without hitting the UnsupportedOperationException.
Jenkins core requires Java 21 for compilation, so there is no need for that hack.
1d783ec to f03431e (Compare)
Ah, my bad! I was being overly cautious, trying to keep it compatible with older Java 11 compilers just in case. Since core is fully on Java 21 now, I've pushed an update to rip out all the messy reflection hacks and replace them with the standard type.isRecord() and java.lang.reflect.RecordComponent API calls. Much cleaner this way, thanks for catching that!
Thanks very much. Could you include that test in the Jenkins tests so that we have one or more tests that verify deserialization of records?
Pull request overview
This PR adds support for unmarshaling Java record types in XStream2's RobustReflectionConverter to fix issue #26077. Currently, records fail with UnsupportedOperationException when XStream tries to use sun.misc.Unsafe to inject fields, which Java blocks for immutable record types.
Changes:
- Added record type detection in the unmarshal method using type.isRecord()
- Implemented a new unmarshalRecord method that deserializes records via their canonical constructor instead of field injection
- Parses XML attributes and child elements into a map, then constructs the record using reflection to call the canonical constructor
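To make the canonical-constructor idea concrete, here is a minimal, self-contained sketch using only JDK reflection. The names (RecordConstructSketch, Point, construct) are illustrative placeholders, not the PR's actual code; the real converter resolves values from XML via its Mapper rather than a prebuilt map.

```java
import java.lang.reflect.Constructor;
import java.lang.reflect.RecordComponent;
import java.util.Map;

public class RecordConstructSketch {
    // Hypothetical record standing in for a deserialization target.
    record Point(int x, String label) {}

    // Map field values onto the canonical constructor's parameters,
    // in record-component order, then invoke it.
    static Object construct(Class<?> type, Map<String, Object> values) throws Exception {
        RecordComponent[] components = type.getRecordComponents();
        Class<?>[] paramTypes = new Class<?>[components.length];
        Object[] args = new Object[components.length];
        for (int i = 0; i < components.length; i++) {
            paramTypes[i] = components[i].getType();
            args[i] = values.get(components[i].getName());
        }
        Constructor<?> ctor = type.getDeclaredConstructor(paramTypes);
        ctor.setAccessible(true);
        return ctor.newInstance(args); // reflection auto-unboxes Integer -> int
    }

    public static void main(String[] args) throws Exception {
        Object p = construct(Point.class, Map.<String, Object>of("x", 42, "label", "origin"));
        System.out.println(p); // Point[x=42, label=origin]
    }
}
```

Because records get value-based equals for free, a reconstructed instance compares equal to a directly constructed one, which is what makes round-trip tests straightforward.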
    result = doUnmarshal(result, reader, context);
    return serializationMethodInvoker.callReadResolve(result);
    }
The unmarshalRecord method lacks JavaDoc documentation. Given that this is a significant new private method handling a distinct code path for records, it should include documentation explaining its purpose, parameters, return value, and any exceptions it may throw. This would be particularly helpful for maintainability, as the method implements complex deserialization logic that differs from the standard doUnmarshal path.
    /**
     * Unmarshals a Java {@code record} instance from the current position of the
     * {@link HierarchicalStreamReader}.
     * <p>
     * This method implements a custom deserialization path for records that differs
     * from the standard {@link #doUnmarshal(Object, HierarchicalStreamReader, UnmarshallingContext)}
     * flow. It reads XML attributes, resolves them to record components using the
     * configured {@link Mapper}, converts the values with appropriate
     * {@link SingleValueConverter}s, and collects the results for later record
     * instantiation.
     *
     * @param reader
     *         the XML reader positioned at the element representing the record to unmarshal
     * @param context
     *         the XStream unmarshalling context used to resolve types and converters
     * @param type
     *         the target record {@link Class} to be instantiated
     * @return a partially constructed representation of the record state that will be
     *         used to create the final record instance
     * @throws ConversionException
     *         if an attribute value cannot be converted to the corresponding record
     *         component type
     */
Yeah makes sense, I'll add a doc comment for it since the deserialization path for records is different enough from doUnmarshal. Will push.
    while (reader.hasMoreChildren()) {
        reader.moveDown();
        try {
            String fieldName = mapper.realMember(type, reader.getNodeName());
            Field field = reflectionProvider.getFieldOrNull(type, fieldName);
            if (field != null) {
                Class fieldType = field.getType();
                Class xmlType = mapper.defaultImplementationOf(fieldType);
                String classAttribute = reader.getAttribute(mapper.aliasForAttribute("class"));
                if (classAttribute != null) {
                    Class specifiedType = mapper.realClass(classAttribute);
                    if (fieldType.isAssignableFrom(specifiedType)) {
                        xmlType = specifiedType;
                    }
                }
                Object value = unmarshalField(context, null, xmlType, field);
                if (value != null && !xmlType.isAssignableFrom(value.getClass())) {
                    LOGGER.warning("Cannot convert type " + value.getClass().getName() + " to type " + xmlType.getName());
                } else {
                    values.put(fieldName, value);
                }
            } else {
                Class itemType = mapper.getItemTypeForItemFieldName(type, fieldName);
                Class xmlType = itemType != null ? itemType : mapper.realClass(reader.getNodeName());
                context.convertAnother(null, xmlType);
            }
        } catch (CriticalXStreamException | InputManipulationException e) {
            throw e;
        } catch (XStreamException | LinkageError e) {
            addErrorInContext(context, e);
        }
        reader.moveUp();
    }
The unmarshalRecord method does not support implicit collections, which are handled in doUnmarshal at lines 417-418 and 432-461. If a record uses XStream's implicit collection feature (defined via mapper.getImplicitCollectionDefForFieldName), the fields will fail to deserialize correctly. While implicit collections may be less common with records, this omission creates an inconsistency in behavior between records and regular classes that could lead to unexpected deserialization failures.
Records are immutable; implicit collections work by mutating a field on an already constructed object, which you can't do with records. So there's no way to support them without fundamentally changing how records work. I think this is fine since the two concepts are just incompatible.
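As a self-contained illustration of that incompatibility (Holder and tryMutate are hypothetical names, not Jenkins code): the JDK rejects reflective writes to record fields, so the mutate-after-construct pattern behind implicit collections has nothing to hook into.

```java
import java.lang.reflect.Field;
import java.util.List;

public class RecordMutationDemo {
    // Hypothetical record standing in for one holding a collection.
    record Holder(List<String> items) {}

    // Returns true only if the record field could be reflectively overwritten.
    static boolean tryMutate(Holder h) {
        try {
            Field f = Holder.class.getDeclaredField("items");
            f.setAccessible(true);       // enabling access may work for reads...
            f.set(h, List.of("a", "b")); // ...but the JDK rejects writes to record fields
            return true;
        } catch (ReflectiveOperationException | RuntimeException e) {
            return false; // write was blocked
        }
    }

    public static void main(String[] args) {
        System.out.println(tryMutate(new Holder(List.of("a")))); // false
    }
}
```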
    private Object unmarshalRecord(final HierarchicalStreamReader reader, final UnmarshallingContext context, Class type) {
        Map<String, Object> values = new HashMap<>();

        Iterator it = reader.getAttributeNames();
        while (it.hasNext()) {
            String attrAlias = (String) it.next();
            String attrName = mapper.attributeForAlias(attrAlias);
            Field field = reflectionProvider.getFieldOrNull(type, attrName);
            if (field != null) {
                SingleValueConverter converter = mapper.getConverterFromAttribute(field.getDeclaringClass(), attrName, field.getType());
                Class fieldType = field.getType();
                if (converter == null) {
                    converter = mapper.getConverterFromItemType(fieldType);
                }
                if (converter != null) {
                    Object value = converter.fromString(reader.getAttribute(attrAlias));
                    if (fieldType.isPrimitive()) {
                        fieldType = Primitives.box(fieldType);
                    }
                    if (value != null && !fieldType.isAssignableFrom(value.getClass())) {
                        throw new ConversionException("Cannot convert type " + value.getClass().getName() + " to type " + fieldType.getName());
                    }
                    values.put(attrName, value);
                }
            }
        }

        while (reader.hasMoreChildren()) {
            reader.moveDown();
            try {
                String fieldName = mapper.realMember(type, reader.getNodeName());
                Field field = reflectionProvider.getFieldOrNull(type, fieldName);
                if (field != null) {
                    Class fieldType = field.getType();
                    Class xmlType = mapper.defaultImplementationOf(fieldType);
                    String classAttribute = reader.getAttribute(mapper.aliasForAttribute("class"));
                    if (classAttribute != null) {
                        Class specifiedType = mapper.realClass(classAttribute);
                        if (fieldType.isAssignableFrom(specifiedType)) {
                            xmlType = specifiedType;
                        }
                    }
                    Object value = unmarshalField(context, null, xmlType, field);
                    if (value != null && !xmlType.isAssignableFrom(value.getClass())) {
                        LOGGER.warning("Cannot convert type " + value.getClass().getName() + " to type " + xmlType.getName());
                    } else {
                        values.put(fieldName, value);
                    }
                } else {
                    Class itemType = mapper.getItemTypeForItemFieldName(type, fieldName);
                    Class xmlType = itemType != null ? itemType : mapper.realClass(reader.getNodeName());
                    context.convertAnother(null, xmlType);
                }
            } catch (CriticalXStreamException | InputManipulationException e) {
                throw e;
            } catch (XStreamException | LinkageError e) {
                addErrorInContext(context, e);
            }
            reader.moveUp();
        }
The unmarshalRecord method does not detect duplicate fields in the XML. In doUnmarshal, the SeenFields class (lines 618-634) is used to track which fields have been deserialized and throw a DuplicateFieldException if a field appears twice in the XML. Without this check, if malformed XML contains duplicate field definitions for a record, the last value will silently override earlier values instead of failing with a clear error message. This could mask data corruption issues.
Duplicate elements in the XML would just overwrite the value in the map before we get to the constructor call. We could add a check on the put, but I've never actually seen duplicate elements in real Jenkins config XML. Happy to add one if you think it's worth the extra code.
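If such a guard were added, it could look like this minimal sketch. The helper name putOnce and the thrown exception type are placeholders; the real converter would likely throw XStream's DuplicateFieldException instead.

```java
import java.util.HashMap;
import java.util.Map;

public class DuplicateFieldGuard {
    // Fail fast on a repeated element name instead of silently
    // overwriting the earlier value in the map.
    static void putOnce(Map<String, Object> values, String fieldName, Object value) {
        if (values.containsKey(fieldName)) {
            // Hypothetical error; real code could use DuplicateFieldException.
            throw new IllegalStateException("Duplicate field in XML: " + fieldName);
        }
        values.put(fieldName, value);
    }

    public static void main(String[] args) {
        Map<String, Object> values = new HashMap<>();
        putOnce(values, "name", "agent-1");
        try {
            putOnce(values, "name", "agent-2");
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage()); // Duplicate field in XML: name
        }
    }
}
```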
        try {
            java.lang.reflect.RecordComponent[] components = type.getRecordComponents();
            Class[] parameterTypes = new Class[components.length];
            Object[] args = new Object[components.length];
            for (int i = 0; i < components.length; i++) {
                java.lang.reflect.RecordComponent component = components[i];
                Class pType = component.getType();
                String name = component.getName();
                parameterTypes[i] = pType;
                Object val = values.get(name);
                if (val == null && pType.isPrimitive()) {
                    val = java.lang.reflect.Array.get(java.lang.reflect.Array.newInstance(pType, 1), 0);
                }
                args[i] = val;
            }
            java.lang.reflect.Constructor constructor = type.getDeclaredConstructor(parameterTypes);
            constructor.setAccessible(true);
            return constructor.newInstance(args);
        } catch (Exception e) {
            throw new ConversionException("Failed to instantiate record " + type.getName(), e);
        }
    }
The unmarshalRecord method does not handle Saveable error reporting. Unlike doUnmarshal (lines 486-505), this method doesn't check if the record implements Saveable and report any ReadErrors via OldDataMonitor. If a record type implements Saveable and has errors during deserialization, those errors will not be tracked or reported, leading to inconsistent error handling compared to regular classes.
Records are value types; they get embedded inside Saveable objects rather than implementing Saveable themselves. The outer object's doUnmarshal still handles OldDataMonitor reporting, so errors still get tracked.
This PR adds significant new functionality for record support but does not include any tests in RobustReflectionConverterTest.java. Based on the existing test coverage patterns in the codebase, comprehensive tests should be added to verify: 1) basic record unmarshaling with various field types, 2) records with primitive fields (to test the default value logic at lines 370-372), 3) records with null fields, 4) error handling for malformed XML, 5) attribute-based fields in records, 6) child element fields in records, and 7) any readResolve behavior if applicable.
Already added in a follow-up commit: four tests covering basic unmarshal, missing field defaults, extra XML elements, and a full round-trip through XStream2.
The unmarshalRecord method does not handle critical fields (as defined via addCriticalField). In doUnmarshal at lines 422-431 and 471-473, critical fields are identified and errors in their deserialization cause CriticalXStreamException to be thrown rather than being silently added to ReadError. Records with critical fields will not have this protection, potentially allowing errors in critical fields to be silently ignored instead of failing fast as intended.
Nobody's registering critical fields for record components, since records aren't used for persisted config yet. And CriticalXStreamException is already propagated in the catch block at line 352, so it won't get swallowed.
There is inconsistent error handling between attribute processing and child element processing in unmarshalRecord. For attributes, type conversion failures throw a ConversionException (line 319), while for child elements the same type of error only logs a warning (line 343) and skips adding the value. This inconsistency could cause confusion: either both should throw exceptions or both should log warnings. The standard doUnmarshal pattern would be to throw for attributes but be more lenient for child elements.
This is actually consistent with how doUnmarshal works: attributes throw on type mismatch (line 409) and child elements log and skip (line 452). I kept the same pattern here on purpose.
The unmarshalRecord method does not call serializationMethodInvoker.callReadResolve on the returned record instance. Records can have readResolve methods, and the standard unmarshal path calls this method at line 296. Without this call, record deserialization will not properly invoke any readResolve methods defined on record types, which could break existing functionality if records with readResolve are used.
Good catch, missed this one. I'll add it before the return. Pushing a fix.
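A self-contained sketch of the fix being discussed, using only JDK reflection. The Ratio record and callReadResolve helper are illustrative; the PR itself would call XStream's serializationMethodInvoker.callReadResolve on the constructed record before returning.

```java
import java.lang.reflect.Method;

public class RecordReadResolveSketch {
    // Hypothetical record whose readResolve canonicalizes instances.
    record Ratio(int num, int den) {
        private Object readResolve() {
            int g = gcd(Math.abs(num), Math.abs(den));
            return g > 1 ? new Ratio(num / g, den / g) : this;
        }
        private static int gcd(int a, int b) { return b == 0 ? a : gcd(b, a % b); }
    }

    // Invoke readResolve (if declared) on a freshly constructed record.
    static Object callReadResolve(Object instance) throws Exception {
        try {
            Method m = instance.getClass().getDeclaredMethod("readResolve");
            m.setAccessible(true);
            return m.invoke(instance);
        } catch (NoSuchMethodException e) {
            return instance; // no readResolve declared, keep the instance as-is
        }
    }

    public static void main(String[] args) throws Exception {
        Object resolved = callReadResolve(new Ratio(2, 4));
        System.out.println(resolved); // Ratio[num=1, den=2]
    }
}
```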
    parameterTypes[i] = pType;
    Object val = values.get(name);
    if (val == null && pType.isPrimitive()) {
        val = java.lang.reflect.Array.get(java.lang.reflect.Array.newInstance(pType, 1), 0);
The primitive default value initialization at lines 370-372 uses a complex reflection pattern with Array.newInstance to get default primitive values (0, false, etc.). While this works, it's unnecessarily complex and potentially confusing. A clearer approach would be to use a simple switch statement or map for primitive defaults, which would be more readable and maintainable. The current implementation creates a new array instance just to extract the default value at index 0.
    if (pType == boolean.class) {
        val = false;
    } else if (pType == char.class) {
        val = '\0';
    } else if (pType == byte.class) {
        val = (byte) 0;
    } else if (pType == short.class) {
        val = (short) 0;
    } else if (pType == int.class) {
        val = 0;
    } else if (pType == long.class) {
        val = 0L;
    } else if (pType == float.class) {
        val = 0.0f;
    } else if (pType == double.class) {
        val = 0.0d;
    }
It's a one-liner that handles all 8 primitive types without a switch or map. I've seen this pattern used in a few places in JDK internals. A map would be more readable but also more code. Can change it if you prefer, but it's not really hurting anything.
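The trick is easy to verify in isolation. In this standalone snippet (defaultValue is a hypothetical helper, not the PR's code), a freshly allocated one-element array is zero-initialized, so Array.get returns the boxed default for any primitive type:

```java
import java.lang.reflect.Array;

public class PrimitiveDefaultSketch {
    // A new array is zero-initialized, so element 0 carries the
    // primitive type's default value, already boxed by Array.get.
    static Object defaultValue(Class<?> type) {
        return Array.get(Array.newInstance(type, 1), 0);
    }

    public static void main(String[] args) {
        System.out.println(defaultValue(int.class));     // 0
        System.out.println(defaultValue(boolean.class)); // false
        System.out.println(defaultValue(double.class));  // 0.0
        System.out.println(defaultValue(char.class));    // NUL character (prints as nothing visible)
    }
}
```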
Good call, thanks @MarkEWaite! Yes, absolutely, it makes total sense to include those in the main test suite instead of keeping them external. I've added them to RobustReflectionConverterTest.
All tests go through XStream2. Let me know if you'd like any additional scenarios covered.
andreahlert left a comment
@subwaycookiecrunch could you address or reply to the Copilot review comments (readResolve, critical fields, duplicate detection, Saveable)? That would help move the PR forward.
@andreahlert sure thing, I just went through and replied to all of them inline.
Fixes #26077
Currently, if you try to unmarshal a Java record using RobustReflectionConverter on Java 15+, it blows up with java.lang.UnsupportedOperationException: can't get field offset on a record class. This happens because XStream tries to inject fields via sun.misc.Unsafe by default, which modern Java blocks for records since they are strictly immutable.

This PR fixes it by catching record types right before they try to instantiate. Since Jenkins still compiles against Java 11 (meaning we can't use the actual record keyword in the source), I just used a safe reflection check: type.getSuperclass().getName().equals("java.lang.Record"). When it hits a record, it simply parses all the attributes/children into a map, grabs the RecordComponents, and instantiates it properly through the canonical constructor instead of trying to hack the offsets.

Testing done

Wrote a standalone test invoking this exact reflection logic against a Java 17 record to verify Class.getMethod("getRecordComponents") properly reconstructs the instance without throwing UnsupportedOperationException and maps the fields perfectly.

Screenshots (UI changes only)
Before
After
Proposed changelog entries

record types in XStream2 RobustReflectionConverter

Proposed changelog category

/label bug
Proposed upgrade guidelines
N/A
Submitter checklist
@Restricted or have @since TODO Javadocs, as appropriate. @Deprecated(since = "TODO") or @Deprecated(forRemoval = true, since = "TODO"), if applicable. eval to ease future introduction of Content Security Policy (CSP) directives (see documentation).

Desired reviewers
@jglick