
tool crashes with large input files in binary mode #8

@lukas0820

Description


The following error occurs when decompressing large input files (>2 GB) in binary mode:

Exception in thread "main" java.lang.OutOfMemoryError: Requested array size exceeds VM limit
	at java.util.Arrays.copyOf(Arrays.java:3236)
	at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
	at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
	at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:135)
	at org.openmainframeproject.tersedecompress.TerseDecompresser.PutChar(TerseDecompresser.java:108)
	at org.openmainframeproject.tersedecompress.NonSpackDecompresser.decode(NonSpackDecompresser.java:81)
	at org.openmainframeproject.tersedecompress.TerseDecompress.process(TerseDecompress.java:104)
	at org.openmainframeproject.tersedecompress.TerseDecompress.main(TerseDecompress.java:113)

The problem can be fixed by limiting the maximum record size in PutChar(TerseDecompresser.java:108):

record.write(X - 1);
// Flush the record buffer before it can grow past the JVM's maximum array size.
final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 10;
if (record.size() >= MAX_ARRAY_SIZE) {
    endRecord();
}
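A minimal, self-contained sketch of the proposed guard, assuming `record` is a `ByteArrayOutputStream` and `endRecord()` flushes and resets it (names modeled on the issue; the tiny `MAX_ARRAY_SIZE` here replaces `Integer.MAX_VALUE - 10` only so the demo runs quickly):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class BoundedRecordDemo {
    // In the real fix this would be Integer.MAX_VALUE - 10; shrunk for the demo.
    static final int MAX_ARRAY_SIZE = 8;

    static ByteArrayOutputStream record = new ByteArrayOutputStream();
    static int recordsFlushed = 0;

    // Mirrors the patched PutChar: buffer one byte, then flush the record
    // if the in-memory buffer has reached the size limit.
    static void putChar(int x, OutputStream out) throws IOException {
        record.write(x - 1);
        if (record.size() >= MAX_ARRAY_SIZE) {
            endRecord(out);
        }
    }

    // Hypothetical stand-in for endRecord(): write the buffered bytes out
    // and reset the buffer so it never approaches the JVM array limit.
    static void endRecord(OutputStream out) throws IOException {
        record.writeTo(out);
        record.reset();
        recordsFlushed++;
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        for (int i = 0; i < 20; i++) {
            putChar(1, sink); // each call buffers one byte
        }
        // 20 bytes with a limit of 8: flushes after bytes 8 and 16,
        // leaving 4 bytes buffered.
        System.out.println("flushed=" + recordsFlushed
                + " buffered=" + record.size());
    }
}
```

The point of the guard is that `ByteArrayOutputStream` doubles its backing array when it grows, so without a cap the next `grow()` near 2 GB requests an array larger than the VM allows, producing the `OutOfMemoryError` above.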

Labels: bug