
CBORDecoder.read(n) raises MemoryError on large n instead of CBORDecodeEOF #285

@coco1629

Description

@coco1629

Things to check first

  • I have searched the existing issues and didn't find my bug already reported there

  • I have checked that my bug is still present in the latest release

cbor2 version

5.8.0

Python version

3.9.18

What happened?

When CBORDecoder.read(n) is called with n exceeding the actual stream length, a moderately large n (e.g. 10) correctly raises _cbor2.CBORDecodeEOF. However, a very large n (e.g. 10**18) never reaches this check and raises MemoryError instead.

How can we reproduce the bug?

import io
import cbor2

dec = cbor2.CBORDecoder(fp=io.BytesIO(b"111"), str_errors="error")

# raises CBORDecodeEOF
dec.read(10)

# raises MemoryError
dec2 = cbor2.CBORDecoder(fp=io.BytesIO(b"111"), str_errors="error")
dec2.read(10**18)

Traceback:

When n = 10:

Traceback (most recent call last):
  File "test.py", line 38, in <module>
    ret = obj.read(10)
_cbor2.CBORDecodeEOF: premature end of stream (expected to read 10 bytes, got 3 instead)

When n = 10**18:

Traceback (most recent call last):
  File "test.py", line 38, in <module>
    ret = obj.read(1000000000000000000)
MemoryError

Expected behavior: CBORDecoder.read(10**18) should raise _cbor2.CBORDecodeEOF (the same as read(10)), not MemoryError.
