
Eliminating bignums by leveraging space unsafety #786

Open
@sorear

Description


COUNT_LIST is a fairly hot function in the current compiler (whether it should be is a separate issue; let's pretend it's desired). It is compiled with bignum support, but a human can easily see that bignums aren't actually reachable there: the numbers count elements of a list that already exists in memory, and memory will always be exhausted long before the count leaves the small-integer range. Can we automate this reasoning in any way? Does any existing ML implementation do such reasoning?
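A minimal sketch of the reasoning, in OCaml syntax rather than the compiler's own source; the loop shape, the representation costs, and the name length_acc below are assumptions for illustration, not the actual COUNT_LIST definition:

```ocaml
(* Hedged sketch: a counting loop whose accumulator is bounded by the
   number of cons cells already allocated in the heap. *)
let rec length_acc acc = function
  | []      -> acc
  | _ :: tl -> length_acc (acc + 1) tl

(* Example: length_acc 0 [10; 20; 30] evaluates to 3. *)

(* The bound argument a human applies (assumed representation costs):
   - each cons cell occupies at least two machine words, i.e. >= 16 bytes
     on a 64-bit target, so a 2^64-byte address space holds < 2^60 cells;
   - the accumulator therefore never exceeds 2^60 and always fits in a
     tagged small integer, so the bignum fallback on [acc + 1] is dead;
   - an optimizer that trusts this bound can compile the loop with plain
     machine-word arithmetic and no overflow check. *)
```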

Any such optimization would need to run fairly late, since it has to assume that no deforestation-type optimization runs afterwards: removing the intermediate list would also remove the memory bound the argument relies on.
