
OK Grader receiving additional floating point values from Python data science package tables #1419

Open
@bbean02

Description

While using OK Grader via Otter Grader with Python data science package tables, the client-side OK tests receive floating point values with additional precision from Python. In one specific Data 8 lab, values are passed from a table column after sorting, and while the table output shows 4-5 digits of precision, the output for one of the three array range values randomly picks up nine additional zeros of precision followed by a 5 (64.4455000000000005, versus the 64.445500 that OK Grader expects).
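
For context, this may simply be ordinary binary floating point behaviour rather than something the table adds: a decimal value that displays cleanly at 5-6 digits can reveal trailing digits when printed at higher precision. A minimal sketch (the value is taken from the report above; the exact trailing digits depend on the binary representation):

```python
value = 64.4455
print(f"{value:.6f}")   # 64.445500 -- what the table display shows
print(f"{value:.16f}")  # may show trailing non-zero digits, e.g. 64.4455000000000005
```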

Attempts to rectify this by coercing the formatter with set_format on the problem column do not work. Changing the format of the input file to eliminate encoding issues also does not work. There seems to be no way to eliminate the floating point precision issue.
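
If a workaround on the lab side is acceptable, rounding the values pulled out of the column (rather than relying on set_format, which only affects display) keeps the comparison stable. A hedged sketch, assuming the Data 8 datascience package and a hypothetical column name:

```python
import numpy as np
from datascience import Table

# Hypothetical table and column name, used only to illustrate the workaround.
t = Table().with_columns("temperature", np.array([70.1234, 64.4455, 59.9876]))
sorted_t = t.sort("temperature")

# set_format changes only the displayed text; the underlying floats keep
# their full binary precision, which is what the OK tests end up comparing.
# Rounding the extracted array gives the tests stable decimal values.
values = np.round(sorted_t.column("temperature"), 4)
print(values)  # e.g. [59.9876 64.4455 70.1234]

# Alternatively, a test could compare with a tolerance instead of exact equality:
assert np.isclose(values[1], 64.4455)
```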
