
Tokenizer crash when passing an array of hashes - Add support for "key[0].foo" #208

@ndbroadbent

Hello, I've been using dentaku for a few years, and it's been really great! Thanks for all your work on it!

I just ran into a crash where I need to process some formulas inside an array of hashes. (My case is that I have an array of fields in a table, where each hash represents a row in the table.)

I've been using a fork that just includes this one commit with an "expand" option: DocSpring@458ddd1

EDIT: It looks like Dentaku now calls FlatHash.expand to expand the result by default! That's great.
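For context, here's a rough sketch in plain Ruby of what that expand step does conceptually (this is not Dentaku's actual FlatHash.expand code, and I'm using a dot separator purely for illustration): it turns flat dotted keys back into a nested hash.

```ruby
# Illustration only: rebuild a nested hash from flat dotted keys.
# Not Dentaku's real implementation.
def expand_flat(flat)
  flat.each_with_object({}) do |(key, value), out|
    *path, last = key.to_s.split(".")
    node = path.reduce(out) { |h, part| h[part] ||= {} }
    node[last] = value
  end
end

p expand_flat("weekly_budget.apples" => 6.65, "weekly_budget.pears" => 13.5)
# => {"weekly_budget"=>{"apples"=>6.65, "pears"=>13.5}}
```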

Here's the branch in my fork, where I've added a failing test case with this "array of hashes" case: https://github.com/DocSpring/dentaku/tree/array_of_hashes

Failing test:

    it "evaluates expressions in hashes and arrays, and expands the results" do
      calculator.store(
        fruit_quantities: {
          apple: 5,
          pear: 9
        },
        fruit_prices: {
          apple: 1.66,
          pear: 2.50
        }
      )
      expressions = {
        weekly_budget: {
          fruit:  "weekly_budget.apples + weekly_budget.pears",
          apples: "fruit_quantities.apple * discounted_fruit_prices.apple",
          pears:  "fruit_quantities.pear * discounted_fruit_prices.pear",
        },
        discounted_fruit_prices: {
          apple: "round(fruit_prices.apple * discounts[0], 2)",
          pear: "round(fruit_prices.pear * discounts[1], 2)"
        },
        discounts: ["0.4 * 2", "0.3 * 2"],
        percents: [
          { percent: 75 },
          { percent: 35 },
          { percent_avg: "round((percents[0].percent + percents[1].percent) / 2)" },
        ]
      }
      solver = described_class.new(expressions, calculator)

      expect(solver.solve!(expand: true)).to eq(
        "weekly_budget" => {
          "fruit" => 20.15,
          "apples" => 6.65,
          "pears" => 13.50
        },
        "discounted_fruit_prices" => {
          "apple" => 1.33,
          "pear" => 1.50
        },
        "discounts" => [0.8, 0.6],
        "percents" => [
          { "percent" => 75 },
          { "percent" => 35 },
          { "percent_avg" => 55 },
        ]
      )
    end

Failure error message:

$ rspec ./spec/bulk_expression_solver_spec.rb:81


  1) Dentaku::BulkExpressionSolver#solve! evaluates expressions in hashes and arrays, and expands the results
     Failure/Error: raise TokenizerError.for(reason, meta), message

     Dentaku::TokenizerError:
       parse error at: ':percent=>75}'
     # ./lib/dentaku/tokenizer.rb:107:in `fail!'
     # ./lib/dentaku/tokenizer.rb:27:in `tokenize'
     ...

(This test was previously passing when it didn't include the "percents" array.)

I can see that an array of strings is supported (e.g. discounts: ["0.4 * 2", "0.3 * 2"]), but the tokenizer doesn't seem to handle an array of hashes — the hash appears to get stringified and then fed to the tokenizer, which is where the parse error comes from.
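For what it's worth, here's a rough sketch of index-aware flattening that would make the array-of-hashes case addressable (plain Ruby, not Dentaku's actual FlatHash — the key separator and naming are just for illustration). Each array element gets its index in the key path, so a hash inside an array becomes a set of flat entries instead of an opaque value:

```ruby
# Illustration only: flatten nested hashes AND arrays into flat keys,
# so an array of hashes yields entries like "percents.0.percent".
# Not Dentaku's real implementation.
def flatten_structure(value, prefix = nil, acc = {})
  case value
  when Hash
    value.each { |k, v| flatten_structure(v, [prefix, k].compact.join("."), acc) }
  when Array
    value.each_with_index { |v, i| flatten_structure(v, [prefix, i].compact.join("."), acc) }
  else
    acc[prefix] = value
  end
  acc
end

p flatten_structure(
  discounts: ["0.4 * 2", "0.3 * 2"],
  percents: [{ percent: 75 }, { percent: 35 }]
)
# => {"discounts.0"=>"0.4 * 2", "discounts.1"=>"0.3 * 2",
#     "percents.0.percent"=>75, "percents.1.percent"=>35}
```

With something like this in place, each flattened value is a plain string or number, so the tokenizer would never see a raw hash.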

I'm not quite sure where to start with this, so I was wondering if you might have any idea about how to handle this case in the tokenizer?

Thanks for your time, and no problem at all if you don't have time to look into this! (I know it's a very obscure case!)
