
Cross Entropy#26

Open
CeriAnneLaureyssens wants to merge 17 commits into MichielStock:master from CeriAnneLaureyssens:master

Conversation

@CeriAnneLaureyssens

First check-up file
Some layout work still needs to be done: inserting more descriptive comments into the code itself, adding some examples of how and when to use the code, and (maybe) adding some visuals.
@MichielStock
Owner

Looks nice!

  • can you rename your notebook to just crossentropy?
  • to keep in tune with the course, I prefer 'objective' to 'loss'
  • Int64(floor(0.1*N)) => floor(Int, 0.1*N)
  • you might look to 'Algorithms for Optimization' for some additional pointers
  • explain it in words and with some formulas, add a couple of plots to explain.
  • why not use a fixed number of samples? Similarly, I would just use a fixed number of elites: keep it simple!
  • @assert length(losses) == N looks quite unnecessary
  • all(weights .≈ 0.) won't work (check the docs); better use all(abs.(weights) .< 1e-8)
  • you can test them on some of the toy examples we used in the course. Would be fun to visualize the path!

Good luck!
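The points above (a fixed number of samples, a fixed number of elites, `floor(Int, 0.1*N)`, and calling it an objective rather than a loss) can be sketched in a minimal toy loop. This is only an illustration, not the notebook's code; all names and parameter choices here are hypothetical:

```julia
using Random, Statistics

# Toy cross-entropy method minimizing an objective f over the reals,
# sampling from a Gaussian and refitting it to the elite samples.
function cross_entropy(f; μ=0.0, σ=5.0, N=100, iters=20)
    n_elite = floor(Int, 0.1 * N)            # fixed number of elites, as suggested
    for _ in 1:iters
        samples = μ .+ σ .* randn(N)         # fixed number of samples per iteration
        objectives = f.(samples)             # 'objective' rather than 'loss'
        elites = samples[sortperm(objectives)[1:n_elite]]
        μ, σ = mean(elites), std(elites)     # refit the sampling distribution
    end
    return μ
end

Random.seed!(1234)
cross_entropy(x -> (x - 3)^2)   # converges near the minimizer x = 3
```

Keeping the sample and elite counts fixed per iteration, as suggested, makes each refitting step identical and the code much easier to reason about.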

@MichielStock
Owner

Something is wrong with the notebook file. Better fix this!

@CeriAnneLaureyssens
Author

Should be okay now; I made a mistake by deleting my old file and removing the folder completely. My bad!

@MichielStock
Owner

Take care of the mathematics: \text{argmin}, \log (so they are correctly typeset in math mode). If you use images, just copy the link ![](link to figure), straight from the GitHub repo.
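For instance, a maximum-likelihood-style estimator could be typeset in math mode along these lines (a hedged sketch; the exact objective and symbols are the ones in the notebook):

```latex
\theta^* = \operatorname*{arg\,min}_{\theta}\; -\frac{1}{N}\sum_{i=1}^{N} \log f(x_i; \theta)
```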

@menvdamm

Hi!

I just took a look at your notebook and I'm very impressed with your coding skills, nice job! The subject you chose definitely doesn't seem like one of the easiest, so I think the project might benefit from a little bit more explanation or examples about what this technique is actually concretely used for.

Some other small remarks:

  • where you describe the implementation of the CE method: maybe put terms like 'θgstar' and 'θiteration' in math mode with subscripts to make them look a little nicer
  • in the importance sampling example, in the code chunk of the loss function: a small typo, 'of intereste' instead of 'of interest'
  • like Michiel already said, changing 'loss function' to 'objective function' might make it a little easier to understand

@Triana-Forment

  • Use $f$ and $g$ consistently for the names of the probability distributions, as well as for each variable used in a formula and then described in the text ($D$, $E$, $X$…)

  • $θ_{g}^*$ instead of 'θ gstar'

  • When you mention code functions in the text, I would format them as code, e.g. Random.seed!(1234), instead of writing them as plain text

  • These are just minor aesthetic corrections that make it easier to read.

  • I am not clear on the concept of 'batched', so I tried adding batched = true as an argument when running the cross_entropy_method function: is_dist_opt = cross_entropy_method(l, is_dist_0; max_iter = 4, N = 10000, weight_fn = w, verbose = true, batched = true), and I obtained the error:

ArgumentError: invalid index: :x of type Symbol

to_index(::Symbol)@indices.jl:300
to_index(::Vector{Dict{Symbol, Vector{Int64}}}, ::Symbol)@indices.jl:277
to_indices@indices.jl:333[inlined]
to_indices@indices.jl:325[inlined]
getindex@abstractarray.jl:1170[inlined]
l(::Dict{Symbol, Vector{Distributions.Sampleable}}, ::Vector{Dict{Symbol, Vector{Int64}}})@Other: 3
var"#cross_entropy_method#1"(::Int64, ::Int64, ::Float64, ::Int64, ::Int64, ::typeof(Main.workspace6.w), ::Random._GLOBAL_RNG, ::Bool, ::Bool, ::Bool, ::Main.workspace4.var"#7#13", ::typeof(Main.workspace4.cross_entropy_method), ::typeof(Main.workspace5.l), ::Dict{Symbol, Vector{Distributions.Sampleable}})@Other: 29
top-level scope@Local: 5[inlined]

So maybe you could provide an example of running the function with the argument batched = true.

  • In the README.md file, I would move the second and third paragraphs into the notebook, to clarify what the dictionary that represents the distributions contains.

  • References are missing.

For the rest, good job! It doesn't seem like an easy algorithm to implement (e.g. the Distributions package is totally new to me…).
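Judging from the stack trace above, the batched = true error happens because the objective then receives the whole vector of sampled dictionaries at once, so indexing it directly with :x fails. A minimal sketch of the difference, with a purely hypothetical objective (the real l, w, and distribution setup are the notebook's, not shown here):

```julia
# Hypothetical sampled dictionaries, mimicking the
# Dict{Symbol, Vector{Int64}} elements seen in the stack trace.
samples = [Dict(:x => [1, 2, 3]), Dict(:x => [4, 5, 6])]

# Non-batched objective: scores one sample dictionary at a time.
l(d, sample) = sum(sample[:x])

# Batched objective: receives the whole vector of samples at once and
# must return one objective value per sample.
l_batched(d, samples) = [sum(s[:x]) for s in samples]

l_batched(nothing, samples)  # → [6, 15]
```

So an example with batched = true would need the objective written in this second, vectorized form.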

@CeriAnneLaureyssens
Author

@MichielStock I have uploaded my final version on my GitHub repo and on Ufora

