Increase max file size to 2 MB #74
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff            @@
##           develop      #74   +/-   ##
========================================
  Coverage    96.31%   96.31%
========================================
  Files            4        4
  Lines          217      217
========================================
  Hits           209      209
  Misses           8        8

☔ View full report in Codecov by Sentry.
I'd be happy to make the change, but it would be helpful to memorialize a short explanation here.
I think this is redundant because of #70 too?

Was short on time earlier. Updated now!

There are some changes coming down the pipeline (pun intended) to EECS 485 Project 5. The first stage of the MapReduce pipeline will process pages in parallel map tasks. Currently, the first mapper is a single task which processes all input files (each file is a wiki page).
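To make the planned change concrete, here is a minimal sketch of how a framework could launch one map task per input file in parallel instead of feeding every file to a single task. The `map.py` executable name and the helper functions are assumptions for illustration, not the project's actual implementation.

```python
"""Sketch: one map task per input file, run in parallel (hypothetical)."""
import pathlib
import subprocess
from concurrent.futures import ThreadPoolExecutor


def run_map_task(map_exe, input_path, output_dir):
    """Run one mapper process whose stdin is a single whole input file."""
    output_path = pathlib.Path(output_dir) / f"part-{input_path.stem}"
    with open(input_path, "rb") as infile, open(output_path, "wb") as outfile:
        subprocess.run([map_exe], stdin=infile, stdout=outfile, check=True)
    return output_path


def run_map_stage(map_exe, input_dir, output_dir):
    """Launch map tasks in parallel, one per file in input_dir."""
    files = sorted(pathlib.Path(input_dir).glob("*"))
    with ThreadPoolExecutor() as pool:
        return list(pool.map(
            lambda path: run_map_task(map_exe, path, output_dir), files
        ))
```

In this sketch each mapper sees exactly one file on stdin, so the parallelism comes from running many such processes at once rather than from chunking one big input stream.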
awdeorio left a comment
LGTM
That sounds promising; is it related to this comment of mine?
It's different. HTML parsing will be part of the first pipeline stage (stage 0). There will be one mapper per file, which means one execution per HTML document.
Mappers will run per file instead of per line? That sounds like we're not going to be using a streaming interface anymore.
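For context on the streaming question: a per-file mapper still reads from stdin, but stdin now contains exactly one whole HTML document rather than a stream of independent lines. Below is a minimal sketch of such a mapper; the use of BeautifulSoup and the word/doc-id output format are assumptions for illustration, not something prescribed by this PR.

```python
#!/usr/bin/env python3
"""Sketch of a per-document mapper (hypothetical, not part of this PR).

Reads all of stdin as one HTML page instead of looping line by line.
"""
import sys
import bs4  # assumption: HTML is parsed with BeautifulSoup


def main():
    html = sys.stdin.read()  # whole file = one whole Wikipedia page
    soup = bs4.BeautifulSoup(html, "html.parser")
    doc_id = soup.title.string if soup.title else "unknown"
    for word in soup.get_text().split():
        print(f"{word.lower()}\t{doc_id}")


if __name__ == "__main__":
    main()
```

In this sketch the framework-facing interface (read stdin, write key-value lines to stdout) is unchanged; only the granularity of what arrives on stdin changes.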
To support EECS 485 P5: the HTML dataset changes use full Wikipedia pages. We want to make sure that each file is read in its entirety in a MapReduce job so that students can use the full document content in the pipeline. Some Wikipedia pages are larger than 1 MB, so let's bump the limit so they don't get split.
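To make the size limit concrete, here is a minimal sketch of the kind of input splitting being discussed; the constant name and the `split_file` helper are made up for illustration and may not match the project's actual code. With a 1 MB cap, a Wikipedia page larger than that would be divided across map inputs at a line boundary; raising the cap to 2 MB lets the whole page reach a single mapper.

```python
"""Sketch: why raising the cap keeps a page whole (hypothetical names)."""

MAX_INPUT_SPLIT_SIZE = 2 * 1024 * 1024  # raised from 1 MB to 2 MB


def split_file(path):
    """Yield chunks of at most MAX_INPUT_SPLIT_SIZE bytes, split on line
    boundaries.  With the 2 MB cap, a typical Wikipedia page fits in one
    chunk and is therefore seen whole by a single mapper.
    """
    chunk, size = [], 0
    with open(path, "rb") as infile:
        for line in infile:
            if chunk and size + len(line) > MAX_INPUT_SPLIT_SIZE:
                yield b"".join(chunk)
                chunk, size = [], 0
            chunk.append(line)
            size += len(line)
    if chunk:
        yield b"".join(chunk)
```

Under this sketch, bumping the constant is the whole behavioral change: files at or under 2 MB produce exactly one chunk, so no page is split across mappers.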