dry run mode for --jira-ticket parameter in paasta spark-run #164
base: master
Conversation
        jira_ticket: The Jira ticket provided by the user
    """
    # Get the jira ticket validation setting
    flag_enabled = self.mandatory_default_spark_srv_conf.get('spark.yelp.jira_ticket.enabled', 'false')
are we storing a string or bool in srv-configs for this?
string
requirements.txt
Outdated
@@ -4,3 +4,4 @@ pyyaml >= 3.0
 typing-extensions==4.13.2
 # To resolve the error: botocore 1.29.125 has requirement urllib3<1.27,>=1.25.4, but you'll have urllib3 2.0.1 which is incompatible.
 urllib3==1.26.15
+yelp-clog==7.2.0
i have no idea what we're doing with this requirements file - but that's probably a problem to fix later
that said: we'll probably need to split things up into a yelpy and oss requirements.txt and pick between them based on where tests are being run from (see Tron and PaaSTA for examples)
additionally, we'll need to figure out what to do with https://github.com/Yelp/service_configuration_lib/blob/master/setup.py#L30-L36 - which is how apps that use this library will know what dependencies to pull in
i think we'll either need to have install_requires be read from a file that we can pick between in setup.py based on where we're being run from (either that, or add yelp-clog as an extra)
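Both options from the comment above can be sketched briefly; the file names and helper are illustrative, not from this PR:

```python
# Hypothetical sketch of the two packaging options mentioned above.
from typing import Dict, List


def read_requirements(path: str) -> List[str]:
    # Option 1: pick requirements-oss.txt or requirements-yelp.txt at build
    # time and feed the parsed list to install_requires in setup.py.
    with open(path) as f:
        return [
            line.strip()
            for line in f
            if line.strip() and not line.lstrip().startswith('#')
        ]


# Option 2: keep OSS deps in install_requires and gate yelp-clog behind an
# extra, so only `pip install "service-configuration-lib[yelp]"` pulls it in.
EXTRAS_REQUIRE: Dict[str, List[str]] = {'yelp': ['yelp-clog']}
```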
service_configuration_lib/utils.py
Outdated
@@ -217,3 +227,161 @@ def get_spark_driver_memory_overhead_mb(spark_conf: Dict[str, str]) -> float:
     )
     driver_mem_overhead_mb = driver_mem_mb * driver_mem_overhead_factor
     return round(driver_mem_overhead_mb, 5)


 def _load_default_service_configurations_for_clog() -> Optional[Dict[str, Any]]:
i'm pretty sure we can obviate all of this if we do something like what we do in paasta:
    try:
        import clog
    except ImportError:
        clog = None

    ...

    if clog is None:
        print("CLog logger unavailable, exiting.", file=sys.stderr)
        return

    clog.config.configure(
        scribe_host="169.254.255.254",
        scribe_port=1463,
        monk_disable=False,
        scribe_disable=False,
    )

    ...

    clog.log_line(STREAM_WE_WANT_TO_LOG_TO, MESSAGE_WE_WANT_TO_LOG)
Thank you for this suggestion. It worked with one change:

    service_configuration_lib/spark_config.py:1059: error: Argument "scribe_port" to "configure" has incompatible type "int"; expected "Optional[str]"  [arg-type]
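Putting the suggestion and the fix together, a sketch of the guarded-import pattern with `scribe_port` passed as a string to satisfy mypy; the function name and stream argument are hypothetical, only the configure values and the int-vs-str change come from this thread:

```python
import sys

try:
    import clog  # yelp-internal package; absent in OSS installs
except ImportError:
    clog = None


def log_jira_warning(stream: str, message: str) -> None:
    # Hypothetical helper name; degrades to a stderr notice when clog is
    # unavailable instead of failing the whole spark-run invocation.
    if clog is None:
        print("CLog logger unavailable, skipping log.", file=sys.stderr)
        return
    clog.config.configure(
        scribe_host="169.254.255.254",
        scribe_port="1463",  # str, not int: mypy expects Optional[str] here
        monk_disable=False,
        scribe_disable=False,
    )
    clog.log_line(stream, message)
```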
What this change does
- Extracts jira_ticket handling into a separate private method. We still run all tests through the public interface get_spark_conf().
- Splits requirements into requirements-oss.txt with all the previous dependencies and requirements-yelp.txt to include yelp-clog. We need yelp-clog to write warning messages to monk when the --jira-ticket parameter is not passed.
- Consumers should depend on service-configuration-lib[yelp] >= 3.3.3 going forward. PaaSTA and spark_tools depend on this lib.
- When spark.yelp.jira_ticket.enabled is false in srv-configs, logs a warning to monk with the user param passed to get_spark_conf(). Ensures that paasta validate and paasta m-f-d work as usual.
- Adds jenkins to the allowed users list (proof that this user is indeed called jenkins).
Testing
When flag is disabled
When flag is enabled