
Rule file provider #4633


Open: Nadahar wants to merge 11 commits into main from rule-file-provider

Conversation

@Nadahar (Contributor) commented Mar 5, 2025

Rule file provider

  • Create rule file provider for JSON files found in conf/automation/rules
  • Create rule file provider for YAML files in accordance with the "new YAML format" (RFC: YAML configuration #3666)
  • Create rule template provider for YAML files in accordance with the "new YAML format" (RFC: YAML configuration #3666)
  • Fix DELETE event bug in FolderObserver

Description

This PR has changed quite a bit from when it was originally created, so I've updated the description.

The primary task of this PR is to provide file-based YAML and JSON rules. The JSON parser supports multiple rules in one file if they are supplied within an array, which means enclosed in [ and ] and separated by ,.
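
To illustrate (a minimal, made-up example; the field and type names follow the full test files posted later in this thread, while the uids and values are invented):

[
  {
    "uid": "example:rule1",
    "name": "Example rule 1",
    "triggers": [
      {
        "id": "1",
        "configuration": { "time": "08:00" },
        "type": "timer.TimeOfDayTrigger"
      }
    ],
    "conditions": [],
    "actions": [
      {
        "id": "2",
        "configuration": {
          "type": "application/javascript",
          "script": "console.log('rule 1 ran');"
        },
        "type": "script.ScriptAction"
      }
    ]
  },
  {
    "uid": "example:rule2",
    "name": "Example rule 2",
    "triggers": [],
    "conditions": [],
    "actions": []
  }
]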

YAML files support both rules and rule templates, both in the "new YAML format" developed in #3666. There has been a separate RFC in #4797 discussing how to define the syntax for rules and rule templates. Rules use the rules element, while rule templates use the ruleTemplates element.
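
Purely as a hypothetical sketch of what such a file could look like (the exact schema is what #4797 is meant to settle, so the layout and property names below are assumptions rather than the final syntax, and any top-level keys the new format may require, such as a version marker, are omitted):

rules:
  - uid: example:morning_greeting
    name: Morning greeting
    triggers:
      - id: '1'
        configuration:
          time: '07:30'
        type: timer.TimeOfDayTrigger
    conditions: []
    actions:
      - id: '2'
        configuration:
          type: application/javascript
          script: |
            console.log('Good morning!');
        type: script.ScriptAction

ruleTemplates:
  - uid: example:morning_greeting_template
    label: Morning greeting template
    # essentially the same structure, plus configDescriptions for the
    # placeholders that get filled in when a rule is instantiated from it
    configDescriptions:
      - name: greeting
        label: Greeting text
        type: TEXT
        required: true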

This PR is very loosely coupled to #4591 in that they are both split out as PRs from the same "work branch". The FolderObserver file watcher DELETE fix has been included here, since it was discovered after #4591 was submitted.

It should also be mentioned that the documentation currently claims support for JSON based rules, but no evidence that this exists, or has existed in previous versions, has been found in the code:

The automation engine reads rule json files from the {openhab-dir}/automation/*.json directory.

The documentation should probably be updated to specify {openhab-dir}/automation/rules/*.json

Original PR description below:

Rule file provider

  • Create rule file provider for JSON and YAML files
  • Fix DELETE event bug in FolderObserver

Description

The primary task of this PR is to provide file-based YAML and JSON rules. It supports multiple rules in one file if they are supplied within an array, which means enclosed in [ and ] and separated by ,.

This PR is very loosely coupled to #4591 in that they are both split out as PRs from the same "work branch", and this PR uses the AbstractJacksonYAMLParser introduced in #4591. The FolderObserver file watcher DELETE fix has been included here, since it was discovered after #4591 was submitted.

There is one issue that might need to be discussed, and that is the location to look for the YAML and JSON files. There are two "obvious candidates", conf/automation/rules and conf/rules. The latter is the folder already in use for Rules DSL files, while the former isn't currently used (or automatically created when OH is installed). In either case, some minor changes to the documentation should be made as well, but it's hard to write that until the final location has been decided.

Currently, this PR uses conf/rules, but that's very easy to change. From a user's perspective, this might be the most logical choice, as these are still rules, although the format is different. It doesn't seem to cause any conflicts, but a debug log message is created both by the Rules DSL parser (that it ignores the .yaml or .json file) and by the AbstractFileProvider (that it doesn't have a parser for .rules files). Neither of these is shown by default. The involved classes could be modified to not log these cases, as they are expected, but both are "generic in nature" and introducing exceptions for specific file extensions doesn't really "fit".

If conf/automation/rules is used instead, the "conflict" of two parsers monitoring the same folder would be avoided, but it might be more confusing for the end user to have to place rule files in two different locations. It should also be considered whether this folder should be created when the installation is set up, as is done with many of the other folders.

It should also be mentioned that the documentation currently claims support for JSON based rules, but no evidence that this exists, or has existed in previous versions, has been found in the code:

The automation engine reads rule json files from the {openhab-dir}/automation/*.json directory.

If this actually is wrong, the documentation should be corrected.

@Nadahar requested a review from a team as a code owner on March 5, 2025 16:24
@Nadahar force-pushed the rule-file-provider branch from ba2b127 to 7fde6c3 on March 5, 2025 16:28
@lolodomo (Contributor) commented Mar 9, 2025

The primary task of this PR is to provide file-based YAML and JSON rules.

We are already in the process of adding YAML support for different kinds of things, with the ability to put everything in the same file.
I don't know exactly what you are doing with this PR, but we have to take care that we end up with something globally consistent, and also use the same YAML library.
If it makes no sense to consider the other YAML discussions in the current context, you can ignore my message.

@Nadahar (Contributor, Author) commented Mar 9, 2025

We are already in the process of adding YAML support for different kinds of things, with the ability to put everything in the same file.
I don't know exactly what you are doing with this PR, but we have to take care that we end up with something globally consistent, and also use the same YAML library.

I'm not familiar with this effort and exactly what it will end up doing, but this PR adds a rule file provider for both JSON and YAML. The YAML parser is the same as the one already used for rule templates, and it is the same YAML library (Jackson) that I've found used in the other places where YAML is parsed in Core.

This isn't related to any UI or how MainUI deals with YAML. It simply deserializes an "object", in either YAML or JSON, into an "OH Rule object".

@Nadahar force-pushed the rule-file-provider branch from 7fde6c3 to cd52101 on March 12, 2025 19:36
@Nadahar (Contributor, Author) commented Mar 13, 2025

What I need is some feedback on which folder to use. If conf/rules is preferred, this PR is ready. If not, I need to make some modifications.

@lolodomo (Contributor) commented:

You complain about getting no feedback on your PRs, but I see I already answered you. I still think, and I am not alone, that we should have a global approach for YAML; the idea is the ability to put all sorts of things in our new YAML file format, including items and things, but also rules. If you propose to create another YAML file, I am not sure it is the best idea.
Typically, we should simply add a "rules" element in our new YAML format.
That may be the reason you got no other feedback.

Regarding your last question about the folder, sorry, I don't have the answer; rule management is one of the OH parts I know least.

@Nadahar (Contributor, Author) commented Apr 28, 2025

You complain about getting no feedback on your PRs, but I see I already answered you. I still think, and I am not alone, that we should have a global approach for YAML; the idea is the ability to put all sorts of things in our new YAML file format, including items and things, but also rules. If you propose to create another YAML file, I am not sure it is the best idea.
Typically, we should simply add a "rules" element in our new YAML format.
That may be the reason you got no other feedback.

If that's the case, it should be made much clearer. I interpreted your comment more as a "please remember to think of this" than as a blocker.

To explain, this is part of a larger piece of work that I have isolated into a separate PR to make things more manageable to review. So, I have other code that builds on the capabilities this provides.

Currently, there are only two sources of rules in standard OH: Rules DSL files and managed rules created in the UI. In addition, add-ons can of course inject rules any way they like, but I don't think this is done much except by the scripting add-ons.

This means that there is no way to provide unmanaged rules unless they are written in Rules DSL. The Rules DSL syntax doesn't support rule templates, so if you want to use rule templates, you must use managed rules only. This PR provides a way to create unmanaged rules of any type; it's not bound to Rules DSL (although the rules CAN use Rules DSL or any other scripting language) and it supports rule templates.

According to the current documentation, this already exists for JSON - but I've both tested it and searched everywhere in Core for a trace of it, and it's simply not there. So, this in fact provides functionality that is already claimed to exist.

The reason for adding YAML in addition to JSON is that there's a "unique problem" with JSON when dealing with scripts. Almost all rules will contain at least one script, and since JSON doesn't allow multi-line strings, you have to provide the whole script as one long line with \n separators. That's not really practical, which is why YAML is very helpful, because it does support multi-line strings. If it weren't for that, I'd rather see YAML burn in a certain place, because I just can't stand whitespace-based formats. I think they are highly "volatile" to work with, and not any easier to read or write. It takes very little time to get used to a properly delimited syntax, and people should invest that little time instead of going down the path of all the problems whitespace-based formats cause.
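
To make the contrast concrete (an illustrative, made-up two-line script, not taken from this PR): in JSON the script has to be collapsed into a single string,

"script": "var hour = 7;\nconsole.log('Wake up at ' + hour);"

whereas a YAML block scalar keeps the line breaks:

script: |
  var hour = 7;
  console.log('Wake up at ' + hour);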

The YAML parser used for this is the same YAML parser already used for rule templates. Rules and rule templates are "almost the same thing", a rule template is a rule with some placeholders not yet filled in. So, it would be very logical, to me at least, that they support the same syntax and have the same possibilities.

I tried to find information about this "global approach for YAML", and I didn't really find it. I read through some PRs that I thought might be related, I searched the forum, but I still don't know what the idea really is. As such, it's difficult for me to know how these things would work together or conflict.

If the idea is that you can have one huge YAML file with all the different kinds of configuration in it, I'd say that I personally wouldn't find that attractive. It's comparable to writing a program as one huge class/unit of continuous code instead of separating unrelated things into a logical structure. I can see that it can be handy in some cases; for example, it might be handy to be able to define a Thing and the corresponding Item(s) in the same file. That's fine, but to go from there to forcing "everything" into the same file seems strange to me. I don't know if that's the idea though.

But, in any case, this new "global parser" would have to implement a huge number of Providers. As such, it would have to implement RuleProvider to support rules. There's no conflict with having a standalone rule provider and one "combined provider" that also provides rules. Are you planning to remove all the existing standalone providers for different things when this "combined provider" is in place? If so, the rule provider in this PR could be removed along with all the others. I still don't quite see the conflict.

@lolodomo (Contributor) commented Apr 29, 2025

This means that there is no way to provide unmanaged rules unless they are written in Rules DSL. The Rules DSL syntax doesn't support rule templates, so if you want to use rule templates, you must use managed rules only. This PR provides a way to create unmanaged rules of any type; it's not bound to Rules DSL (although the rules CAN use Rules DSL or any other scripting language) and it supports rule templates.
...
The YAML parser used for this is the same YAML parser already used for rule templates. Rules and rule templates are "almost the same thing", a rule template is a rule with some placeholders not yet filled in. So, it would be very logical, to me at least, that they support the same syntax and have the same possibilities.

The DSL rule syntax should have been enhanced to support templates. As you explained it, there is almost no difference between a rule and a rule template. This is something I could have a look at (but not soon).

I tried to find information about this "global approach for YAML", and I didn't really find it. I read through some PRs that I thought might be related, I searched the forum, but I still don't know what the idea really is. As such, it's difficult for me to know how these things would work together or conflict.

There is mainly one very big issue/RFC, #3666; if I am not wrong, you even posted a message in it.

If the idea is that you can have one huge YAML file with all the different kind of configuration in it, I'd say that I personally wouldn't find that attractive. It's comparable to write a program as one huge class/unit of continuous code instead of separating it into a logical structure separate unrelated things. I can see that it can be handy in some cases, it might be handy to be able to define a Thing and the corresponding Item(s) in the same file. That's fine, but to go from there to forcing "everything" into the same file seems strange to me. I don't know if that's the idea though.

Putting several kinds of things in the same YAML file will just be an option, of course. You could also have one file per item, per thing, per rule, if that is your preference.
I am pinging @rkoshak as I believe he is one of the big sponsors of a YAML file format that can contain different kinds of information, and I believe his idea also covered rules.

But, in any case, this new "global parser" would have to implement a huge number of Providers. As such, it would have to implement RuleProvider to support rules. There's no conflict with having a standalone rule provider and one "combined provider" that also provides rules. Are you planning to remove all the existing standalone providers for different things when this "combined provider" is in place? If so, the rule provider in this PR could be removed along with all the others. I still don't quite see the conflict.

No, we are adding something new, but there is no plan to remove the existing providers.

My point is that you want to define a new YAML file format dedicated to rules and rule templates, and I am just not sure it should be done outside our YAML model repository. The syntax should also be discussed so that it is as user-friendly as possible. I see in your PR that you aligned it with the JSON object.
@rkoshak: what is your feeling about that?

My suggestion at this time would be to just consider a JSON provider plus a potential extension of Rules DSL to support rule templates. But once again, that is only MY suggestion and not a decision against your proposal. Maybe there is something I do not understand well, as I am not familiar with rule templates.

These days I am finishing the support for items/metadata/channel links in the new YAML file format. The next step is probably support for rules and rule templates. The syntax will certainly be discussed in #3666.

@Nadahar (Contributor, Author) commented Apr 29, 2025

The DSL rule syntax should have been enhanced to support templates. As you explained it, there is almost no difference between a rule and a rule template. This is something I could have a look at (but not soon).

I don't think that would be very practical, but I guess I'd have to make some examples to show what I mean. However, this PR allows the use of templates both for Rules DSL and other scripts. The reason I don't think it's "practical" is that there's nowhere to store the "rule stub"/"rule configuration" in Rules DSL. This is stored in the configuration object of the rule itself, which doesn't "exist" in a Rules DSL context. Remember that a Rules DSL file is parsed into a Rule, and during that parsing, the content is split into trigger(s) and an action. The Rules DSL format is quite limited in that it doesn't support conditions and doesn't support multiple actions, for example. Neither does it allow storage of, for example, the configuration object. But the resulting object that is created by parsing a Rules DSL script does. With this PR, you can supply such an "already parsed" Rules DSL rule, where you can have as many triggers, conditions and actions as you want - in addition to said configuration.

There is mainly one very big issue/RFC, #3666; if I am not wrong, you even posted a message in it.

OK, then I failed to grasp the full intention of #3666. I tried to read it, but it is extremely long, so I couldn't keep "full focus" all the way.

Putting several kinds of things in the same YAML file will just be an option, of course. You could also have one file per item, per thing, per rule, if that is your preference.
I am pinging @rkoshak as I believe he is one of the big sponsors of a YAML file format that can contain different kinds of information, and I believe his idea also covered rules.

Yes, but fundamentally, to make it possible to put several things in the same YAML file, you must define some syntax that allows these to be parsed as different objects. This PR does no such thing; it only tries to deserialize a Rule from either YAML or JSON. As such, I still don't think there is much conflict, although you could maybe argue that this provider might become redundant when #3666 is in place, depending on how it is actually implemented. But I still think this provider has its purpose as a complement to the rule template parser that is already there, not to mention that the documentation states that this functionality (minus YAML) is already in OH.

My point is that you want to define a new YAML file format dedicated to rules and rule templates,

Again, I don't see it as "defining a format", as it's just a "raw mapping" of the actual structure of a Rule. There is no syntax that has any meaning specific to this provider, there is nothing for anybody to "learn". You can copy what you find in the "Code" tab in the UI and paste it in a file that will then be picked up by this provider.

My suggestion at this time would be to just consider a JSON provider plus a potential extension of Rules DSL to support rule templates.

The problem with that, as I see it, is how terribly inconvenient it is to work with scripts in JSON, having to edit and maintain an entire script on a single line with \n separators. Humans aren't really suited to that, so it would probably mean having to write the script in YAML or some other format and then convert it to JSON manually, outside of OH. It makes much more sense for OH to do that conversion for you.

@rkoshak commented Apr 29, 2025

OK, then I failed to grasp the full intention of #3666. I tried to read it, but it is extremely long, so I couldn't keep "full focus" all the way.

#3666 is creating a standardized way to define everything that is currently definable via DSL (i.e. .items, .things, .persist, and .rules) using YAML. It's coupled with other PRs which add the ability to have OH translate between the formats, and UI updates which allow one to see the DSL version of a Thing (for example) and not just a YAML representation of the JSON.

Anything done on this PR needs to be reconciled with what is going on in #3666, because we don't want to inadvertently introduce multiple YAML approaches.

You can copy what you find in the "Code" tab in the UI and paste it in a file that will then be picked up by this provider.

With #3666 and related PRs, the YAML you find in the Code tab will not necessarily be a straight mapping of the JSON any more. It might not even be YAML any more.

The problem with that, as I see it, is how terribly inconvenient it is to work with scripts in JSON, having to edit and maintain an entire script on a single line with \n separators. Humans aren't really suited to that, so it would probably mean having to write the script in YAML or some other format and then convert it to JSON manually, outside of OH. It makes much more sense for OH to do that conversion for you.

I think @lolodomo's intent is to suggest that the YAML part of this wait until more progress is made on #3666, and that support for YAML templates would then be added as part of #3666 when they get to that point.

I am pinging @rkoshak as I believe he is one of the big sponsors of a YAML file format that can contain different kinds of information, and I believe his idea also covered rules.

I definitely want rules to be included as part of #3666, and I would very much like the ability to define, load, and use rule templates as well.

My point is that you want to define a new YAML file format dedicated to rules and rule templates, and I am just not sure it should be done outside our YAML model repository. The syntax should also be discussed so that it is as user-friendly as possible. I see in your PR that you aligned it with the JSON object.
@rkoshak: what is your feeling about that?

I agree. I think we have an opportunity to take advantage of both PRs and come to a unified solution.

I think the main problem here is that the two efforts have been going on in parallel without coordination. I don't think we necessarily need to wait for one to finish before working on the other, but I think more coordination is needed to come out with a unified approach.

@Nadahar (Contributor, Author) commented Apr 29, 2025

Anything done on this PR needs to be reconciled with what is going on in #3666, because we don't want to inadvertently introduce multiple YAML approaches.

I get that, I'm just not sure I understand exactly how to "reconcile" it. #3666 is about a new format, based on YAML, if I understand it correctly, while this PR merely converts YAML to JSON - just like what is already done with rule templates. I think there should be parity between rules and rule templates as well.

Are there any examples of the formats that have already been implemented by #3666, so that I can get a better understanding of the idea?

The current implementation for rule templates doesn't really "bind" much, since rule templates as a whole are undocumented. As such, very few know that this even exists or how to use it. #3666 could probably simply replace the YAML parsing for rule templates without actually "breaking" anything out there. The question is how far into the future this is.

It could be argued that the same could be done with this. Merge it now with the "raw mapping" and then remove the YAML parsing when the result of #3666 is ready. As long as it's not documented, it won't see much use anyway, but it will at least be "usable" without having to do YAML -> JSON using an external converter for any change to the rules.

@Nadahar (Contributor, Author) commented Apr 29, 2025

These are some of the test files I've used; it's the same two rules in JSON and YAML. See how utterly useless JSON is for scripts...

JSON:

[
  {
    "actions": [
      {
        "configuration": {
          "script": "// Version 1.0\nvar {TimerMgr, helpers} = require('openhab_rules_tools');\nconsole.loggerName = 'org.openhab.automation.rules_tools.TimeStateMachine';\n//osgi.getService('org.apache.karaf.log.core.LogService').setLevel(console.loggerName, 'DEBUG');\n\nhelpers.validateLibraries('4.2.0', '2.0.3');\n\nconsole.debug('Starting state machine in ten seconds...');\n\n// Properties\nvar STATE_ITEM  = \"DemoDateTime\";\nvar DT_GROUP    = \"DemoSwitchGroup\";\nvar DAY_TYPES   = ['custom', 'holiday', 'dayset', 'weekend', 'weekday', 'default'];\nvar NAMESPACE   = 'tsm';\nvar USAGE =   'Time Based State Machine Usage:\\n'\n            + 'All date times must be a member of ' + DT_GROUP + '.\\n'\n            + 'Each member of the Group must have ' + NAMESPACE + ' Item metadata of the following format:\\n'\n            + '  .items file: ' + NAMESPACE +'=\"STATE\"[type=\"daytype\", set=\"dayset\", file=\"uri\"]\\n'\n            + \"  UI YAML: use '\" + NAMESPACE + \"' for the namespace and metadata format:\\n\"\n            + '    value: STATE\\n'\n            + '    config:\\n'\n            + '      type: daytype\\n'\n            + '      set: dayset\\n'\n            + '      file: uri\\n'\n            + 'Where \"STATE\" is the state machine state that begins at the time stored in that Item, '\n            + '\"daytype\" is one of \"default\", \"weekday\", \"weekend\", \"dayset\", \"holiday\", or \"custom\". '\n            + 'If \"dayset\" is chosen for the type, the \"set\" property is required indicating the name of the '\n            + 'custom dayset configured in Ephemeris. If \"custom\" is chosen as the type, the \"file\" property '\n            + 'is required and should be the fully qualified path the the Ephemeris XML file with the custom '\n            + 'holidays defined. 
The \"set\" and \"file\" properties are invalid when choosing any of the other '\n            + '\"types\".';\n\n/**\n * Validates the passed in Item has valid NAMESPACE metadata.\n *\n * @param {string} itemName name of the Item to check\n * @throws exception if the metadata doesn't exist or is invalid\n */\nvar validateItemConfig = (itemName) => {\n  const md = items[itemName].getMetadata()[NAMESPACE];\n\n  if(md.value === undefined || md.value === null || md.value === '') {\n    throw itemName + ' has malformed ' + NAMESPACE + ' metadata, no value found!';\n  }\n\n  const dayType = md.configuration['type'];\n  if(!dayType) {\n    throw itemName + ' has malformed ' + NAMESPACE + ' metadata, required \"type\" property is not found!';\n  }\n\n  if(dayType == 'dayset' && !md.configuration['set']) {\n    throw itemName + ' has malformed ' + NAMESPACE + ' metadata, type is \"dayset\" but required \"set\" property is not found!';\n  }\n\n  if(dayType == 'custom' && !md.configuration['file']) {\n    throw itemName + ' has malformed ' + NAMESPACE + ' metadata, type is \"custom\" but required \"file\" property is not found!';\n  }\n\n  if(!items[itemName].type.startsWith('DateTime')) {\n    throw itemName + ' is not a DateTime Item!';\n  }\n\n  if(items[itemName].isUninitialized) {\n    throw itemName + \" is not initialized!: \" + items[itemName].state;\n  }\n\n  console.debug(itemName+ ' is valid');\n};\n\n/**\n * Return all members of the DT_GROUP that has a \"type\" metadata configuration property that\n * matches the passed in type.\n *\n * @param {string} type the day type defined in the metadata we want to get the Items for\n * @returns {Array} all the Items with the matching type in the metadata\n */\nvar getItemsOfType = (type) => {\n  const allItems = items[DT_GROUP].members;\n  return allItems.filter( item => item.getMetadata()[NAMESPACE].configuration['type'] == type);\n};\n\n/**\n * Returns true if all the Items of the given type have a unique \"state\" value\n * in the metadata.\n *\n * @param {string} the day type\n * @returns {boolean} true if all states are unique, false otherwise\n */\nvar checkUniqueStates = (type) => {\n  const allItems = getItemsOfType(type);\n  const states = new Set(allItems.map(i => { return i.getMetadata()[NAMESPACE].value; }));\n  return !allItems.length || allItems.length == states.size;\n};\n\n/**\n * Check that all Items are configured correctly.\n */\nvar validateAllConfigs = () => {\n  console.debug('Validating Item types, Item metadata, and Group membership');\n\n  // Check that all members of the Group have metadata\n  const itemsWithMD = items[DT_GROUP].members.filter(item => item.getMetadata(NAMESPACE)).length;\n  if(itemsWithMD != items[DT_GROUP].members.length) {\n    const noMdItems = items[DT_GROUP].members.filter(item => !item.getMetadata(NAMESPACE));\n    console.warn('The following Items do not have required ' + NAMESPACE + ' metadata: ' + noMdItems.map(item => item.name).join(', '));\n    return false; // no sense on performing any additional tests\n  }\n\n  // Check each Item's metadata\n  let isGood = helpers.checkGrpAndMetadata(NAMESPACE, DT_GROUP, validateItemConfig, USAGE);\n\n  // Check the state item\n  if(!items[STATE_ITEM]){\n    console.warn('The state Item ' + STATE_ITEM + ' does not exist!');\n    isGood = false;\n  }\n\n  if(!items[STATE_ITEM].type.startsWith('String')) {\n    console.warn('The state Item ' + STATE_ITEM + ' is not a String Item!');\n    isGood = false;\n  }\n\n  // Check to see if we have a default set of 
Items\n  if(!getItemsOfType('default')) {\n    console.warn('There are no \"default\" day type Items defined! Make sure you have all day types covered!');\n    // we'll not invalidate if there are no \"default\" items\n  }\n\n  // Check that each data set has a unique state for each Item\n  DAY_TYPES.forEach(type => {\n    if(!checkUniqueStates(type)) {\n      console.warn('Not all the metadata values for Items of type ' + type + ' are unique!');\n      isGood = false;\n    }\n  })\n\n  // Report if all configs are good or not\n  if(isGood) {\n    console.debug('All ' + NAMESPACE + ' Items are configured correctly');\n  }\n  return isGood;\n};\n\n/**\n * Pull the set of Items for today based on Ephemeris. The Ephemeris hierarchy is\n * - custom\n * - holiday\n * - dayset\n * - weeekend\n * - weekday\n * - default\n *\n * If there are no DateTime Items defined for today's type, null is returned.\n */\nvar getTodayItems = () => {\n  // Get all the DateTime Items that might apply to today given what type of day it is\n  // For example, if it's a weekend, there will be no weekday Items pulled. Whether or not\n  // the entry in this dict has an array of Items determines whether today is of that day\n  // type.\n  const startTimes = [\n    { 'type': 'custom',  'times' : getItemsOfType('custom').filter(item => actions.Ephemeris.isBankHoliday(0, item.getMetadata()[NAMESPACE].configuration['file'])) },\n    { 'type': 'holiday', 'times' : (actions.Ephemeris.isBankHoliday()) ? getItemsOfType('holiday') : [] },\n    { 'type': 'dayset',  'times' : getItemsOfType('dayset').filter(item => actions.Ephemeris.isInDayset(items.getMetadata()[NAMESPACE].configuration['set'])) },\n    { 'type': 'weekend', 'times' : (actions.Ephemeris.isWeekend()) ? getItemsOfType('weekend') : [] },\n    { 'type': 'weekday', 'times' : (!actions.Ephemeris.isWeekend()) ? 
getItemsOfType('weekday') : [] },\n    { 'type': 'default', 'times' : getItemsOfType('default') }\n  ];\n\n  // Go through startTimes in order and choose the first one that has a non-empty list of Items\n  const dayStartTimes = startTimes.find(dayset => dayset.times.length);\n\n  if(dayStartTimes === null) {\n    console.warn('No DateTime Items found for today');\n    return null;\n  }\n  else {\n    console.info('Today is a ' + dayStartTimes.type + ' day.');\n    return dayStartTimes.times;\n  }\n};\n\n/**\n * Returns a function called to transition the state machine from one state to the next\n *\n * @param {string} state the new state to transition to\n * @param {function} the function that transitions the state\n */\nvar stateTransitionGenerator = (state) => {\n  return function() {\n    console.info('Transitioning Time State Machine from ' + items[STATE_ITEM].state + ' to ' + state);\n    items[STATE_ITEM].sendCommand(state);\n  }\n}\n\n/**\n * Returns a function that generates the timers for all the passed in startTimes\n *\n * @param {Array} startTimes list of today's state start times\n * @param {timerMgr.TimerMgr} timers collection of timers\n * @returns {function} called to generate the timers to transition between the states\n */\nvar createTimersGenerator = (timers) => {\n  return function() {\n\n    if(validateAllConfigs()) {\n\n      // Cancel the timers, skipping the debounce timer\n      console.debug('Cancelling existing timers');\n      timers.cancelAll();\n\n      // Get the set of Items for today's state machine\n      console.debug(\"Acquiring today's state start times\");\n      const startTimes = getTodayItems();\n\n      // Get the state and start time, sort them ignoring the date, skip the ones that have\n      // already passed and create a timer to transition for the rest.\n      console.debug('Creating timers for times that have not already passed');\n      var mapped = startTimes.map(i => { return { 'state': i.getMetadata()[NAMESPACE].value,\n                                                  'time' : time.toZDT(i.state).toToday() } });\n      mapped.sort((a,b) => {\n               if(a.time.isBefore(b.time)) return -1;\n                else if(a.time.isAfter(b.time)) return 1;\n                else return 0;\n              })\n              .filter(tod => tod.time.isAfter(time.toZDT()))\n              .forEach(tod => {\n                // TODO: see if we can move to rules instead of timers\n                console.debug('Creating timer for ' + tod.state + ' at ' + tod.time);\n                timers.check(tod.state, tod.time.toString(), stateTransitionGenerator(tod.state));\n              });\n\n      // Figure out the current time of day and move to that state if necessary\n      var beforeTimes = mapped.sort((a,b) => {\n                                if(a.time.isAfter(b.time)) return -1;\n                                else if(a.time.isBefore(b.time)) return 1;\n                                else return 0;\n                              })\n                              .filter(tod => tod.time.isBefore(time.toZDT()));\n      if(!beforeTimes.length) {\n        console.debug(\"There is no date time for today before now, we can't know what the current state is, keeping the current time of day state of \" + items[STATE_ITEM].state + \".\");\n      }\n      else {\n        const currState = beforeTimes[0].state\n        const stateItem = items[STATE_ITEM];\n        console.info('The current state is ' + currState);\n        if(stateItem.state != currState) 
stateItem.sendCommand(currState)\n      }\n    }\n    else {\n      console.warn('The config is not valid, cannot proceed!');\n    }\n\n  };\n};\n\nvar timers = cache.private.get('timers', () => TimerMgr());\n\n// Wait a minute after the last time the rule is triggered to make sure all Items are done changing (e.g.\n// Astro Items) before calculating the new state.\ntimers.check('debounce',\n             'PT10S',\n             createTimersGenerator(timers),\n             true,\n             () => { console.debug('Flapping detected, waiting before creating timers for today'); });\n",
          "type": "application/javascript"
        },
        "id": "3",
        "inputs": {
        },
        "type": "script.ScriptAction"
      }
    ],
    "conditions": [
    ],
    "configDescriptions": [
      {
        "context": "item",
        "description": "String Item that holds the current time of day's state.",
        "filterCriteria": [
          {
            "name": "type",
            "value": "String"
          }
        ],
        "label": "Time of Day State Item",
        "name": "timeOfDay",
        "required": true,
        "type": "TEXT"
      },
      {
        "context": "item",
        "description": "Has as members all the DateTime Items that define time of day states.",
        "filterCriteria": [
          {
            "name": "type",
            "value": "Group"
          }
        ],
        "label": "Times of Day Group",
        "name": "timesOfDayGrp",
        "required": true,
        "type": "TEXT"
      },
      {
        "description": "The Item metadata namespace (e.g. \"tsm\").",
        "label": "Time of Day Namespace",
        "name": "namespace",
        "required": true,
        "type": "TEXT"
      }
    ],
    "description": "Creates timers to transition a state Item to a new state at defined times of day.",
    "name": "Time Based State Machine rule",
    "triggers": [
      {
        "configuration": {
          "groupName": "DemoSwitchGroup"
        },
        "id": "1",
        "type": "core.GroupStateChangeTrigger"
      },
      {
        "configuration": {
          "startlevel": 100
        },
        "id": "2",
        "type": "core.SystemStartlevelTrigger"
      },
      {
        "configuration": {
          "time": "00:05"
        },
        "id": "4",
        "type": "timer.TimeOfDayTrigger"
      }
    ],
    "uid": "rules_tools:tsm4"
  },
  {
    "actions": [
      {
        "configuration": {
          "script": "var from = parseFloat(oldState.toString().split(' ')[0]);\nvar to = parseFloat(newState.toString().split(' ')[0]);\n\nprint(from + '>' + to);\n\nif (to < 2 && from >= 2) {\n  events.sendCommand('DemoSwitch', 'LSELECT');\n}\n",
          "type": "application/javascript"
        },
        "id": "2",
        "inputs": {
        },
        "type": "script.ScriptAction"
      }
    ],
    "conditions": [
    ],
    "configDescriptions": [
      {
        "context": "item",
        "description": "Item that holds the power (in watts) of the washing machine. Can be a quantity type (Number:Power).",
        "label": "Power Item",
        "name": "powerItem",
        "required": true,
        "type": "TEXT"
      },
      {
        "defaultValue": 2,
        "description": "When the power measurement was at or above the threshold and crosses below it, trigger the alert.",
        "label": "Threshold",
        "name": "threshold",
        "required": true,
        "type": "DECIMAL"
      },
      {
        "context": "item",
        "description": "Item to send a command to when the measured power gets below the threshold. For instance, a Hue light advanced Alert channel.",
        "label": "Alert Item",
        "name": "alertItem",
        "required": true,
        "type": "TEXT"
      },
      {
        "defaultValue": "LSELECT",
        "description": "Command to send to the alert item (for an item linked to a Hue light alert channel, LSELECT will flash the light for a few seconds).",
        "label": "Alert Command",
        "name": "alertCommand",
        "required": true,
        "type": "TEXT"
      }
    ],
    "description": "This will monitor the power consumption of a washing machine and send an alert command when it gets below a threshold, meaning it has finished.",
    "name": "Alert when Washing Machine Finished rule",
    "triggers": [
      {
        "configuration": {
          "itemName": "CurrentPower",
          "state": ""
        },
        "id": "1",
        "type": "core.ItemStateChangeTrigger"
      }
    ],
    "uid": "ysc:washing_machine_alert2.4"
  }
]

YAML:

- actions:
    - configuration:
        script: >
          // Version 1.0

          var {TimerMgr, helpers} = require('openhab_rules_tools');

          console.loggerName =
          'org.openhab.automation.rules_tools.TimeStateMachine';

          //osgi.getService('org.apache.karaf.log.core.LogService').setLevel(console.loggerName,
          'DEBUG');


          helpers.validateLibraries('4.2.0', '2.0.3');


          console.debug('Starting state machine in ten seconds...');


          // Properties

          var STATE_ITEM  = "DemoDateTime";

          var DT_GROUP    = "DemoSwitchGroup";

          var DAY_TYPES   = ['custom', 'holiday', 'dayset', 'weekend',
          'weekday', 'default'];

          var NAMESPACE   = 'tsm';

          var USAGE =   'Time Based State Machine Usage:\n'
                      + 'All date times must be a member of ' + DT_GROUP + '.\n'
                      + 'Each member of the Group must have ' + NAMESPACE + ' Item metadata of the following format:\n'
                      + '  .items file: ' + NAMESPACE +'="STATE"[type="daytype", set="dayset", file="uri"]\n'
                      + "  UI YAML: use '" + NAMESPACE + "' for the namespace and metadata format:\n"
                      + '    value: STATE\n'
                      + '    config:\n'
                      + '      type: daytype\n'
                      + '      set: dayset\n'
                      + '      file: uri\n'
                      + 'Where "STATE" is the state machine state that begins at the time stored in that Item, '
                      + '"daytype" is one of "default", "weekday", "weekend", "dayset", "holiday", or "custom". '
                      + 'If "dayset" is chosen for the type, the "set" property is required indicating the name of the '
                      + 'custom dayset configured in Ephemeris. If "custom" is chosen as the type, the "file" property '
                      + 'is required and should be the fully qualified path the the Ephemeris XML file with the custom '
                      + 'holidays defined. The "set" and "file" properties are invalid when choosing any of the other '
                      + '"types".';

          /**
           * Validates the passed in Item has valid NAMESPACE metadata.
           *
           * @param {string} itemName name of the Item to check
           * @throws exception if the metadata doesn't exist or is invalid
           */
          var validateItemConfig = (itemName) => {
            const md = items[itemName].getMetadata()[NAMESPACE];

            if(md.value === undefined || md.value === null || md.value === '') {
              throw itemName + ' has malformed ' + NAMESPACE + ' metadata, no value found!';
            }

            const dayType = md.configuration['type'];
            if(!dayType) {
              throw itemName + ' has malformed ' + NAMESPACE + ' metadata, required "type" property is not found!';
            }

            if(dayType == 'dayset' && !md.configuration['set']) {
              throw itemName + ' has malformed ' + NAMESPACE + ' metadata, type is "dayset" but required "set" property is not found!';
            }

            if(dayType == 'custom' && !md.configuration['file']) {
              throw itemName + ' has malformed ' + NAMESPACE + ' metadata, type is "custom" but required "file" property is not found!';
            }

            if(!items[itemName].type.startsWith('DateTime')) {
              throw itemName + ' is not a DateTime Item!';
            }

            if(items[itemName].isUninitialized) {
              throw itemName + " is not initialized!: " + items[itemName].state;
            }

            console.debug(itemName+ ' is valid');
          };


          /**
           * Return all members of the DT_GROUP that has a "type" metadata configuration property that
           * matches the passed in type.
           *
           * @param {string} type the day type defined in the metadata we want to get the Items for
           * @returns {Array} all the Items with the matching type in the metadata
           */
          var getItemsOfType = (type) => {
            const allItems = items[DT_GROUP].members;
            return allItems.filter( item => item.getMetadata()[NAMESPACE].configuration['type'] == type);
          };


          /**
           * Returns true if all the Items of the given type have a unique "state" value
           * in the metadata.
           *
           * @param {string} the day type
           * @returns {boolean} true if all states are unique, false otherwise
           */
          var checkUniqueStates = (type) => {
            const allItems = getItemsOfType(type);
            const states = new Set(allItems.map(i => { return i.getMetadata()[NAMESPACE].value; }));
            return !allItems.length || allItems.length == states.size;
          };


          /**
           * Check that all Items are configured correctly.
           */
          var validateAllConfigs = () => {
            console.debug('Validating Item types, Item metadata, and Group membership');

            // Check that all members of the Group have metadata
            const itemsWithMD = items[DT_GROUP].members.filter(item => item.getMetadata(NAMESPACE)).length;
            if(itemsWithMD != items[DT_GROUP].members.length) {
              const noMdItems = items[DT_GROUP].members.filter(item => !item.getMetadata(NAMESPACE));
              console.warn('The following Items do not have required ' + NAMESPACE + ' metadata: ' + noMdItems.map(item => item.name).join(', '));
              return false; // no sense on performing any additional tests
            }

            // Check each Item's metadata
            let isGood = helpers.checkGrpAndMetadata(NAMESPACE, DT_GROUP, validateItemConfig, USAGE);

            // Check the state item
            if(!items[STATE_ITEM]){
              console.warn('The state Item ' + STATE_ITEM + ' does not exist!');
              isGood = false;
            }

            if(!items[STATE_ITEM].type.startsWith('String')) {
              console.warn('The state Item ' + STATE_ITEM + ' is not a String Item!');
              isGood = false;
            }

            // Check to see if we have a default set of Items
            if(!getItemsOfType('default')) {
              console.warn('There are no "default" day type Items defined! Make sure you have all day types covered!');
              // we'll not invalidate if there are no "default" items
            }

            // Check that each data set has a unique state for each Item
            DAY_TYPES.forEach(type => {
              if(!checkUniqueStates(type)) {
                console.warn('Not all the metadata values for Items of type ' + type + ' are unique!');
                isGood = false;
              }
            })

            // Report if all configs are good or not
            if(isGood) {
              console.debug('All ' + NAMESPACE + ' Items are configured correctly');
            }
            return isGood;
          };


          /**
           * Pull the set of Items for today based on Ephemeris. The Ephemeris hierarchy is
           * - custom
           * - holiday
           * - dayset
           * - weeekend
           * - weekday
           * - default
           *
           * If there are no DateTime Items defined for today's type, null is returned.
           */
          var getTodayItems = () => {
            // Get all the DateTime Items that might apply to today given what type of day it is
            // For example, if it's a weekend, there will be no weekday Items pulled. Whether or not
            // the entry in this dict has an array of Items determines whether today is of that day
            // type.
            const startTimes = [
              { 'type': 'custom',  'times' : getItemsOfType('custom').filter(item => actions.Ephemeris.isBankHoliday(0, item.getMetadata()[NAMESPACE].configuration['file'])) },
              { 'type': 'holiday', 'times' : (actions.Ephemeris.isBankHoliday()) ? getItemsOfType('holiday') : [] },
              { 'type': 'dayset',  'times' : getItemsOfType('dayset').filter(item => actions.Ephemeris.isInDayset(items.getMetadata()[NAMESPACE].configuration['set'])) },
              { 'type': 'weekend', 'times' : (actions.Ephemeris.isWeekend()) ? getItemsOfType('weekend') : [] },
              { 'type': 'weekday', 'times' : (!actions.Ephemeris.isWeekend()) ? getItemsOfType('weekday') : [] },
              { 'type': 'default', 'times' : getItemsOfType('default') }
            ];

            // Go through startTimes in order and choose the first one that has a non-empty list of Items
            const dayStartTimes = startTimes.find(dayset => dayset.times.length);

            if(dayStartTimes === null) {
              console.warn('No DateTime Items found for today');
              return null;
            }
            else {
              console.info('Today is a ' + dayStartTimes.type + ' day.');
              return dayStartTimes.times;
            }
          };


          /**
           * Returns a function called to transition the state machine from one state to the next
           *
           * @param {string} state the new state to transition to
           * @param {function} the function that transitions the state
           */
          var stateTransitionGenerator = (state) => {
            return function() {
              console.info('Transitioning Time State Machine from ' + items[STATE_ITEM].state + ' to ' + state);
              items[STATE_ITEM].sendCommand(state);
            }
          }


          /**
           * Returns a function that generates the timers for all the passed in startTimes
           *
           * @param {Array} startTimes list of today's state start times
           * @param {timerMgr.TimerMgr} timers collection of timers
           * @returns {function} called to generate the timers to transition between the states
           */
          var createTimersGenerator = (timers) => {
            return function() {

              if(validateAllConfigs()) {

                // Cancel the timers, skipping the debounce timer
                console.debug('Cancelling existing timers');
                timers.cancelAll();

                // Get the set of Items for today's state machine
                console.debug("Acquiring today's state start times");
                const startTimes = getTodayItems();

                // Get the state and start time, sort them ignoring the date, skip the ones that have
                // already passed and create a timer to transition for the rest.
                console.debug('Creating timers for times that have not already passed');
                var mapped = startTimes.map(i => { return { 'state': i.getMetadata()[NAMESPACE].value,
                                                            'time' : time.toZDT(i.state).toToday() } });
                mapped.sort((a,b) => {
                         if(a.time.isBefore(b.time)) return -1;
                          else if(a.time.isAfter(b.time)) return 1;
                          else return 0;
                        })
                        .filter(tod => tod.time.isAfter(time.toZDT()))
                        .forEach(tod => {
                          // TODO: see if we can move to rules instead of timers
                          console.debug('Creating timer for ' + tod.state + ' at ' + tod.time);
                          timers.check(tod.state, tod.time.toString(), stateTransitionGenerator(tod.state));
                        });

                // Figure out the current time of day and move to that state if necessary
                var beforeTimes = mapped.sort((a,b) => {
                                          if(a.time.isAfter(b.time)) return -1;
                                          else if(a.time.isBefore(b.time)) return 1;
                                          else return 0;
                                        })
                                        .filter(tod => tod.time.isBefore(time.toZDT()));
                if(!beforeTimes.length) {
                  console.debug("There is no date time for today before now, we can't know what the current state is, keeping the current time of day state of " + items[STATE_ITEM].state + ".");
                }
                else {
                  const currState = beforeTimes[0].state
                  const stateItem = items[STATE_ITEM];
                  console.info('The current state is ' + currState);
                  if(stateItem.state != currState) stateItem.sendCommand(currState)
                }
              }
              else {
                console.warn('The config is not valid, cannot proceed!');
              }

            };
          };


          var timers = cache.private.get('timers', () => TimerMgr());


          // Wait a minute after the last time the rule is triggered to make
          sure all Items are done changing (e.g.

          // Astro Items) before calculating the new state.

          timers.check('debounce',
                       'PT10S',
                       createTimersGenerator(timers),
                       true,
                       () => { console.debug('Flapping detected, waiting before creating timers for today'); });
        type: application/javascript
      id: '3'
      inputs: {}
      type: script.ScriptAction
  conditions: []
  configDescriptions:
    - context: item
      description: String Item that holds the current time of day's state.
      filterCriteria:
        - name: type
          value: String
      label: Time of Day State Item
      name: timeOfDay
      required: true
      type: TEXT
    - context: item
      description: Has as members all the DateTime Items that define time of day states.
      filterCriteria:
        - name: type
          value: Group
      label: Times of Day Group
      name: timesOfDayGrp
      required: true
      type: TEXT
    - description: The Item metadata namespace (e.g. "tsm").
      label: Time of Day Namespace
      name: namespace
      required: true
      type: TEXT
  description: >-
    Creates timers to transition a state Item to a new state at defined times of
    day.
  name: Time Based State Machine rule
  triggers:
    - configuration:
        groupName: DemoSwitchGroup
      id: '1'
      type: core.GroupStateChangeTrigger
    - configuration:
        startlevel: 100
      id: '2'
      type: core.SystemStartlevelTrigger
    - configuration:
        time: '00:05'
      id: '4'
      type: timer.TimeOfDayTrigger
  uid: 'rules_tools:tsm5'
- actions:
    - configuration:
        script: |
          var from = parseFloat(oldState.toString().split(' ')[0]);
          var to = parseFloat(newState.toString().split(' ')[0]);

          print(from + '>' + to);

          if (to < 2 && from >= 2) {
            events.sendCommand('DemoSwitch', 'LSELECT');
          }
        type: application/javascript
      id: '2'
      inputs: {}
      type: script.ScriptAction
  conditions: []
  configDescriptions:
    - context: item
      description: >-
        Item that holds the power (in watts) of the washing machine. Can be a
        quantity type (Number:Power).
      label: Power Item
      name: powerItem
      required: true
      type: TEXT
    - defaultValue: 2
      description: >-
        When the power measurement was at or above the threshold and crosses
        below it, trigger the alert.
      label: Threshold
      name: threshold
      required: true
      type: DECIMAL
    - context: item
      description: >-
        Item to send a command to when the measured power gets below the
        threshold. For instance, a Hue light advanced Alert channel.
      label: Alert Item
      name: alertItem
      required: true
      type: TEXT
    - defaultValue: LSELECT
      description: >-
        Command to send to the alert item (for an item linked to a Hue light
        alert channel, LSELECT will flash the light for a few seconds).
      label: Alert Command
      name: alertCommand
      required: true
      type: TEXT
  description: >-
    This will monitor the power consumption of a washing machine and send an
    alert command when it gets below a threshold, meaning it has finished.
  name: Alert when Washing Machine Finished rule
  triggers:
    - configuration:
        itemName: CurrentPower
        state: ''
      id: '1'
      type: core.ItemStateChangeTrigger
  uid: 'ysc:washing_machine_alert2.5'

@rkoshak commented Apr 29, 2025

Are there any examples of the formats that have already been implemented by #3666, so that I can get a better understanding of the idea?

I think Items and Things are done but rules haven't been started yet AFAIK.

The question is how far into the future this is.

The goal is for #3666 to be part of OH 5.0. That seems achievable.

@Nadahar force-pushed the rule-file-provider branch 2 times, most recently from 5e607ab to 2c52e91, on May 3, 2025 12:49
@lolodomo (Contributor) commented May 3, 2025

Are there any examples of the formats that have already been implemented by #3666, so that I can get a better understanding of the idea?

Yes, you can find it in #4691 and #4776.

The question is how far into the future this is.

The goal is for #3666 to be part of OH 5.0. That seems achievable.

Yes, but you will not have everything supported in the YAML format in OH 5.0. The remaining time is too short.

It could be argued that the same could be done with this. Merge it now with the "raw mapping" and then remove the YAML parsing when the result of #3666 is ready. As long as it's not documented, it won't see much use anyway, but it will at least be "usable" without having to do YAML -> JSON using an external converter for any change to the rules.

That is a possible option.

@Nadahar Nadahar force-pushed the rule-file-provider branch from 2c52e91 to fd4cc51 Compare May 3, 2025 13:25
@Nadahar
Contributor Author

Nadahar commented May 3, 2025

Yes, you can find it in #4691 and #4776.

Thanks.

I'm trying to debug YamlModelRepositoryImpl to see how the files are parsed (I find that easier than browsing through related and unrelated code to get my bearings), but I can't get it to start (breakpoints don't trigger, and nothing from it appears in the log). Is it disabled somehow in current main, and if so, how do I "enable" it?

@lolodomo
Contributor

lolodomo commented May 3, 2025

In Eclipse, you need to add org.openhab.core.model.yaml
Probably something to change so that it is included by default.

@Nadahar
Contributor Author

Nadahar commented May 3, 2025

In Eclipse, you need to add org.openhab.core.model.yaml

Add where? To the "demo app" run requirements?

@lolodomo
Contributor

lolodomo commented May 3, 2025

The best is to discuss the syntax first to get an agreement.

@Nadahar
Contributor Author

Nadahar commented May 3, 2025

The best is to discuss the syntax first to get an agreement.

I just want to run/debug to understand how it works, including how the files are parsed.

@lolodomo
Contributor

lolodomo commented May 3, 2025

[screenshot]

@lolodomo
Contributor

lolodomo commented May 3, 2025

I just want to run/debug to understand how it works, including how the files are parsed.

Files are parsed in the YamlModelRepositoryImpl class, and the resulting DTOs for each type of element (things, items, tags, ...) are "distributed" to providers.
So to add a new kind of element, you mainly need to add a DTO class + a provider in charge of handling the DTOs and providing the final elements (things, items, ...) to one of our existing registries.
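
To make the pattern concrete, below is a rough, self-contained sketch of the idea described above: a DTO class tagged with the element name, plus a provider that receives the parsed DTOs and turns them into registry elements. The annotation, listener interface and method names in the sketch are illustrative stand-ins only, not the actual contracts in org.openhab.core.model.yaml, which may differ.

// Illustrative stand-ins for the annotation and listener contract described above.
import java.util.Collection;

@interface YamlElementName {
    String value();
}

interface YamlModelListener<T> {
    Class<T> getElementClass();
    void addedModel(String modelName, Collection<T> elements);
    void removedModel(String modelName, Collection<T> elements);
}

// DTO describing one entry under a hypothetical "rules" element in a YAML model file.
@YamlElementName("rules")
class YamlRuleDTO {
    public String uid;
    public String name;
    public String description;
    // Triggers, conditions and actions would be nested DTOs in a real implementation.
}

// Provider skeleton: receives the parsed DTOs and would map them to Rule objects
// before handing them to the rule registry.
class YamlRuleProvider implements YamlModelListener<YamlRuleDTO> {
    @Override
    public Class<YamlRuleDTO> getElementClass() {
        return YamlRuleDTO.class;
    }

    @Override
    public void addedModel(String modelName, Collection<YamlRuleDTO> elements) {
        // Map each DTO to a Rule and notify registry listeners (omitted here).
    }

    @Override
    public void removedModel(String modelName, Collection<YamlRuleDTO> elements) {
        // Remove the corresponding rules from the provided set (omitted here).
    }
}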

@Nadahar
Contributor Author

Nadahar commented May 3, 2025

Files are parsed in the YamlModelRepositoryImpl class, and the resulting DTOs for each type of element (things, items, tags, ...) are "distributed" to providers.
So to add a new kind of element, you mainly need to add a DTO class + a provider in charge of providing the elements to a registry.

Yes, I've gotten the basic idea of this. There are some details that aren't clear to me, though, which I think I will figure out quickly using the debugger. That is, things like the "watch folder" where these files are picked up, and how the object itself is deserialized: whether it only supports "declared" fields or anything that can be deserialized, and likewise whether there is some kind of default mechanism for unspecified fields.

When it comes to supporting e.g. rules, using @YamlElementName("rules") is in itself quite obvious, so I'm more interested in the details. I usually find that figuring those things out with a debugger is more efficient than asking lots and lots of questions 😉

@Nadahar
Contributor Author

Nadahar commented May 3, 2025

Adding it to "Run Requirements" did it, thanks.

I now see that it's watching the whole of conf for .yaml files, except the automation subfolder. Has this been discussed and decided to be the "best" solution, or is it somewhat preliminary?

I'm thinking that it might lead to some potential trouble if this parser claims "all" YAML files, given that YAML is used for so many things these days. There can be e.g. add-ons that want to use YAML for configuration or similar. Given that this is a "specific format" built upon YAML, it might be better to come up with an entirely new extension and leave .yaml to general use. That way, this parser could "more rightfully" require that the specific structure with element names etc. is followed for these files, instead of throwing a tantrum over files that are for other things.

MIME type aliases let users specify less cryptic, shorter "codes" for scripting language MIME types in the new YAML based format. The full MIME types can still be used.

Signed-off-by: Ravi Nadahar <[email protected]>
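
As a hypothetical illustration of the commit message above: with an alias registered for a scripting language, a script action could reference a short code instead of the full MIME type. The alias js used below is an assumption made for this sketch; the actual alias strings are defined by the PR itself.

# Hypothetical sketch: assumes "js" is registered as an alias for "application/javascript".
actions:
  - configuration:
      script: |
        events.sendCommand('DemoSwitch', 'ON');
      type: js
    id: '1'
    type: script.ScriptAction
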
@Nadahar Nadahar force-pushed the rule-file-provider branch 8 times, most recently from 338db3f to d0afa4d Compare May 17, 2025 21:46
@Nadahar
Contributor Author

Nadahar commented May 17, 2025

I wasn't aware that GitHub logged all the steps of the rebasing, so I'll complete the rebasing to a different branch before pushing here again.

Ravi Nadahar added 2 commits May 18, 2025 05:42
YamlConfigDescriptionParameterDTO is separated from the other rule-related classes because it can also be used by other element types.

Signed-off-by: Ravi Nadahar <[email protected]>
@Nadahar Nadahar force-pushed the rule-file-provider branch from d0afa4d to ab49611 Compare May 19, 2025 19:13
Ravi Nadahar added 2 commits May 20, 2025 03:40
This allows parsing of rule templates using the "ruleTemplates" element in the new YAML based format.

Signed-off-by: Ravi Nadahar <[email protected]>
@Nadahar Nadahar force-pushed the rule-file-provider branch from ab49611 to b120719 Compare May 20, 2025 14:41
@Nadahar Nadahar force-pushed the rule-file-provider branch 2 times, most recently from 7e0b267 to 7d77767 Compare May 26, 2025 02:20
@Nadahar Nadahar marked this pull request as ready for review May 26, 2025 02:38
@Nadahar Nadahar force-pushed the rule-file-provider branch from 7d77767 to beb8c62 Compare May 26, 2025 03:31
@Nadahar
Contributor Author

Nadahar commented May 26, 2025

Whichever of this PR and #4718 is merged first, I will need to make a little tweak to the one merged last to adapt it to the other.

@Nadahar Nadahar marked this pull request as draft May 26, 2025 22:57
@Nadahar
Contributor Author

Nadahar commented May 26, 2025

I put this back to draft while I incorporate the changes from #4718.

@Nadahar Nadahar force-pushed the rule-file-provider branch from beb8c62 to 0b8a375 Compare May 27, 2025 00:42
@Nadahar
Contributor Author

Nadahar commented May 27, 2025

#4718 is integrated.

@Nadahar Nadahar marked this pull request as ready for review May 27, 2025 00:44
@Nadahar
Contributor Author

Nadahar commented May 27, 2025

I have a very hard time understanding how this PR could affect the current test failure:

Error:  Failures: 
Error:    ExpireManagerTest.testIgnoreCommandDoesNotExtendExpiryOnStateUpdate:236 

As far as I can understand, this PR does nothing that even remotely influences this. I'm doing a full build of core locally, with tests and all, and it's already way beyond the point where this build failed. So, either this test is "time sensitive" (which usually means "something isn't thread-safe") and thus a bit random, or something else that I don't understand is going on.

edit: The local build finally completed:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for openHAB Core 5.0.0-SNAPSHOT:
[INFO]
[INFO] openHAB Core ....................................... SUCCESS [  1.066 s]
[INFO] openHAB Core :: BOM ................................ SUCCESS [  0.058 s]
[INFO] openHAB Core :: BOM :: Compile ..................... SUCCESS [  0.918 s]
[INFO] openHAB Core :: BOM :: Compile Model ............... SUCCESS [  1.784 s]
[INFO] openHAB Core :: BOM :: Runtime ..................... SUCCESS [  0.904 s]
[INFO] openHAB Core :: BOM :: Runtime Index ............... SUCCESS [  6.183 s]
[INFO] openHAB Core :: BOM :: Test ........................ SUCCESS [  0.123 s]
[INFO] openHAB Core :: BOM :: Test Index .................. SUCCESS [  0.668 s]
[INFO] openHAB Core :: Bundles ............................ SUCCESS [  3.132 s]
[INFO] openHAB Core :: Bundles :: Core .................... SUCCESS [04:25 min]
[INFO] openHAB Core :: Bundles :: Test .................... SUCCESS [ 20.651 s]
[INFO] openHAB Core :: Bundles :: Configuration Core ...... SUCCESS [ 35.302 s]
[INFO] openHAB Core :: Bundles :: Add-on XML .............. SUCCESS [ 16.689 s]
[INFO] openHAB Core :: Bundles :: Console ................. SUCCESS [ 18.830 s]
[INFO] openHAB Core :: Bundles :: Transformation Service .. SUCCESS [ 29.759 s]
[INFO] openHAB Core :: Bundles :: Semantics ............... SUCCESS [ 28.594 s]
[INFO] openHAB Core :: Bundles :: Thing ................... SUCCESS [ 53.724 s]
[INFO] openHAB Core :: Bundles :: Configuration Discovery . SUCCESS [ 23.313 s]
[INFO] openHAB Core :: Bundles :: Network I/O ............. SUCCESS [ 26.709 s]
[INFO] openHAB Core :: Bundles :: Ephemeris ............... SUCCESS [ 14.352 s]
[INFO] openHAB Core :: Bundles :: Automation .............. SUCCESS [ 40.866 s]
[INFO] openHAB Core :: Bundles :: Magic Bundle ............ SUCCESS [ 14.656 s]
[INFO] openHAB Core :: Bundles :: Add-on Suggestion Service SUCCESS [ 13.580 s]
[INFO] openHAB Core :: Bundles :: REST Interface .......... SUCCESS [ 16.492 s]
[INFO] openHAB Core :: Bundles :: Karaf Integration ....... SUCCESS [ 17.269 s]
[INFO] openHAB Core :: Bundles :: HTTP Interface .......... SUCCESS [ 15.168 s]
[INFO] openHAB Core :: Bundles :: Audio ................... SUCCESS [ 38.593 s]
[INFO] openHAB Core :: Bundles :: Voice ................... SUCCESS [ 31.480 s]
[INFO] openHAB Core :: Bundles :: ID ...................... SUCCESS [ 13.350 s]
[INFO] openHAB Core :: Bundles :: Persistence ............. SUCCESS [ 31.202 s]
[INFO] openHAB Core :: Bundles :: JAAS Authentication ..... SUCCESS [ 13.213 s]
[INFO] openHAB Core :: Bundles :: OAuth2Client ............ SUCCESS [ 19.984 s]
[INFO] openHAB Core :: Bundles :: Binary To JSON converter  SUCCESS [ 16.255 s]
[INFO] openHAB Core :: Bundles :: mDNS Service ............ SUCCESS [ 19.577 s]
[INFO] openHAB Core :: Bundles :: Modbus Transport ........ SUCCESS [ 58.667 s]
[INFO] openHAB Core :: Bundles :: Serial Transport ........ SUCCESS [ 11.918 s]
[INFO] openHAB Core :: Bundles :: Console for OSGi framework Eclipse Equinox SUCCESS [ 10.161 s]
[INFO] openHAB Core :: Bundles :: Console for OSGi Console RFC 147 SUCCESS [ 11.250 s]
[INFO] openHAB Core :: Bundles :: Console for OSGi runtime Karaf SUCCESS [ 17.355 s]
[INFO] openHAB Core :: Bundles :: HTTP Interface Authentication SUCCESS [ 16.442 s]
[INFO] openHAB Core :: Bundles :: Monitor ................. SUCCESS [ 33.334 s]
[INFO] openHAB Core :: Bundles :: Audio REST Interface .... SUCCESS [ 11.849 s]
[INFO] openHAB Core :: Bundles :: Authentication Support for the REST Interface SUCCESS [ 19.558 s]
[INFO] openHAB Core :: Bundles :: REST Interface :: Core .. SUCCESS [ 29.848 s]
[INFO] openHAB Core :: Bundles :: REST UI Logging Module .. SUCCESS [ 11.302 s]
[INFO] openHAB Core :: Bundles :: REST mDNS Announcer ..... SUCCESS [ 11.253 s]
[INFO] openHAB Core :: Bundles :: Model Core .............. SUCCESS [ 28.948 s]
[INFO] openHAB Core :: Bundles :: Model Sitemap ........... SUCCESS [02:04 min]
[INFO] openHAB Core :: Bundles :: UI ...................... SUCCESS [ 55.747 s]
[INFO] openHAB Core :: Bundles :: Sitemap REST Interface .. SUCCESS [01:04 min]
[INFO] openHAB Core :: Bundles :: SSE Interface ........... SUCCESS [ 24.370 s]
[INFO] openHAB Core :: Bundles :: REST Interface :: Swagger 1 SUCCESS [ 14.143 s]
[INFO] openHAB Core :: Bundles :: Transformation REST Interface SUCCESS [ 13.326 s]
[INFO] openHAB Core :: Bundles :: UI REST Interface ....... SUCCESS [ 13.538 s]
[INFO] openHAB Core :: Bundles :: Voice REST Interface .... SUCCESS [ 15.734 s]
[INFO] openHAB Core :: Bundles :: MQTT Transport .......... SUCCESS [ 22.463 s]
[INFO] openHAB Core :: Bundles :: Serial Transport for Java Communications API SUCCESS [ 14.078 s]
[INFO] openHAB Core :: Bundles :: Serial Transport for RXTX SUCCESS [ 15.291 s]
[INFO] openHAB Core :: Bundles :: Serial Transport for RFC2217 SUCCESS [ 15.317 s]
[INFO] openHAB Core :: Bundles :: UPnP Transport .......... SUCCESS [ 22.069 s]
[INFO] openHAB Core :: Bundles :: WebSocket ............... SUCCESS [ 26.435 s]
[INFO] openHAB Core :: Bundles :: SSL Certificate Generator SUCCESS [ 23.726 s]
[INFO] openHAB Core :: Bundles :: IP-based Suggested Add-on Finder SUCCESS [ 15.255 s]
[INFO] openHAB Core :: Bundles :: Configuration mDNS Discovery SUCCESS [ 17.199 s]
[INFO] openHAB Core :: Bundles :: mDNS Suggested Add-on Finder SUCCESS [ 24.098 s]
[INFO] openHAB Core :: Bundles :: Process-based Suggested Add-on Finder SUCCESS [ 18.224 s]
[INFO] openHAB Core :: Bundles :: Configuration SDDP Discovery SUCCESS [ 21.335 s]
[INFO] openHAB Core :: Bundles :: SDDP Suggested Add-on Finder SUCCESS [ 16.490 s]
[INFO] openHAB Core :: Bundles :: uPnP Suggested Add-on Finder SUCCESS [ 17.897 s]
[INFO] openHAB Core :: Bundles :: Configuration USB-Serial Discovery SUCCESS [ 16.158 s]
[INFO] openHAB Core :: Bundles :: USB Suggested Add-on Finder SUCCESS [ 18.707 s]
[INFO] openHAB Core :: Bundles :: Configuration USB-Serial Discovery for Linux using sysfs scanning SUCCESS [ 16.619 s]
[INFO] openHAB Core :: Bundles :: Configuration USB-Serial Discovery using ser2net mDNS scanning SUCCESS [ 26.545 s]
[INFO] openHAB Core :: Bundles :: Configuration UPnP Discovery SUCCESS [ 16.453 s]
[INFO] openHAB Core :: Bundles :: Configuration Dispatcher  SUCCESS [ 19.075 s]
[INFO] openHAB Core :: Bundles :: Configuration Serial .... SUCCESS [ 19.256 s]
[INFO] openHAB Core :: Bundles :: Automation Script Modules SUCCESS [01:01 min]
[INFO] openHAB Core :: Bundles :: Automation Media Modules  SUCCESS [ 18.057 s]
[INFO] openHAB Core :: Bundles :: Automation Script RuleSupport SUCCESS [ 25.542 s]
[INFO] openHAB Core :: Bundles :: Automation REST API ..... SUCCESS [ 12.275 s]
[INFO] openHAB Core :: Bundles :: Model Items ............. SUCCESS [ 48.534 s]
[INFO] openHAB Core :: Bundles :: Model Item IDE .......... SUCCESS [ 25.241 s]
[INFO] openHAB Core :: Bundles :: Model Items Runtime ..... SUCCESS [ 17.184 s]
[INFO] openHAB Core :: Bundles :: Model Persistence ....... SUCCESS [ 52.151 s]
[INFO] openHAB Core :: Bundles :: Model Persistence IDE ... SUCCESS [ 22.766 s]
[INFO] openHAB Core :: Bundles :: Model Script ............ SUCCESS [01:14 min]
[INFO] openHAB Core :: Bundles :: Model Rules ............. SUCCESS [01:00 min]
[INFO] openHAB Core :: Bundles :: Model Rule IDE .......... SUCCESS [ 20.746 s]
[INFO] openHAB Core :: Bundles :: Model Script IDE ........ SUCCESS [ 19.116 s]
[INFO] openHAB Core :: Bundles :: Model Sitemap IDE ....... SUCCESS [ 31.004 s]
[INFO] openHAB Core :: Bundles :: Model Thing ............. SUCCESS [ 46.654 s]
[INFO] openHAB Core :: Bundles :: Model Thing IDE ......... SUCCESS [ 18.916 s]
[INFO] openHAB Core :: Bundles :: Language Server ......... SUCCESS [ 19.015 s]
[INFO] openHAB Core :: Bundles :: Model Persistence Runtime SUCCESS [  9.920 s]
[INFO] openHAB Core :: Bundles :: Model Script Runtime .... SUCCESS [ 13.245 s]
[INFO] openHAB Core :: Bundles :: Model Rules Runtime ..... SUCCESS [ 12.009 s]
[INFO] openHAB Core :: Bundles :: Model Sitemap Runtime ... SUCCESS [  9.495 s]
[INFO] openHAB Core :: Bundles :: Model Thing Runtime ..... SUCCESS [  9.308 s]
[INFO] openHAB Core :: Bundles :: Model YAML .............. SUCCESS [ 27.669 s]
[INFO] openHAB Core :: Bundles :: UI Icon Support ......... SUCCESS [ 11.827 s]
[INFO] openHAB Core :: Bundles :: JSON Storage ............ SUCCESS [ 16.028 s]
[INFO] openHAB Core :: Bundles :: Eclipse Add-on Service .. SUCCESS [  9.390 s]
[INFO] openHAB Core :: Bundles :: Marketplace Add-on Services SUCCESS [ 16.783 s]
[INFO] openHAB Core :: Bundles :: Community Marketplace Add-on Service :: Karaf SUCCESS [  8.173 s]
[INFO] openHAB Core :: BOM :: openHAB Core ................ SUCCESS [  0.050 s]
[INFO] openHAB Core :: BOM :: openHAB Core Index .......... SUCCESS [  9.136 s]
[INFO] openHAB Core :: Bundles :: Configuration USB-Serial Discovery for Windows SUCCESS [ 10.398 s]
[INFO] openHAB Core :: Features ........................... SUCCESS [  0.049 s]
[INFO] openHAB Core :: Features :: Karaf .................. SUCCESS [  0.255 s]
[INFO] openHAB Core :: Features :: Karaf :: Target Platform SUCCESS [ 23.306 s]
[INFO] openHAB Core :: Features :: Karaf :: Core .......... SUCCESS [01:41 min]
[INFO] openHAB Core :: Tools .............................. SUCCESS [  0.051 s]
[INFO] openHAB Core :: Tools :: Archetypes ................ SUCCESS [  0.141 s]
[INFO] openHAB Core :: Tools :: Archetypes :: Binding ..... SUCCESS [  1.381 s]
[INFO] openHAB Core :: Tools :: Internationalization Maven Plugin SUCCESS [ 15.824 s]
[INFO] openHAB Core :: Tools :: Upgrade tool .............. SUCCESS [ 17.287 s]
[INFO] openHAB Core :: Integration Tests .................. SUCCESS [  2.052 s]
[INFO] openHAB Core :: Integration Tests :: Tests ......... SUCCESS [ 35.937 s]
[INFO] openHAB Core :: Integration Tests :: OAuth2Client Tests SUCCESS [ 23.031 s]
[INFO] openHAB Core :: Integration Tests :: Automation Tests SUCCESS [ 31.510 s]
[INFO] openHAB Core :: Integration Tests :: Automation Core Module Tests SUCCESS [ 25.153 s]
[INFO] openHAB Core :: Integration Tests :: Automation Module Timer Tests SUCCESS [ 28.731 s]
[INFO] openHAB Core :: Integration Tests :: Automation Script Module Tests SUCCESS [ 19.403 s]
[INFO] openHAB Core :: Integration Tests :: Automation Integration Tests SUCCESS [ 49.134 s]
[INFO] openHAB Core :: Integration Tests :: Add-on XML Tests SUCCESS [ 23.329 s]
[INFO] openHAB Core :: Integration Tests :: Configuration Core Tests SUCCESS [ 20.602 s]
[INFO] openHAB Core :: Integration Tests :: Configuration mDNS Discovery Tests SUCCESS [ 19.803 s]
[INFO] openHAB Core :: Integration Tests :: Configuration Discovery Tests SUCCESS [ 54.636 s]
[INFO] openHAB Core :: Integration Tests :: Configuration Discovery UsbSerial Tests SUCCESS [ 19.604 s]
[INFO] openHAB Core :: Integration Tests :: Configuration Dispatcher Tests SUCCESS [ 25.509 s]
[INFO] openHAB Core :: Integration Tests :: Ephemeris Tests SUCCESS [ 20.425 s]
[INFO] openHAB Core :: Integration Tests :: Network I/O Tests SUCCESS [ 19.901 s]
[INFO] openHAB Core :: Integration Tests :: IO REST Core Tests SUCCESS [ 54.248 s]
[INFO] openHAB Core :: Integration Tests :: Model Item Tests SUCCESS [ 34.998 s]
[INFO] openHAB Core :: Integration Tests :: Model Rule Tests SUCCESS [ 27.226 s]
[INFO] openHAB Core :: Integration Tests :: Model Script Tests SUCCESS [ 31.684 s]
[INFO] openHAB Core :: Integration Tests :: Model Thing Test Support SUCCESS [ 10.234 s]
[INFO] openHAB Core :: Integration Tests :: Model Thing Tests SUCCESS [ 31.681 s]
[INFO] openHAB Core :: Integration Tests :: JSON Storage Tests SUCCESS [ 16.568 s]
[INFO] openHAB Core :: Integration Tests :: Thing Tests ... SUCCESS [01:57 min]
[INFO] openHAB Core :: Integration Tests :: Voice Test .... SUCCESS [ 42.582 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  58:55 min
[INFO] Finished at: 2025-05-27T03:56:40+02:00
[INFO] ------------------------------------------------------------------------
