
Allow to set severity of failing test #187

Open
Arno500 opened this issue Nov 14, 2023 · 8 comments

Comments

@Arno500
Contributor

Arno500 commented Nov 14, 2023

Hello team,

I am currently writing a Redfish interop profile and was wondering about warnings. Warnings are currently emitted when there is an issue, like an error when fetching a resource.

But I also found cases where you may want the validation to succeed, but still consider that a test gave a warning. For instance, in our situation we have basic Redfish needs (that would end with errors when validating) but also "nice to have" features (like newer or more complete ones) that are not critical.
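To illustrate what I mean (just a sketch of a profile fragment, with property names picked for the example), today I would express both levels with the existing ReadRequirement values, something like:

```json
{
  "Resources": {
    "ComputerSystem": {
      "PropertyRequirements": {
        "PowerState": { "ReadRequirement": "Mandatory" },
        "BootProgress": { "ReadRequirement": "Recommended" }
      }
    }
  }
}
```

What I am missing is a level that lets the overall run succeed while still surfacing the "nice to have" item as a warning when it is absent.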

Would you consider that a good idea? Or is the philosophy of the tool to always fail, leaving the operator to read all the errors and evaluate their severity?

Thanks!

@mraineri
Contributor

The reason we treat that case as a warning is that it doesn't signify the implementation doesn't support the profile; it interferes with our testing since we can't get a resource, but we can move on with other testing. The scope of the interop validator is to verify the requirements spelled out in the profile, and failures reflect the ways the service does not meet the profile. Based on the profile specification, you either meet the profile or you don't; the profile syntax doesn't really have a "warning" end result.

@Arno500
Contributor Author

Arno500 commented Nov 14, 2023

Right, I understand a bit better. But then, in the output, there is not much difference between a failing "Supported" test (still showing PASS) and a "Mandatory" PASS, which makes "Supported" useless without manual inspection of all the results (aside from the clarity it adds inside the profile). Maybe add an "Unsupported" status, then?

The same goes for "IfImplemented", but there you explicitly accept that the property might not be implemented, and if it is, you can still check the values, so a specific type of output might not be necessary.

@mraineri
Contributor

Both of those cases are still requirements in terms of PASS/FAIL criteria and don't give leeway for warnings.

"Supported" indicates the property is required, and the requirement is met as long as the property shows up on at least one resource instance. The usage of this came about because a system might not be fully populated, so resources can show up with a state of "Absent" and thus have a very short list of properties. You don't want to fail a service for not surfacing a property that isn't relevant to the system configuration, but you still need to test that the property is present in cases where the configuration is expected to have it, such as when only half of your DIMMs are populated.

"IfImplemented" recognizes the fact that not all systems are designed the same and that there are underlying hardware differences. You don't want to use a Redfish profile to define hardware requirements; the scope is to test the manageability exposed by the BMC. For example, a user might have requirements to collect fan information in a consistent way, but not every hardware vendor would necessarily have fans as part of the system design. It would be bad practice to require a BMC to report fan resources if none exist, so using "IfImplemented" in this case makes sense. However, we do have a gap in our testing since we don't have a way to inspect these types of requirements to know whether the system under test is expected to have that functionality. At this time, we just treat "IfImplemented" as "Recommended", meaning that if we don't find the property or resource, that's considered a PASS. We do have an open issue about this gap (#105).

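As a rough illustration of how those two levels can appear in a profile (fragment only, with resource and property names chosen just for this example):

```json
{
  "Resources": {
    "Memory": {
      "PropertyRequirements": {
        "CapacityMiB": { "ReadRequirement": "Supported" }
      }
    },
    "Thermal": {
      "ReadRequirement": "IfImplemented",
      "PropertyRequirements": {
        "Fans": { "ReadRequirement": "Mandatory" }
      }
    }
  }
}
```

Here "CapacityMiB" only needs to show up on at least one Memory instance (absent DIMMs can omit it), and the Thermal requirements only apply if the implementation exposes a Thermal resource at all.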
@Arno500
Contributor Author

Arno500 commented Nov 14, 2023

Fair enough, I apparently hadn't grasped the concept of those before. But then there is no workaround to indicate something "optional".
And for "Recommended", what is the use case? Simply allowing the tool to check without failing or logging anything? Wouldn't it make sense to output it differently?

@mraineri
Contributor

"Recommended" is optional, but the author of the profile leans towards "I'd rather see this implemented than not".

Everything that's not mentioned in the profile is truly optional; the profile really is meant to document requirements that a user cannot live without.

Profiles are also not meant to restrict the capabilities of a service. So, a profile cannot forbid a service from implementing something, meaning that they have the flexibility to implement the entire data model, even if the profile's user has no use case for it.
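As a small sketch (profile fragment only, property name chosen for illustration), a profile author who would rather see an indicator LED reported, but can live without it, might list it as "Recommended":

```json
{
  "Resources": {
    "ComputerSystem": {
      "PropertyRequirements": {
        "IndicatorLED": { "ReadRequirement": "Recommended" }
      }
    }
  }
}
```

A property such as "AssetTag" that isn't listed at all is simply not checked, and the service remains free to implement it or not.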

@Arno500
Contributor Author

Arno500 commented Nov 14, 2023

In this case, I think it would still make sense to improve the output to differentiate a "Recommended" property that is not present in the response. Right now it's green and a "PASS" whether it is present or not. It should not generate an error, but the operator would still want to know at a glance whether it was present, without necessarily having to manually check the value or the "DNE" entry.

@jautor
Member

jautor commented Nov 21, 2023

@Arno500 you might want to take a look at the Profile output mode of the Redfish doc-generator https://github.com/DMTF/Redfish-Tools/tree/main/doc-generator

Given a profile document, that tool will produce a "schema guide" with each of the requirements pulled from the profile. That may give you more context for comparing an implementation to the profile.

But I do see your point that the tool doesn't provide a "wishlist" of support for all the non-required items shown in the profile. I think we should consider some support for that - but it would be a different mode of operation, as that output would be geared towards an implementer. The goal of the tool is to VALIDATE an implementation against a profile. But using it to create a "to do" list for an implementation is certainly a useful function.

@Arno500
Contributor Author

Arno500 commented Nov 21, 2023

Thank you @jautor for your input. This is indeed a very interesting tool that you have here, but I am on the "user" side, so it won't be of much help in my case :)

What I had more in mind was giving a shopping list to OEMs, so they can implement the features we need. We could also have two different profiles (a minimal profile with only the required features, and a wishlist). But I really liked the idea of something differentiating between "WE NEED THAT" and "well, you apparently don't have this one, but you know, it would be really awesome, we are gearing towards it in the future, and know that it could impact our capacity to buy your servers".
And all of this in one clear output with failures and wishes (at least that is my vision). So maybe it doesn't need another mode of operation, but it would fit this use case perfectly.
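For example (just a sketch with arbitrary property names), the wishlist profile could list only the "nice to have" items as "Recommended":

```json
{
  "Resources": {
    "ComputerSystem": {
      "PropertyRequirements": {
        "BootProgress": { "ReadRequirement": "Recommended" },
        "GraphicalConsole": { "ReadRequirement": "Recommended" }
      }
    }
  }
}
```

The minimal profile would keep the same structure with only "Mandatory" entries; running the validator against each profile separately then gives the hard failures and the wishlist, even if a single combined report would still be nicer.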
