Part 4 states at https://reference.opcfoundatio.....t4/5.12.1/ that a server may support only a limited set of sampling intervals (i.e., not every possible value larger than the minimum sampling interval):
It is expected that Servers will support only a limited set of sampling intervals to optimize their operation. If the exact interval requested by the Client is not supported by the Server, then the Server assigns to the MonitoredItem the most appropriate interval as determined by the Server. It returns this assigned interval to the Client. The Server Capabilities Object defined in OPC 10000-5 identifies the sampling intervals supported by the Server.
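To illustrate the quoted behavior, here is a minimal, purely hypothetical sketch of how a server might revise a requested sampling interval to the nearest member of its supported set. The supported set and the tie-breaking rule are my own assumptions for the example, not something taken from the specification; real servers define their own policy.

```python
# Hypothetical supported set (assumption for illustration only).
SUPPORTED_INTERVALS_MS = [100.0, 250.0, 500.0, 1000.0, 5000.0]

def revise_sampling_interval(requested_ms: float) -> float:
    """Return the supported interval this example server would assign."""
    # Clamp requests below the minimum supported interval.
    if requested_ms <= SUPPORTED_INTERVALS_MS[0]:
        return SUPPORTED_INTERVALS_MS[0]
    # Pick the closest supported interval; on a tie, prefer the slower
    # one (an assumption - the spec leaves this to the server).
    return min(SUPPORTED_INTERVALS_MS,
               key=lambda s: (abs(s - requested_ms), -s))

print(revise_sampling_interval(300.0))  # -> 250.0
print(revise_sampling_interval(50.0))   # -> 100.0
```

The point of the example is only that the revised interval is determined server-side, which is exactly why the client would like to see the supported set exposed somewhere.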
The text points to the Server Capabilities node (ServerCapabilitiesType) as exposing the supported sampling intervals to client applications. But at the type (https://reference.opcfoundatio.....rt5/6.3.2/) there is only a MinSupportedSampleRate variable holding a single scalar minimum value. (Other properties such as the OperationLimits also do not seem to expose a set of sampling intervals.)
It could be done using a server/vendor-specific variable, but then it cannot be used by client applications in a standardized manner.
Is there another standard variable defined containing the set of supported sampling intervals, e.g., an array of Duration values?
That text is out of date.
The capability was removed because it made no sense to provide a server wide setting since the set of available sampling intervals could change for each variable in the address space.
Added a mantis issue:
...since the set of available sampling intervals could change for each variable in the address space.
I had some similar thoughts after posting my initial question:
If no interval set is given (globally or per variable), it is completely hidden from client applications what will happen and which sampling intervals a server can provide for a specific variable. Client developers can only try and handle the revised sampling interval, possibly over several iterations - or hope for a good server (product) manual listing the supported sampling intervals.
To automate it based on a standard, it would be necessary that each variable has its own list of supported intervals (or refers to one, e.g., an entry in a server global list of possible interval sets).
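The "try and handle the revised interval" approach could be sketched roughly as follows. The create_monitored_item function here is only a stand-in for a real OPC UA client call (the CreateMonitoredItems service); it is simulated with a stub server that supports two intervals, so both the function and the supported set are assumptions made for illustration.

```python
def create_monitored_item(requested_ms: float) -> float:
    """Stub standing in for a CreateMonitoredItems call against a
    server that (in this example) only supports 250 ms and 1000 ms."""
    return min((250.0, 1000.0), key=lambda s: abs(s - requested_ms))

def discover_supported_intervals(candidates):
    """Probe candidate intervals and collect the revised values the
    server actually assigns."""
    revised = {create_monitored_item(c) for c in candidates}
    return sorted(revised)

print(discover_supported_intervals([100, 200, 500, 1000, 2000]))
# -> [250.0, 1000.0]
```

Such probing only samples the server's behavior at one moment; it cannot replace a standardized, discoverable list of supported intervals.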
But in general the option to implement several sets of sampling intervals remains, that is good.
Thanks for your quick reply.
If no interval set is given (globally or per variable), it is completely hidden from client applications what will happen and which sampling intervals a server can provide for a specific variable.
It is not clear how clients could do useful optimizations even if this were known, given the report-on-change nature of OPC UA.
Especially since the available intervals not only depend on the variable; they also depend on what else is going on inside the server (i.e., the available scan rates can change as the server load changes).