02/12/2016
Hello,
With reference to the OPC UA Specification Part 13: Aggregates, I have the following questions:
1. Section A.1.1, Example Aggregate data – Historian 1, contains the following raw input sample, with the configuration TreatUncertainAsBad = False:
12:01:10 | 70 | Raw, Uncertain
However, aggregates such as Average, Minimum, Maximum and Count ignore this sample when computing within the corresponding aggregate interval.
Could you please explain why this raw data sample is ignored?
02/24/2014
The definitions of Average, Minimum and Maximum explicitly state that only values with Good status are used. Count explicitly states that non-Good values are not included in the count. The value is not Bad, since TreatUncertainAsBad = False, but that does not mean it is Good – it is Uncertain. Some aggregates only skip Bad values, TimeAverage for example.
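A minimal sketch of this filtering rule in Python, assuming a simple in-memory representation of samples (the Sample class and StatusClass enum are illustrative, not the OPC UA API):

```python
from dataclasses import dataclass
from enum import Enum

class StatusClass(Enum):
    GOOD = "Good"
    UNCERTAIN = "Uncertain"
    BAD = "Bad"

@dataclass
class Sample:
    timestamp: str
    value: float
    status: StatusClass

def good_only(samples):
    """Average, Minimum, Maximum and Count use only Good samples."""
    return [s for s in samples if s.status is StatusClass.GOOD]

def non_bad(samples):
    """Aggregates such as TimeAverage skip only Bad samples,
    so an Uncertain sample still takes part in the computation."""
    return [s for s in samples if s.status is not StatusClass.BAD]

samples = [Sample("12:01:10", 70, StatusClass.UNCERTAIN)]
print(len(good_only(samples)))  # 0 -> Average/Min/Max/Count ignore the sample
print(len(non_bad(samples)))    # 1 -> TimeAverage would still consider it
```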
Paul Hunkar - DSInteroperability
02/12/2016
Hello Paul,
Thank you for your response.
If we receive a raw data sample with Uncertain quality, then:
If TreatUncertainAsBad = False, the sample is considered neither Bad nor Good, but Uncertain.
If TreatUncertainAsBad = True, the sample is considered Bad, not Good.
To summarize – an Uncertain data sample can never become a Good sample because of the TreatUncertainAsBad parameter value; it either remains Uncertain or becomes Bad.
Could you please correct me if I am wrong here?
02/24/2014
Yes, you are correct: it is still Uncertain for TreatUncertainAsBad = False, but for some aggregates this means it will be used in the computation of the aggregate. This depends on the definition of the aggregate – some only exclude Bad values – so the definition of each aggregate has to be looked at.
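A sketch of the mapping you describe, again in illustrative Python (the effective_status function is an assumption for clarity, not part of the specification):

```python
from enum import Enum

class StatusClass(Enum):
    GOOD = "Good"
    UNCERTAIN = "Uncertain"
    BAD = "Bad"

def effective_status(raw_status, treat_uncertain_as_bad):
    """Map a raw sample's status class to the status the aggregator uses.
    Uncertain is demoted to Bad only when TreatUncertainAsBad is True;
    it is never promoted to Good."""
    if raw_status is StatusClass.UNCERTAIN and treat_uncertain_as_bad:
        return StatusClass.BAD
    return raw_status

print(effective_status(StatusClass.UNCERTAIN, False))  # StatusClass.UNCERTAIN
print(effective_status(StatusClass.UNCERTAIN, True))   # StatusClass.BAD
```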
Paul Hunkar - DSInteroperability
02/12/2016
Hello,
I have encountered an ambiguous scenario with the use of TreatUncertainAsBad in the Count aggregate with respect to the Historian 1 input data.
The Count aggregate definition says to consider only Good samples.
The Historian 1 settings specify TreatUncertainAsBad = False.
For the interval from 12:01:04 to 12:01:20 we have only one sample (of Uncertain quality), specified below:
12:01:10 | 70 | Raw, Uncertain | ANNOTATION: Technician_1, Jan-02-2012 8:00:00, Value flagged as questionable
The computed aggregate result for this interval is 0:
12:01:04.000 | 0 | UncertainDataSubNormal, Calculated
My questions here are:
How should the count = 0 be interpreted?
Why is the result not Bad_NoData instead of 0, as it is for the interval from 12:00:32.000 to 12:00:48.000, which contains only one Bad sample?
Regards
Samba
The interval at 12:01:04.000 should have a value of 0 and a status of UncertainDataSubNormal, Calculated.
The value is 0 because there is no Good data in the interval, which follows directly from the definition of the Count aggregate.
The status is UncertainDataSubNormal, Calculated and not Bad because the interval contains an Uncertain entry, which is not Bad. The interval at 12:00:32.000 is Bad because it contains only Bad entries. Section 5.4.3.2.1 explains how an interval with PercentBad = 100% produces the Bad status code, and how an interval that is not 100% bad gets the Uncertain status.
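A sketch of this rule for the Count aggregate, reusing the illustrative Python types from the earlier snippets (count_aggregate and its status handling are a simplification of 5.4.3.2.1, not the normative algorithm):

```python
from dataclasses import dataclass
from enum import Enum

class StatusClass(Enum):
    GOOD = "Good"
    UNCERTAIN = "Uncertain"
    BAD = "Bad"

@dataclass
class Sample:
    timestamp: str
    value: float
    status: StatusClass

def count_aggregate(samples):
    """Count only Good samples; derive the interval status from the
    mix of sample statuses (simplified from 5.4.3.2.1)."""
    if not samples or all(s.status is StatusClass.BAD for s in samples):
        # 100% Bad (or empty) interval: no value, Bad status.
        return None, "Bad_NoData"
    value = sum(1 for s in samples if s.status is StatusClass.GOOD)
    if all(s.status is StatusClass.GOOD for s in samples):
        return value, "Good, Calculated"
    # Non-Bad but not all Good: a value is still produced,
    # but the interval status is Uncertain.
    return value, "UncertainDataSubNormal, Calculated"

# 12:01:04 - 12:01:20: one Uncertain sample -> (0, Uncertain status)
print(count_aggregate([Sample("12:01:10", 70, StatusClass.UNCERTAIN)]))
# 12:00:32 - 12:00:48: one Bad sample -> (None, "Bad_NoData")
print(count_aggregate([Sample("12:00:36", 0, StatusClass.BAD)]))
```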
Regards,
Rod
Rod Stein, Manager of Technology, Matrikon OPC – http://www.matrikonopc.com