OPC UA Aggregates- Interpolation for Standard Deviation Sample? | OPC UA Standard | Forum

SambsivaReddy Appireddygari
Member
Forum Posts: 24
Member Since:
02/12/2016
1
02/17/2016 - 00:26

Hello,

With reference to the OPC UA Part 13: Aggregates specification, I have the following questions:

  1. Why is the end boundary sample excluded when computing simple aggregates (Count, Minimum, Maximum, etc.)?
  2. The aggregate result is represented with the start time of the interval. Can we change it to the end time of the interval? Is it mandatory to represent the result with the start of the interval only?
  3. Section 5.4.3.37 (StandardDeviationSample) recommends considering bounding values using the Simple Bounding technique. Why should we compute bounding values at all for the standard deviation?

Paul Hunkar
Cleveland, Ohio, USA
Moderator
Forum Posts: 89
Member Since:
02/24/2014
2
02/18/2016 - 15:00

Some answers,

1) The boundary item needs to be included in only one calculation - i.e. I have a start time of 12:00:00 and an end time of 12:10:00 with 1-minute intervals.  Each minute has a range - 12:00:00 to 12:01:00, 12:01:00 to 12:02:00, and so on.  If a minimum occurred at 12:01:00, you don't want it reported twice - it needs to be in only one of the ranges - so by definition the start bound is included and the end bound is not.

2) The aggregate definition defines what time is to be represented - MinimumActualTime returns the actual time of the minimum.  The time that is displayed cannot be changed from what is defined in the aggregate definition.  The rules do describe how an aggregate behaves if the start and end times are switched (time going forward or backward).
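Point 1 - that each interval is half-open, including its start bound and excluding its end bound - can be sketched in Python. The interval math here is illustrative only, not an OPC UA stack API:

```python
from datetime import datetime, timedelta

def interval_index(timestamp, start, width):
    """Index of the half-open interval [start + i*width, start + (i+1)*width)
    that contains timestamp.  Because intervals are half-open, a value stamped
    exactly on a boundary belongs to the interval it starts, never the one it ends."""
    return int((timestamp - start) / width)

start = datetime(2016, 2, 17, 12, 0, 0)
width = timedelta(minutes=1)

# A minimum occurring exactly at 12:01:00 falls in interval 1
# (12:01:00 to 12:02:00), so it is counted once, not twice.
print(interval_index(datetime(2016, 2, 17, 12, 1, 0), start, width))   # 1
print(interval_index(datetime(2016, 2, 17, 12, 0, 59), start, width))  # 0
```

Because every timestamp maps to exactly one index, no raw value can contribute to two adjacent intervals.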

Paul Hunkar - DSInteroperability

Rod Stein
Canada
Member
Forum Posts: 27
Member Since:
04/01/2014
3
02/18/2016 - 16:08

The end boundary is always excluded from the data of the interval.  The interval is always start time to the last possible unique time before the end time.  The aggregates like Count, Minimum, etc. all use actual Raw data in the interval and no interpolation.  Since the end time is outside the interval it will not be considered.

The aggregates are defined in the specification and cannot change.  There is the option of reversing the start and end times so that the most recent time becomes the start time and it would be the time stamp of the aggregate.  This has other ramifications and does not always calculate out to the same value as if it were not reversed (the start time included and end time not included can make a difference).

You are correct: bounds should not be calculated when computing the standard deviation (or variance) aggregates.  The examples themselves are correct, but Tables 48 - 51 should have the Use Bounds table field set to None.  I will enter an issue in the OPC Foundation issue tracker.
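A minimal sketch of the behavior described above - StandardDeviationSample computed from the raw Good values inside the interval only, with no interpolated or Simple Bounding values and an n - 1 divisor.  The function name is illustrative, not the spec's normative pseudocode:

```python
import math

def std_dev_sample(raw_good_values):
    """Sample standard deviation over the raw Good values in one interval.
    No bounding values are added to the input."""
    n = len(raw_good_values)
    if n < 2:
        return None  # the sample form divides by n - 1, so it needs >= 2 values
    avg = sum(raw_good_values) / n
    return math.sqrt(sum((x - avg) ** 2 for x in raw_good_values) / (n - 1))

print(std_dev_sample([10.0, 20.0, 30.0]))  # 10.0
```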

Rod Stein               Manager of Technology Matrikon OPC               http://www.matrikonopc.com

SambsivaReddy Appireddygari
Member
Forum Posts: 24
Member Since:
02/12/2016
5
02/22/2016 - 03:12

Hello,

 

      5.4.3.37   StandardDeviationSample

The StandardDeviationSample Aggregate defined in Table 48 uses the formula:

    StandardDeviationSample = sqrt( Σ (X − Avg(X))² / (n − 1) )

where X is each Good raw value in the interval, Avg(X) is the average of the Good raw values, and n is the number of Good raw values in the interval.

With respect to the above section, I have the following observation:

1. Only Good raw samples shall be processed. However, the output of Historian 1 (Section A.35.2) considers the uncertain data point in the interval below:

12:01:00.000 7.071 UncertainDataSubNormal, Calculated

Could you please clarify this inconsistency?

Rod Stein
Canada
Member
Forum Posts: 27
Member Since:
04/01/2014
6
02/22/2016 - 16:47

Once again you are correct, and this is an issue.  The aggregate should be calculated using only Good values, not Uncertain values.  I have entered an issue for this.  The same applies to the Variance aggregate for Historian 1.

http://opcfoundation-onlineapp.....hp?id=3313
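The fix described above amounts to a quality filter applied before the aggregate runs. The tuple layout, status strings, and the two Good sample values below are illustrative, not the OPC UA stack's actual API or the spec's example data; only the 12:01:00.000 / 7.071 / UncertainDataSubNormal point comes from the Historian 1 example:

```python
# Raw history around the 12:01:00 interval (first two points are made up).
raw_points = [
    ("12:00:00.000", 5.0,   "Good"),
    ("12:00:30.000", 15.0,  "Good"),
    ("12:01:00.000", 7.071, "UncertainDataSubNormal"),  # must be excluded
]

# Only Good raw values feed StandardDeviationSample / Variance.
good_values = [v for (_, v, status) in raw_points if status == "Good"]
print(good_values)  # [5.0, 15.0]
```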

Rod Stein               Manager of Technology Matrikon OPC               http://www.matrikonopc.com
