05/29/2017
Hi, I’m using a client-server scenario with OPC UA. The client runs on a PC, the server on a PLC.
The client creates a session on the server with approximately 14,000 monitored items.
Without an active session, the PLC CPU usage is around 10%.
With a 250 ms samplingInterval, CPU usage increases to 50%.
With a 1000 ms samplingInterval, CPU usage increases to 30%.
Is this normal behaviour?
Can I do something to reduce CPU usage with a 250 ms samplingInterval?
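As a rough illustration of why the shorter interval costs more CPU: the raw sampling workload scales inversely with the sampling interval. A minimal back-of-envelope sketch (the function name is mine, and it assumes every monitored item is actually sampled at its interval with no filtering, which the thread does not confirm):

```python
def samples_per_second(item_count: int, sampling_interval_ms: float) -> float:
    """Total samples the server must take each second for all monitored items."""
    return item_count * (1000.0 / sampling_interval_ms)

items = 14_000
print(samples_per_second(items, 250))   # 56000.0 samples/s at 250 ms
print(samples_per_second(items, 1000))  # 14000.0 samples/s at 1000 ms
```

So going from 1000 ms to 250 ms quadruples the sampling work, which is consistent with the CPU usage rising from 30% to 50% on top of the 10% baseline.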
P.S.
In a scenario with OPC DA, with the OPC server (winCodesys) on a PC and a 200 ms sampling rate, the PLC CPU usage is 20%.
Thanks.
02/24/2014
I think you would need to provide more information to get a reasonable response. What PLC? Who implemented the server? What are the exact parameters for the subscription and publishing (filter, queue size, publishing interval, …)? Are you using a secure connection (encrypting and signing the communication)? One subscription or multiple subscriptions?
You are comparing it to an OPC Classic DA server that is running on a PC and communicating with the PLC, but you don’t describe how that communication occurs: what protocol is used for communicating with the PLC? Is it secure? In the DA case, all of the server processing is done on the PC; only the polling of values happens on the PLC.
All of this can affect performance, and the implementation can be the biggest factor. It is quite possible that the PLC would have higher CPU usage, since it is running the server code itself (the intermediate PC is not needed), but much depends on the server implementation and how it obtains the values. Is it just layered on top of the existing PLC communication stack, or is it embedded in the PLC as the communication stack used for DA communications?
You might want to discuss this with the PLC vendor.
Paul
Paul Hunkar - DSInteroperability