How fast can a Windows SQL client write data to a database?

AlfredoQuintero
Hello.
A semiconductor production machinery maker has inquired about a Windows application that can cyclically read data from an OPC UA server and transfer it to an SQL database. I am not yet sure what type of data or how much, but I think it will be a number of analogue readings from the OPC UA server. This is pretty standard stuff that can be handled by many middleware products with OPC UA client and SQL client capabilities, but the requirement seems unrealistic: the customer wants readings every one millisecond. Before spending effort on non-Windows solutions to crack this problem, I am interested in the opinions of this forum's members with experience in SQL technology, so I can get a reasonable number for the very minimum achievable through Windows and see whether such a reality check can help the customer rethink his expectations. Thanks for reading, and if you can provide further advice I will be very grateful.
 
1 ms is a fantasy. Most PLCs don't even scan the program in 1 ms. And if you're reading the data over Ethernet, 1 ms is impossible.


I have been doing this for 45 years and I have never seen a process that needed less than a 100ms logging rate.


If the PLC is fast enough, you could buffer the data up in memory blocks in the PLC and write some code on the Windows side to grab it and stuff it into SQL, but time stamping would be a problem.



Now, if this is a special application, there are some stand-alone data loggers that can log that fast. They queue the data up and dump it to the SQL database when they get full.
 
Can you consolidate the data in memory and flush to SQL every second or so? Telegraf from InfluxData can do that.
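The consolidate-and-flush idea above can be sketched in a few lines. This is a minimal illustration only, assuming a Python client and an invented `samples(ts, value)` table; in the real application the producer side would be an OPC UA subscription callback and the flush would run on a one-second timer.

```python
import queue
import sqlite3
import time

def flush_batch(con, q):
    """Drain the in-memory queue and write everything in a single transaction."""
    batch = []
    try:
        while True:
            batch.append(q.get_nowait())
    except queue.Empty:
        pass
    if batch:
        # One transaction per flush, not one per sample.
        con.executemany("INSERT INTO samples (ts, value) VALUES (?, ?)", batch)
        con.commit()
    return len(batch)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE samples (ts REAL, value REAL)")

q = queue.Queue()
t0 = time.time()
for i in range(1000):                  # pretend one second of 1 kHz data arrived
    q.put((t0 + i / 1000.0, 42.0))

n = flush_batch(con, q)                # in production, call this on a 1 s timer
```

The point of the sketch is the shape of the solution: the 1 ms work happens only in memory, and the database sees one bulk write per second.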
 
Writing the data in chunks to files, and keeping track of the files in SQL, is the closest I would expect to get.


1 kHz is 86.4 Msamples/d (million samples per day), so even a 64-bit timestamp with every sample comes to two-thirds of a GB per day. Disk space is going to be a bigger problem than database performance, depending on how long you want to archive the data.
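The data-rate figures above are easy to verify with a back-of-the-envelope calculation:

```python
# 1 kHz sampling for one day, 8 bytes (64-bit timestamp) per sample.
samples_per_day = 1000 * 60 * 60 * 24      # 86,400,000 -> 86.4 Msamples/day
timestamp_bytes = samples_per_day * 8
gb_per_day = timestamp_bytes / 1e9         # ~0.69 GB, i.e. two-thirds of a GB
```

Any per-sample value payload (4-8 bytes for a float) roughly doubles that figure.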

It might be possible to use SQLite to do it, starting a new database every day or week. I have an app that writes just under 200GB of R*Tree-indexed star catalog data in about half a day, IIRC.
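The "new database every day" idea can be sketched as follows. This is an assumption-laden illustration, not the app described above: the file-naming scheme, the `samples` schema, and the chunk size are all invented for the example.

```python
import datetime
import os
import sqlite3
import tempfile
import time

def db_name_for_today(base="log"):
    """Hypothetical naming scheme: one SQLite file per calendar day."""
    return f"{base}-{datetime.date.today().isoformat()}.sqlite"

def write_chunk(path, rows):
    """Batch-insert one chunk of samples in a single transaction."""
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, value REAL)")
    with con:                                 # commits the whole chunk at once
        con.executemany("INSERT INTO samples VALUES (?, ?)", rows)
    con.close()

path = os.path.join(tempfile.mkdtemp(), db_name_for_today())
t0 = time.time()
rows = [(t0 + i / 1000.0, float(i)) for i in range(10_000)]
write_chunk(path, rows)                       # 10 s of 1 kHz data in one write

con = sqlite3.connect(path)
count = con.execute("SELECT COUNT(*) FROM samples").fetchone()[0]
```

Rolling to a fresh file each day keeps any single database small and makes archiving or deleting old data a simple file operation.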
 
Thanks very much, gentlemen, for all these insights. The OPC UA server would be a Windows workstation, not a PLC. Communication with the OPC UA client would take place through a local network, and the SQL database would be hosted on the same workstation as the OPC UA client, so all database interactions are local.

Again, thanks for sharing your experience and wisdom.
 
I just had to chime in and give a few things to think about.
Program scan time is not relevant to reading data from the tag / data table in the PLC.
I have seen scan times of less than 0.2 ms for a ladder program of over 1,500 rungs containing many AOIs and function calls.
What's really important is how fast you can read the data in from the I/O. The default update time for most analog input modules is 10 ms; you may be able to shorten that to 1 ms, but you may have to sacrifice the update time of other I/O to do it. You may be able to do an interrupt read on the input, but in all cases you will be limited by the maximum update rate of the input module.
Keep in mind that on the Logix processors there are multiple processors working together to handle the work: multiple processors just for program execution, and multiple processors to handle the I/O communications. The backplane, Ethernet and DeviceNet each have their own processors to handle the communication and PLC memory read/write updates.
There are also separate processors to handle the communication between outside devices and the PLC. All of these have direct access to the memory / tag base.
The real limiting factor would be how fast the software in the PC and the communications in the PLC can process the data request and return the requested data. The only way to really know would be to test it.
But if you are trying to pass the data from an input module to a PC SQL server, there would be no value in polling that data faster than you can get it from the I/O module. So even if you could transfer the data from the PLC memory to the PC SQL server every millisecond, you would just be transferring the exact same data, unchanged, 10 times, using a lot of processor time.
 
Is the OPC UA Server residing on the PC, or is it to be on the PLC ?
If the OPC UA server is on the PLC, it will be significantly slower than on the PC.
If the OPC UA Server is on the PC, it will connect to the PLC with a more direct protocol.

If this was a Siemens S7 PLC, you could use Simatic Net as a Profinet IO Controller or Device. You can get into IRT with such a solution.
Simatic Net can then act as an OPC UA server to pass the data on to the OPC UA client and database.

But I am not sure whether the limitation will be on the OPC UA side or the database side.
I have no experience logging at such fast speed, but I think that 10 ms is a hard lower limit.
I would think it would be better to log a large chunk of data every 100 ms instead of little data every 1 ms.

I concur with what the others have said about the data volume and the need for special analog cards to get the data into the PLC.

I think there must be dedicated data acquisition systems that are better suited.
 
The Mitsubishi iQ-R has both OPC UA and SQL, and I think it would be a good solution for you. The CPU inside is really fast. The biggest limitation will be the speed of the SD card.
 
I have a question, not an answer: is it possible to sample analog inputs that fast? If yes, with which hardware?


https://www.beckhoff.com/en-en/products/i-o/ethercat-terminals/el3xxx-analog-input/
If you filter for Sampling rate / conversion time = >1 .. 10kSps you get 74 results


There is also an option for >10 kSps. You would need to make use of oversampling, so with each I/O update you get, for example, 100 values at once. But this is far beyond the OP's 1 ms requirement.



I agree with the others: the requirement is not to update the SQL table every 1 ms, but to record the history with 1 ms resolution. So you could feasibly (for example) have your PLC record every 1 ms, publish 1000 values at once to OPC UA, and perform one SQL transaction which adds 1000 rows.
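The chunked approach just described can be sketched like this: the PLC publishes a base timestamp plus 1000 values sampled at 1 ms, and the client expands that into one 1000-row transaction. The table and column names here are illustrative assumptions, not from any real system in the thread.

```python
import sqlite3

def insert_chunk(con, t0_ms, values, period_ms=1):
    """Expand one published chunk into rows and insert them in one transaction."""
    rows = [(t0_ms + i * period_ms, v) for i, v in enumerate(values)]
    with con:                                  # single transaction for the chunk
        con.executemany("INSERT INTO samples (ts_ms, value) VALUES (?, ?)", rows)
    return len(rows)

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE samples (ts_ms INTEGER, value REAL)")

# One second of 1 kHz data arrives as a single OPC UA publish (hypothetical).
n = insert_chunk(con, 1_700_000_000_000, [3.14] * 1000)
```

Reconstructing per-sample timestamps from one base time plus a fixed period is what makes the 1 ms resolution survive a once-per-second transfer.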
 
