Network layout suggestions

Brian123

Good afternoon everyone,

I am thinking about a new physical network layout for our plant. I have attached a diagram of our floor plan. The five bays are 40'x60' each. The current network is as follows:

- Everything labeled "Draw #" and "Gen #" is currently on a common RS485 serial link that runs to our current SCADA server.
- Everything labeled "L#", "Dow", and "Surface Comb" has an individual serial link to the SCADA server.
- The Vacuum Furnace has an HMI, PLC, and unmanaged switch inside its control cabinet, plus an Ethernet link (on a dedicated card) directly from the SCADA server.
- A separate Ethernet card connects the SCADA server to the office network.

Each piece of equipment is a standalone part of our operation. None of the pieces talk to each other or need to share data.

We also have WiFi access points located at the entrance to the Server Room and Tool Room. These do a fair job of covering the whole building, with the exception of the office area near the bathrooms. The WiFi access points are connected to the office network.

With the backstory out of the way, I am considering installing new infrastructure because we are adding a new piece of equipment in the far right bay, and it has caused me to start comparing the current state of affairs to the future. The new piece will have a PLC and HMI connected via Ethernet, similar to the vacuum furnace. I would like to connect them to the SCADA system over Ethernet as well, but I don't currently have any network backbone in that area. Also, down the road, I will likely be upgrading the controls in the existing pieces of equipment, and those links will likely be best on Ethernet.

Finally, sometime in the future I would like to upgrade our SCADA server itself. Currently, it is just a desktop Dell computer with old SCADA software that doesn't really do all the things I would like it to do. I would like to eventually transition to Ignition. The office network is currently served by a decently specced Dell server running VMware ESXi. It has two independent network cards and is expandable enough to cover the hardware needs of running Ignition, I think. Other thoughts for the future include a VOIP phone system and IP-based security cameras.

What I am considering doing is installing an enclosure in each bay with a managed switch inside. I would run an Ethernet line between each of them and another managed switch in the server room. (Another possibility is a trunk link that daisy-chains the bay switches together, possibly with a link to the server room at both ends for redundancy.) Then, any piece of equipment that needs to be added to the SCADA Ethernet network can connect to the enclosure in its bay. If we need to add additional workstations for SCADA access, VOIP phones, cameras, or whatever, we can connect them to the enclosure in that bay as well (using VLANs to separate traffic as needed).

I'm thinking this should be a decent layout. I'm still a bit fuzzy on where routers would be necessary, and on whether each machine should have a managed switch to handle internal PLC-HMI traffic.

If it makes a difference, we are mostly an Automation Direct shop. The older controls that we will be replacing are Honeywell and Eurotherm stuff. Everything right now talks Modbus/RTU (or Modbus/TCP in the case of the vacuum furnace).
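
For reference, the Modbus/TCP link to the vacuum furnace is about as simple as Ethernet comms get. Here is a little standard-library Python sketch of the kind of poll the SCADA side does; the IP address, unit ID, and register addresses are just placeholders, not the furnace's real values:

```python
import socket
import struct

def read_holding_registers(ip, unit_id, start_addr, count, port=502, timeout=3.0):
    # MBAP header: transaction id, protocol id (always 0), remaining byte count, unit id
    # PDU: function code 0x03 (read holding registers), start address, register count
    request = struct.pack(">HHHBBHH", 1, 0, 6, unit_id, 3, start_addr, count)
    with socket.create_connection((ip, port), timeout=timeout) as sock:
        sock.sendall(request)
        header = _recv_exact(sock, 7)                # response MBAP header
        _, _, length, _ = struct.unpack(">HHHB", header)
        pdu = _recv_exact(sock, length - 1)          # function code + data
        if pdu[0] & 0x80:                            # exception response from the device
            raise IOError("Modbus exception code %d" % pdu[1])
        byte_count = pdu[1]
        return list(struct.unpack(">%dH" % (byte_count // 2), pdu[2:2 + byte_count]))

def _recv_exact(sock, n):
    # TCP may deliver the response in pieces; keep reading until we have n bytes
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed mid-response")
        buf += chunk
    return buf

if __name__ == "__main__":
    # Placeholder IP, unit ID, and register range -- swap in the real values
    print(read_holding_registers("192.168.10.20", 1, 0, 4))
```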

Questions and comments are greatly appreciated. I'll take brand recommendations into consideration, but as with all small businesses, I will need to take cost into account as well.

Thanks,
Brian

shop layout.jpg
 
Here's what I would recommend (most of it is similar to what you are already considering).

Backbone:
A managed switch in every bay and one in the server room, but I would consider running your VOIP phones and cameras on different switches altogether from the ones used for your controls. If you must have them on the same switches, VLANs are the way to go, but you will want a Gigabit link connecting your backbone switches.

If you are able to home run all the bay switches to the server room switch, I would not bother adding the complexity of a redundant ring since it won't add anything to your network reliability in that particular configuration (with the exception I will explain below).

Actually, if you decide to physically isolate your controls network from your VOIP phone and camera network AND you can home run all your bay switches to the server room switch, then I would keep things simpler / cheaper and use unmanaged switches for your entire controls network.

Individual machines:
I would use a 5- or 8-port 10/100BaseT unmanaged switch inside your individual machine control cabinets and home run those switches to the respective bay switch. Don't bother with managed; I don't see how it would help you any.

Equipment:
On the unmanaged side, I've worked with Automation Direct, N-Tron (Red Lion), Moxa, and Advantech switches, and they are all fine. If you choose to use unmanaged switches in the bays, I would recommend at least a 16-port switch. N-Tron has a good one (the 116TX).

On the managed side, I've only worked with N-Tron, and I believe they would have all the features you mentioned you might need. In particular, I have been pleased with the N-Tron 7026TX-AC, which has 24 10/100BaseT ports and an optional two fiber or copper Gigabit ports. It is a rack-mount unit and it runs on AC power.
Otherwise, you might have to dig a little between the brands to find a switch with the best configuration for your needs (enough 10/100 ports for each bay's needs plus the Gigabit ports needed for your backbone).

Something else to consider:
If you decide to run your VOIP phones and cameras on the same physical network as your controls AND you go with Gigabit ports between your backbone switches, then it might be more cost effective to connect them in a redundant loop so none of your backbone switches need more than 2 Gigabit ports.

Hope this helps.
 
Sounds like you wish to run a converged network.

I have done it both ways, and a converged network is my least favorite, but it is the cheapest way.

But with a central SCADA system for the plant, I would make it an isolated physical LAN with its own switches, etc. That way your plant does not go down if IT makes a boo-boo.

In a converged network, at a minimum you will want managed switches and a separate VLAN each for SCADA, VOIP, cameras, servers, and PCs.
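
If it helps with planning, here is a rough Python (standard library) sketch of what one subnet per VLAN could look like. The VLAN IDs and the 10.10.0.0/16 range are made up; use whatever fits your existing office addressing:

```python
# One /24 subnet per VLAN so SCADA, VOIP, camera, server, and PC traffic
# stay in separate address ranges. All numbers here are placeholders.
import ipaddress

parent = ipaddress.ip_network("10.10.0.0/16")      # placeholder parent range
subnets = parent.subnets(new_prefix=24)            # carve it into /24 blocks

plan = {
    "SCADA / controls": 10,
    "VOIP phones":      20,
    "Cameras":          30,
    "Servers":          40,
    "Office PCs":       50,
}

for (name, vlan_id), subnet in zip(plan.items(), subnets):
    hosts = list(subnet.hosts())
    print("VLAN %-3d %-18s %s  gateway %s  usable %s - %s"
          % (vlan_id, name, subnet, hosts[0], hosts[1], hosts[-1]))
```

However you end up routing between them (a Layer 3 switch in the server room or a small router), keeping one subnet per VLAN makes the rules between controls traffic and office traffic easy to reason about.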

I also would not run my SCADA system on the same ESXi host as everything else. You mention a single ESXi host, so what happens when it goes down? Ideally you would have two hosts just for the standard business apps, in case one fails.

I would virtualize the SCADA, as that's a very good idea; I just would not rely on running it on a single host alongside other business-related VMs.

Even with two business ESXi hosts I would keep the SCADA system separate, but with two business hosts you could run the SCADA there and be fairly safe.

In the end it's all about risk, cost, and downtime. In many places the cost of downtime and the probability of hardware failure, IT mistakes, etc. will justify the cost of a physically separate network (wiring, switching, servers, etc.), and in many places it will not.

It all depends on your business needs. This may be a decision for the entire organization to make as a whole, with all the major players involved.
 
Thanks for the responses, guys.

I agree with both of you that running separate physical networks would be the ideal route, but I don't think I can justify it at this point.

We have had our SCADA system in place for almost ten years now. Originally, it started as a 'hey, I know a used equipment dealer that has a license for a low-end SCADA software he's willing to let go for super cheap. Let's get that and a cheap computer and we can see equipment status without walking over to it.' While it has become a valuable tool over the years, if it fell over tomorrow, the company would survive. People would grumble and we would lose a couple of nice features, but we would get along.

The business applications are in a similar boat. While I would love to have redundant servers with vMotion and FT to fail over seamlessly, I can't currently justify the cost. Currently, the shop floor runs on a paper-based system, with the billing being done in a commercial accounting package. If the mainline business server went down, we could probably live for a week or two without it.

In the future, as we become more dependent on a SCADA system (such as a major transition to Ignition, with it handling some or all of the paper-based system), I will definitely be pushing for redundant servers. At that point, if the VOIP phones, cameras, or PCs are causing problems on the converged network, I will still have the option to run another line out to the enclosures and add switches to separate the networks. As for IT mistakes, well, I will only have myself to blame :). We have about a dozen employees, and I'm the only guy who handles the 'computer stuff'.

Thanks again for your input, I really do appreciate it.
Brian
 
Nothing wrong with doing a converged network and a shared server if that's what fits your business needs best, and it sounds like it does.

In that case, depending on port count (sounds small), I would go with industrial managed switches on the floor and IT-grade managed switches in the office areas and such.

I would select models with POE on all downlink ports to power phones and cameras easily in the future.

Running one server, make sure you have good backups of your VMs.

Tools such as Veeam (http://www.veeam.com/vmware-esxi-vsphere-virtualization-tools-hyper-v-products.html) and ShadowProtect (http://www.shadowprotect.com/) are good choices.
 
That's a good thought on POE switches. Looks like AD's managed switch offerings don't include that feature. Any favorite brands?

I do have backups of the critical bits for the virtual machines running on my ESXi server, but none of them are actually done at the VM level. Basically, the accounting software generates a backup file and puts it in a directory every day. Then, I have CrashPlan upload the files over the web to a computer I have at home. Other software packages do the same thing.
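
A small check along these lines (just a sketch; the path, file pattern, and thresholds are placeholders for whatever the accounting package really writes) would at least flag it if the nightly file ever stopped showing up before CrashPlan syncs it:

```python
# Sanity check that last night's backup file actually showed up and is
# non-trivial in size before CrashPlan ships it offsite.
import glob
import os
import sys
import time

BACKUP_DIR = r"D:\Backups\Accounting"   # placeholder path
FILE_PATTERN = "*.bak"                  # placeholder file pattern
MAX_AGE_HOURS = 26                      # nightly job plus some slack
MIN_SIZE_BYTES = 1 * 1024 * 1024        # anything smaller is suspicious

files = glob.glob(os.path.join(BACKUP_DIR, FILE_PATTERN))
if not files:
    sys.exit("No backup files found in %s" % BACKUP_DIR)

newest = max(files, key=os.path.getmtime)
age_hours = (time.time() - os.path.getmtime(newest)) / 3600.0
size = os.path.getsize(newest)

if age_hours > MAX_AGE_HOURS or size < MIN_SIZE_BYTES:
    sys.exit("Backup looks stale or too small: %s (%.1f h old, %d bytes)"
             % (newest, age_hours, size))
print("Backup OK: %s (%.1f h old, %d bytes)" % (newest, age_hours, size))
```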

At the very least, I can reinstall the OS and programs from scratch and restore from those offsite files. I also have the SBS server saving a complete image file to a separate area of the hard drive. If whatever takes out the OS doesn't toast the image file, I can just restore that. Come to think of it, I might have that image pushed out to my home server with CrashPlan. I'll have to check on that one.

Thanks for the backup software pointers. I'll have to check into those. One limitation that I ran into when I was setting this stuff up a while ago is that I am just using the free ESXi version. Several of the VM level backup solutions required a paid version of ESXi. I'll check and see if that has changed. Perhaps it is different in version 5+; if so, I might look at upgrading to the new free version.

Thanks again,
Brian
 
