I was lucky enough to get our team of engineers & specialists a long-awaited 3-day site stand down.
System upgrades were being done to a low-temperature separator gas plant: a new refrigerant gas separator unit, a control system upgrade to DeltaV, a new six-stream gas chromatograph analyzer system, etc. I was responsible for the analyzer system integration. The analyzer results were data-mapped to a new Network Access Unit (NAU), which communicated with the DeltaV over RS-485. The communication was simply not working as it should, so I made comm setting changes and saved the program to flash. I then proceeded to power cycle the NAU so the new settings would take effect. That's when all Hell broke loose... all of a sudden, the very loud sound of the whole gas plant going down. Operators and technicians scrambled to get the plant back up. I looked around and asked, "Did I do that???" Everyone in the room looked at me: "Naw, it couldn't have been you."
After about 15 minutes, operations had the plant back online, and I continued my troubleshooting of the NAU. After making some additional changes to the communication settings, I again power cycled the NAU... and, you guessed it, the plant went down. But this time operations could not recover it, and three plant flares were now playing their part in the plant shutdown. Biggest flaring event I've ever seen: not just gas flaring, but liquefied gas pouring out of the flare systems. Talk about a heat wave, and we were about a mile away.
Fast forward: plant investigators were now onsite to find the cause of the plant shutdown. The investigation led to the UPS cabinet that the new instruments, the DeltaV, and my good ole NAU were powered from. The electrician found some loose neutral wires, which they tightened. A load test was performed on the UPS: 23% load, so good there. Everything seemed to check out OK. Operations were given the OK to start up the plant, which they did without event.
So far I was still in the clear, as no one had pinpointed the issue, other than that the shutdown had something to do with the UPS system. This time around I had techs helping me troubleshoot the comms problem: checking wiring, loop checks, NAU power supply voltages and current draw, etc. It all checked out. I then ran a temporary signal cable to a newly configured DeltaV RS-485 comm port for testing purposes and proceeded to cycle power on the NAU. This time, though, I let everyone in the room know what I was doing... and sure enough, THE PLANT GOES DOWN. D@mn it, not again.
Our project team (seven of us) was told we were in stand down until further notice, but we had to show up onsite every day and twiddle our thumbs while the incident was being investigated. Once we got the OK to continue working, I noticed the NAU had no power, and its power wiring had been completely removed from the UPS panel. I asked the electrician about power for the NAU, and he said, "Get an extension cord and plug it into the wall outlet over there."
To this day the NAU is still powered by that extension cord plugged into the wall outlet... and there is a LOTO on the UPS cabinet, with RED WARNING tape plastered all over it. Oh, and the main cause of the NAU communications problem? The ribbon cable from the NAU mainboard to the DB9 connector was off by one pin. This was supposedly checked and tested by the manufacturer. Geesh, I don't think anyone there involved was willing to tell "the Big Wigs" about the bad ribbon cable.
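For anyone who hasn't chased one of these: a ribbon cable crimped one position off shifts every conductor by one pin, so the link can look electrically alive while never framing a single valid message. Here's a tiny sketch of the mapping error. The pin assignments below are a generic RS-232-style DE-9 layout used purely for illustration; RS-485 DB9 pinouts vary by vendor, and the NAU's actual pinout isn't something I can quote.

```python
# Hypothetical illustration of an off-by-one ribbon-cable crimp.
# Pin assignments are a generic RS-232-style DE-9 layout (illustrative only).
PINOUT = {1: "DCD", 2: "RXD", 3: "TXD", 4: "DTR", 5: "GND",
          6: "DSR", 7: "RTS", 8: "CTS", 9: "RI"}

def shifted(pinout, offset=1):
    """Return the signal each connector pin actually carries when the
    ribbon cable is crimped `offset` positions off."""
    return {pin: pinout.get(pin + offset, "n/c") for pin in pinout}

wrong = shifted(PINOUT)
# The pin the far end expects to be receive data (pin 2) actually carries
# transmit data, and the last conductor falls off the connector entirely.
print(wrong[2])  # -> TXD
print(wrong[9])  # -> n/c
```

No amount of baud-rate or parity fiddling on either end fixes a fault like this, which is why every comm-setting change I made was a dead end.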