PLC for Dummies

drew2

Member
Join Date
Sep 2004
Location
Mississippi
Posts
13
Hello everyone, it is a pleasure to be a member. I am a first-semester PLC student learning on RSLogix. Is there a "PLC for Dummies" book, or some book that just shows the basics? Thanks in advance.
 
Click on either "Learn PLCs" or "Online Tutorial" at the top of this page to check out Phil's free offering. Then buy the book. It helps support the forum. If there is anything in the tutorial or the book that you don't understand, post your questions here. We'll try to answer them for you.
 
PLCs for Dummies???

It can't be done. At least, it can't be done from a functional point of view.

The truly generic functional portion of PLC programming can be described in about 3 or 4 pages (depending on type size). However, those pages will describe only a very small portion of what any particular PLC can do.

As long as there are so many goofy variations between the various manufacturers, the idea of a "PLC for Dummies" is rather ludicrous.

Some PLCs have a real bit-shift function... some don't.

Some PLCs do their normal math with Integers... some don't.

Some PLCs have real math... some don't.

Some PLCs have Element-based Indirect Addressing... some don't.

Some PLCs have Element-based Indexed Addressing... some don't.

Some PLCs have only Count-Up Counters... some don't.

Some PLCs have only Count-Down Timers... some don't.

If one is content with simple combinations of AND and OR logic, then fine, there can be a "PLCs for Dummies" (3 or 4 pages long).
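To illustrate how small that generic core really is, here is a minimal sketch in C of the part every brand shares: read an input image, solve AND/OR rung logic, write an output image. The bit assignments and the seal-in rung are made up for the example; this shows the concept only, not any particular PLC's instruction set.

Code:
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical process images: 16 discrete inputs, 16 discrete outputs. */
static uint16_t input_image;
static uint16_t output_image;

static bool in(int n)  { return (input_image  >> n) & 1u; }
static bool out(int n) { return (output_image >> n) & 1u; }
static void set_out(int n, bool v)
{
    if (v) output_image |=  (uint16_t)(1u << n);
    else   output_image &= ~(uint16_t)(1u << n);
}

/* One pass of logic: the generic AND/OR portion every brand shares. */
static void solve_logic(void)
{
    /* Rung 1: motor seal-in -- (start OR motor) AND NOT stop */
    set_out(0, (in(0) || out(0)) && !in(1));

    /* Rung 2: alarm if either limit switch is tripped */
    set_out(1, in(2) || in(3));
}

int main(void)
{
    input_image = 0x0001;   /* start button pressed               */
    solve_logic();          /* motor output (bit 0) turns on      */
    input_image = 0x0000;   /* start button released              */
    solve_logic();          /* motor stays on through the seal-in */
    return (output_image & 1u) ? 0 : 1;
}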

However, in reality, the idea of "PLCs for Dummies" is about as simple as "Human Nature, as it applies in each and every Culture, for Dummies", or "REAL-WORLD Economics for Dummies". There ain't nothin' simple about either of those!

With respect to PLCs, the only approach that will work is from the "CONCEPT point of view".

The "concept point of view" doesn't care about particular "functions" that are available from a PLC, it only cares about the "capability" of the processor in the PLC.

Since there are only a few processors in use, that should be manageable.
 
this is nuts!

I've programmed Intel 8051-family microcontrollers for well over 30 years in native assembly. Nobody needs a 60-year-old assembly-language 'hacker' (from a time when that was a good thing). If I'm ever going to get a job again, it will be in PLC support. Nobody does embedded hardware engineering in this country anymore... My job went to Malaysia. The PLCs we program are from China...

Good grief! In the effort to make machine control easily understood by electricians through tree-logic tokens, some of the most obscure, confusing, and backwards-thinking techniques and implementations have been created, bordering on 'technologically impaired' at times. I have not dealt with so much obscure, at times intentionally misleading, double talk since college, and I have an MSEE. From what I've seen and read so far, the best way to describe the implementation of a typical PLC would be an exercise in brute-force artificial inelegance.

Good example: shift one bit through 16 bits' worth of I/O to sequentially illuminate discrete indicators one at a time. Simple. Six instructions with an 8051; in hardware, a 4-bit BCD counter, an LS154 decoder/driver, and a 555 for a clock source. In a PLC, it's either page after page of shift-rights with presets of 1, 2, 3, 4... 15, or a sequencer loaded with 0000000000000001, 0000000000000010, 0000000000000100 (...) 1000000000000000, an FF mask, and a rising-edge trigger... give me a break o_O
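For comparison, the walking-bit pattern by itself is only a couple of operations in C as well. The sketch below assumes a made-up 16-bit output register called OUTPUT_PORT and assumes the function is called once per clock tick (the 555/timer event in the example above); it is an illustration, not any particular part's register map.

Code:
#include <stdint.h>

/* Stand-in for a real memory-mapped 16-bit output driving the indicators. */
volatile uint16_t OUTPUT_PORT;

/* Call once per clock tick: write the current one-hot pattern,
 * then rotate it left by one bit so the lit indicator "walks". */
void walk_one_bit(void)
{
    static uint16_t pattern = 0x0001;

    OUTPUT_PORT = pattern;
    pattern = (uint16_t)((pattern << 1) | (pattern >> 15));   /* 16-bit rotate left */
}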

I have yet to find a 'crash course' meant for someone with professional-level training who can add, subtract, multiply, and divide in both hexadecimal and octal with pencil and paper. A few references are actually incorrect in their definitions and show examples that clearly will not implement the function described. It is almost impossible to retrain an embedded-microcontroller algorithmic thought process, where efficiency and speed are absolutely paramount, into what is required for PLC ladders and the 'electrician' level intended.

Just out of curiosity, am I missing the whole boat here, or is it indeed preferred to have absolutely no formal computer or hardware design skills whatsoever before picking up a PLC data sheet?
 
The more ASCII (or tokens) one uses in expressing an algorithm, the more time it takes for the infernal machine to decide what to do with it. This is aggravated by using an interpreted language (as in relay tree logic) as opposed to a compiled native binary. Speed and efficiency are paramount in ANY control application, whatever it may be. Less program latency (the processor oozing its way through page after page of mush) means faster control-system response. Lazy is allowing inefficient and sloppy coding techniques to persist, but that's 30 years under my belt speaking... Everybody knows that knowledge and experience go 'poof' after 40 years old...
 
In the PLC world there are many more things to consider besides just programming. For example: just try pulling an 8051 off the shelf, plugging in a few cards, hooking some of them up to field devices, and having a running machine in hours. PLCs are the assembly line of today's automation, not just a chip you program. There are other "coding" methods besides ladder you can use; ladder just gave the average guy the ability to program in the way the old relay wiring was done on machines. I could go on and on about this, but I am glad there were folks like you around to make things easy on us, so I won't. Thanks,
 
Speed and efficiency are paramount in ANY control application, whatever it may be.
Yes, and that usually means that the most efficient industrial control computer is the one that is easy for the average factory technician to program and maintain. One-time assembly-language programs don't cut it; what is needed is an easy, fast symbolic language that allows quick changes. The speed of program execution is rarely a problem, but the ease of maintenance is a big factor.

At least that is the way it used to be before the establishment of IT departments in the factory. Now all common sense has been kicked out the door in the name of the great IT gods.
 
Just out of curiosity, am I missing the whole boat here, or is it indeed preferred to have absolutely no formal computer or hardware design skills whatsoever before picking up a PLC data sheet?

To me, ladder logic is simply a tool that does for machine designers/technicians/electricians what a spreadsheet program does for a business person - it makes the power of computing accessible to them through a medium they're familiar with. In both cases, the end product may be inefficient or inelegant from a formal computer science viewpoint, but it gets the job done. If it can keep up with the process it controls, it's fast enough.

As a PLC beginner with a background in "regular" computer programming, I'm interested in the possible differences in PLC learning approaches between people with hardwired relay logic backgrounds versus those with software-development experience, as discussed here: http://www.plctalk.net/qanda/showthread.php?t=77938
 
Good example: shift one bit through 16 bits' worth of I/O to sequentially illuminate discrete indicators one at a time. Simple. Six instructions with an 8051; in hardware, a 4-bit BCD counter, an LS154 decoder/driver, and a 555 for a clock source. In a PLC, it's either page after page of shift-rights with presets of 1, 2, 3, 4... 15, or a sequencer loaded with 0000000000000001, 0000000000000010, 0000000000000100 (...) 1000000000000000, an FF mask, and a rising-edge trigger... give me a break o_O
What?

Here's a 6 instruction program that does what you describe, unless I misunderstood the example given:


I understand that assembly is very powerful, but don't bash a PLC programming environment just because you haven't figured out how to use it effectively. I will grant you that some PLCs are more capable than others, but a bit-shift instruction is hardly a rare item. The SHFL instruction is available across all of the DirectLogic series of processors. Actually, the STRPD is less well supported.

Brian

[Attachment: bitshift.PNG]
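The rungs themselves are only in the attached image and are not reproduced here, but the general idea (one shift per rising edge of a clock bit instead of page after page of presets) can be sketched roughly in C terms. In the sketch below, clock_bit and output_word are hypothetical stand-ins for a timer contact and a 16-bit output image; this is not a transcription of the actual ladder.

Code:
#include <stdbool.h>
#include <stdint.h>

/* Called once per scan. On each rising edge of clock_bit (a one-shot,
 * rising-edge detect), rotate the one-hot pattern held in output_word. */
void chaser_scan(bool clock_bit, uint16_t *output_word)
{
    static bool last_clock = false;

    if (clock_bit && !last_clock) {
        if (*output_word == 0u)
            *output_word = 0x0001;      /* (re)start with the first indicator */
        else
            *output_word = (uint16_t)((*output_word << 1) | (*output_word >> 15));
    }
    last_clock = clock_bit;
}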
 
Speed and efficiency are paramount in ANY control application, whatever it may be.

This is not the whole picture. Most processors in PLCs are more than capable of running the software at speeds that are more than fast enough for most machinery. Even bloated, inefficient code doesn't slow the process down, as the limiting factor is normally the machinery itself as opposed to the controller.

What is "paramount" however is the ability to trace and diagnose faults as fast as possible.

Nobody in a production environment cares that your scan time is less than 1 ms as opposed to 20 ms if those two timings have zero effect on the process.

What people do care about is whether it takes the maintenance guys days and days to find a fault because the code is so 'efficient' that it is extremely difficult to fault-find.

I think this is an important mindset that you need to adopt if you are entering the realms of PLCs in a production environment.

;-)
 
