Production Planning

Resource Planning in the traditional sense originally meant planning the material requirements for producing batches of parts, discrete finished goods, or sub-assemblies. In the original MAPICS Material Requirements Planning era, the fundamental bottleneck to be addressed was calculating material requirements across varied discrete demands and varied process requirements. Requirements were the sum of firm customer orders and the Master Production Schedule, a top-level forecast of demand not yet on firm orders.
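Written out today, that top-level calculation is simply the sum of the two demand streams. Here is a minimal sketch, with invented part numbers and quantities, purely for illustration:

```python
# Minimal sketch: top-level requirements are the sum of firm customer orders
# and the Master Production Schedule forecast. Part numbers and quantities
# are invented for illustration.

firm_orders = {"BIKE": 80, "TRICYCLE": 15}

# MPS: forecast demand not yet covered by firm orders
master_production_schedule = {"BIKE": 40, "TRICYCLE": 25, "SCOOTER": 60}

requirements = dict(master_production_schedule)
for part, qty in firm_orders.items():
    requirements[part] = requirements.get(part, 0) + qty

print(requirements)  # {'BIKE': 120, 'TRICYCLE': 40, 'SCOOTER': 60}
```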

When the IBM mainframe was the computer of choice (mainframes also came from Burroughs Corporation, and there were the Sperry Univac machines), before the dumb terminal, the mini-computer (now called the server), and computer networks, computer room operations were done entirely with punch cards. Also called the Hollerith card, the classic punch card held 80 columns (a smaller 96-column card came later), and one card was roughly the equivalent of one line of computer instructions, e.g. <h3>A Minor Heading of a webpage</h3>

In the above example, we see the computer language, HTML, and some data. This was also the case in stacks of computer cards. A stack of cards in an IBM mainframe job would include some JCL, or Job Control Language, then a computer program, and then the data to be processed, followed by a card that signified that there was no more data.

Thus when the BASIC computer language came on the scene, we had DATA statements that were part of the computer program itself. This seems odd, but it is both an evolution of the old stacks of computer cards and, as the simple HTML example above shows, still a real-world pattern in programming: code and data travelling together in one place.

Back to the Material Requirements Planning process: the data involved are the Master Production Schedule and the Order Backlog, followed by the Product Structure, which would need to be ‘exploded’ to calculate low-level demand. The output of the computer run would be a new stack of cards listing part numbers and the total requirements for each material. Next, this stack of punch cards was combined with the current On Hand Inventory to calculate Purchase Requirements.
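Those two runs, the explosion and the netting against on-hand inventory, can be sketched in a few lines. The product structure, part numbers, and quantities below are invented, and real MRP details (lot sizes, lead times, level-by-level netting) are deliberately left out; this only shows the shape of the calculation:

```python
# Minimal sketch of a gross requirements explosion followed by netting
# against on-hand inventory. All part numbers and quantities are invented.

# Product Structure (bill of materials): parent -> list of (component, qty per parent)
product_structure = {
    "BIKE": [("FRAME", 1), ("WHEEL", 2)],
    "WHEEL": [("RIM", 1), ("SPOKE", 36)],
}

# Top-level demand: firm orders plus the Master Production Schedule forecast
top_level_demand = {"BIKE": 120}

# Current On Hand Inventory by part number
on_hand = {"FRAME": 20, "WHEEL": 50, "RIM": 10, "SPOKE": 0}


def explode(demand, structure):
    """Roll top-level demand down through the product structure."""
    gross = dict(demand)
    stack = list(demand.items())
    while stack:
        part, qty = stack.pop()
        for component, per_parent in structure.get(part, []):
            need = qty * per_parent
            gross[component] = gross.get(component, 0) + need
            stack.append((component, need))  # keep exploding the lower levels
    return gross


gross_requirements = explode(top_level_demand, product_structure)

# Net against on-hand inventory to get Purchase Requirements
purchase_requirements = {
    part: max(qty - on_hand.get(part, 0), 0)
    for part, qty in gross_requirements.items()
}

print(gross_requirements)     # total requirements per part number
print(purchase_requirements)  # what still needs to be bought or built
```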

Thus, many processes in the punch card days of computing took more than one step. A computer operator would assemble a stack of cards and run it to produce a new stack of cards; that output would be combined with other cards and a different computer program (itself on cards) to produce yet another stack, which might then be processed by an RPG program to produce a report for the purchasing department.

Besides RPG, Fortran, LISP, and COBOL made their way into computer programming before the interactive terminal and the cathode ray tube, that is, while computers were still programmed with a stack of cards. The punched card actually came before the digital computer, at the end of the 19th century, driven by the need for accurate counting of Census information, among other things. Herman Hollerith was a statistician who was also a bit of a mechanical engineer; he designed ways of punching and reading cards to speed up the manual processing of statistical information. Hollerith first convinced the government to use his card system for the 1890 census, speeding tabulation by almost ten times.

The main difference between tabulation around 1900 and computing around 1960 with the digital computer was that originally card processing was primarily a sorting operation. If in 1900 one asked how many nuns lived in Colorado, the Colorado stack of cards would need to be sorted for employees of religious institutions, then for females, and then often hand inspected before a final result could be published.
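By contrast, on a modern machine the same question is a filter over records rather than a physical sort of a card deck. A small sketch, with invented records and field names, makes the difference plain:

```python
# Minimal sketch: the 1900 tabulating question asked as a modern filter.
# The records and field names are invented for illustration.

census_records = [
    {"state": "Colorado", "sex": "F", "occupation": "religious institution"},
    {"state": "Colorado", "sex": "M", "occupation": "miner"},
    {"state": "Kansas",   "sex": "F", "occupation": "religious institution"},
    {"state": "Colorado", "sex": "F", "occupation": "teacher"},
]

# "How many nuns lived in Colorado?" becomes three stacked conditions
count = sum(
    1
    for person in census_records
    if person["state"] == "Colorado"
    and person["sex"] == "F"
    and person["occupation"] == "religious institution"
)

print(count)  # 1
```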

Because of this rich history of tabulating, when the original digital computers came about, in which one series of vacuum tubes would store a zero or a one and another series would perform simple operations such as fetching a byte, storing a byte, or NANDing one byte with another, we quickly interfaced the crude methods of getting data in and out of the limited computer memory with the tabulating equipment that had been in use for more than half a century. The stack of cards now served, in effect, as the computer's bulk memory, an important breakthrough, because in the mid-1960s even an IBM System/360 might ship with as little as 8K of memory. Eight thousand bytes of electronic memory then required racks of hardware, whereas more than a million times that amount of storage now fits on a USB flash drive.

Of course, this difference in the memory hierarchy had profound ramifications for the design and implementation of computing projects such as a Gross Requirements Explosion for Material Requirements when planning production at a manufacturer or a chemical processing facility, such as a refinery or a maker of laundry detergent. Today, however, we are back to similar dichotomies, only with an entirely different set of computing resources. We no longer use the Hollerith card, but we also make little computational use of the USB flash drive: we may use one to store images, videos, or even documents and procedures, but as a working medium for computation we make little use of this sort of ‘off-line’ memory that has so many similarities to the Hollerith card.

However, we do engage in a process very similar to the JCL of the early IBM mainframes. In modern Enterprise Resource Planning, we knit together a database on one server with HTML and PHP served from another node of the network, while our desktop, with the aid of Java and the JBoss stack, provides the computational power to produce the new ‘stack of punch cards’, which in this case is actually a stream of updates to the database, MySQL or more often PostgreSQL, which then captures a new state of the planning and implementation process. Where we once spent a month or more between the runs that combined demand with the Master Production Schedule and the current purchasing schedule, complete with scheduled deliveries, we can now look at these data whenever we need to, combine them, recalculate our needs, and assess whether we need to call a vendor to expedite or delay a specific delivery release.
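As a rough illustration of that recalculation, and not of any particular ERP package or schema, here is a minimal sketch that projects inventory for one purchased part across a few weekly buckets and flags where a scheduled delivery looks like a candidate to expedite or to push out. The part, quantities, and week numbers are invented:

```python
# Minimal sketch of the expedite/delay check for one purchased part.
# The on-hand quantity, requirements, and scheduled receipts are invented.

on_hand = 150  # current on-hand inventory

# Weekly gross requirements (from exploding the MPS and the order backlog)
requirements = {1: 40, 2: 120, 3: 60, 4: 30}

# Deliveries already scheduled on purchase orders: week -> quantity
scheduled_receipts = {3: 100, 5: 100}

projected = on_hand
for week in sorted(requirements):
    projected += scheduled_receipts.get(week, 0)
    projected -= requirements[week]
    if projected < 0:
        # A later delivery would need to be pulled in to cover this shortage
        print(f"week {week}: short {-projected} units -> expedite a delivery")
    else:
        print(f"week {week}: projected on hand {projected}")

# Receipts scheduled after the last week of requirements are candidates to delay
last_week = max(requirements)
for week, qty in sorted(scheduled_receipts.items()):
    if week > last_week:
        print(f"week {week}: receipt of {qty} falls after current needs -> candidate to delay")
```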