
The Recipe Model: New Advances in Using Life-Cycle Principles in the Batch Industry R&D and Manufacturing Environment

Presented at the World Batch Forum, 1998

Joel M. Vardy, AspenTech, 105 Clover Leaf Ln., North Wales, PA 19454, (215) 699-4256
Reiner Musier, Ph.D., AspenTech, 10 Canal Park, Cambridge, MA 02141, (617) 949-1696
Suresh Sundaram, Ph.D., AspenTech, 10 Canal Park, Cambridge, MA 02141, (617) 949-1292

KEYWORDS

Recipe, Technology Transfer, Modeling, Simulation, Information Technology, Genealogy

ABSTRACT

The S88 Recipe Model has spawned several new technologies that let today's batch process
manufacturer use consistent approaches from R&D through manufacturing. This paper will
highlight how a General Recipe can be developed using off-line simulation technologies and
scaled up to the Master Recipe level. Management of the manufacturing recipe is then refined
through on-line analysis tools that feed results back to R&D and help Manufacturing
optimize its processes. The principles of batch modeling will be reviewed in detail,
including industry examples of extracting the maximum benefit from the recipe model.
Significant improvements in time-to-market, intra-company communication and batch
process optimization have already been achieved, and further gains are anticipated as the
art of integrating these technologies matures.

I. History

Throughout history, tools have been an outgrowth of need, and so it has been with the short
history of batch process development and automation. The historical needs of the R&D,
Engineering and Manufacturing departments have been met with adaptations of computational
tools that mirrored the practices of earlier periods. R&D tools have been complex and custom
in nature. With roots in the early mainframe and later minicomputer hardware used by the
academic community, much of the software was written in languages such as FORTRAN and APL.
Knowledge of the intricacies of operating system terminology and syntax was a clear advantage,
so the tools remained in the hands of specialists who had both the knack and the interest to
work through long, frustrating hours becoming adept with the tool rather than concentrating on
its applicability. They were known as the gurus of their departments, respected and feared for
their magical ability to slay the mighty computer dragons. They frequently spoke in specialized
languages that confined their world to hobby clubs and the computer technicians who would come
to inspect their hardware and exchange a few pleasantries with these high priests of computing.
Over time the scientists and engineers would line up to ask for their problems to be run by
these specialists. Eventually many gave up, running their problems manually or not at all.

Likewise, the engineering community had the same experience, running their simulations and
designs with calculators and slide rules or waiting their turn for valuable computer time that
was more often than not limited by the availability of the operator or engineer who had to
interpret the requester's meaning. Multiple runs were rare, which limited the creative
impulses of the designer or scientist.

Interactions between the R&D community and Engineering were confined to the hand-off
process, which in no small measure required interpretation of the documents and language
developed between the two communities. Each had developed a language that reflected the tasks
it performed and the background of its majority population. R&D tended towards the
chemist's laboratory-based language, while the engineers tended towards a more equipment-
oriented approach. As a result, the hand-off meetings were managed by experienced
practitioners of the art of interpretation. Novices would sit in awe and more than likely miss
the content or the meaning of the conversations. As is frequently the case, they couldn't even
ask the important questions and would have to come back for clarification, if they dared.

The hand-off to manufacturing had to be handled from both R&D and Engineering, so the
aforementioned difficulties could be multiplied several-fold. The manufacturing community
had altogether another language, based on the day-to-day needs of converting raw materials
to finished goods via recipes, in facilities that frequently had more room dedicated to
maintaining inventories than to making the product. Scoping meetings with Engineering
centered on synchronizing the meaning of common words like "units" and "phases." Doing this
with the folks from the labs or pilot operations was even more uncertain, since they were
even further removed from the equipment-oriented world.

Each of the three departments had evolved a language suited to its role, so conveying the
information it needed to pass to the other departments required experience and skill. The
ball was often dropped when assumptions masked the disconnects hidden in common words. The
exacting scientific accuracy requirements of the R&D people have little or no meaning on the
plant floor. Designs that seem to work on paper in the office or in the lab can run into
practical difficulties if they have not been piloted or plant-tested. Inconsistent tools did
not make the process run more smoothly, but they did create a premium for experienced
technology-transfer practitioners, whose tools were more common sense than the structured
approach of a standard for models, languages and the software to represent them. What is
more, the modeling tools of the day were crude extensions of continuous process modeling
technologies. Dynamics were complex to model, and hence went unrecognized. Equipment downtime
was treated as a scheduled event that the whole plant went through. As a result, both the
design community and the controls community had to live with the compromise of either using
approximations or writing custom code that only a few understood. Figure 1 shows the
diversity of presentation tools matched to the needs of the departments, with minimal
cross-departmental integration.

[Figure: Handoffs among three tool sets: R&D (lab journals, custom programs, scale-up spreadsheets, process validation), Process/Project Engineering (project management tools, process modeling tools, costing packages, CAD, equipment qualification), and Operations (daily schedules, control systems, control charts, SOPs, MSDSs).]
Figure 1: Data presentation based on different needs of the Organization

II. Evolution of the Three Primary Batch Technologies

The business processes of the batch process community changed in the 1990s to reflect
shrinking margins as competition and globalization reduced the options available. The
fragmented approaches of the '70s and '80s are coalescing into three technologies centered
around the major business processes represented by the R&D, Engineering and Operations
communities. For process development, the design process presents new challenges that, in
the pharmaceutical sector, require better collaboration with the Discovery and Product
Development groups. Additionally, these groups are experiencing new pressure to streamline
technology transfer to manufacturing in order to reduce time to market. The rapid deployment
of microprocessor technologies in the mid '80s and 1990s has accelerated the changes going on
in each of the departments cited above. R&D professionals are no longer dependent on the
computer guru to check a simulation run. The models are still custom programs, but they are
now written for the machine that sits in every lab and on every desktop.

Engineering designs have migrated to desktop CAD programs, and crude simulation programs
have been developed using spreadsheet programs originally written for the accounting
community, whose computational components now include statistical and graphing capabilities.
Both the R&D and Engineering communities are now using more similar tools thanks to the
evolution of the personal computer and the emergence of windowing technologies pioneered by
Xerox, Apple and Microsoft. The Microsoft standard has drawn the scientific and engineering
organizations closer while minimizing some of the handoff issues.

In the batch community the three primary technologies center around the Design of the
recipe, its Operation in manufacturing, and the Management of the data, reports and
materials that specify the inputs and outputs of the manufacturing process. The other
evolution, organizational in nature, is that supply chain people in the plants and at
corporate now manage the planning and scheduling functions to minimize inventories and track
products and their constituents. These groups, the nerve center of the plant, are called
Operations Planning or Materials Management departments and are in charge of scheduling,
procurement and sometimes customer relationships. The mid-1980s saw the emergence of a new
crop of people with process control and business management backgrounds who could speak to
both the information and process technologists in the plant, a recognition of the importance
of the plant database function that they owned.

The operations groups have instrumented more of the plant with largely hardware-specific
solutions dominated by DCSs (Distributed Control Systems), which have evolved batch-specific
software. The PLC (Programmable Logic Controller) community penetrated batch applications
largely because pre-packaged machines on the packaging line came embedded with these
technologies, and the raw capabilities of the hardware and software could be programmed to
run simple reactors with ancillary equipment. PLCs were well suited to the event-based world
of the batch manufacturer and were a natural fit to packaging line automation needs. Their
primary weakness was the unintegrated batch software that had to be configured with a
separate MMI (man-machine interface) database, or what is today called an HMI (human-machine
interface). The logical merger of the DCS and PLC technologies has seen both hardware-based
approaches used in hybrid applications. DCS users found that better networking and software
integration suited those systems to the largely parallel tasks of the chemical reaction
process, while the PLC proved better suited to the largely sequential upstream and
downstream operations.

Once the basic operations of the batch manufacturer were met, managing the data and
analyzing the results were the logical next steps. Historians once used mostly for
continuous operations found their way into the batch community, and eventually batch
functions were added that recognized the event-based needs of batch applications. The Manage
technologies are still evolving, though they are no longer seen as a mere extension of
continuous historian data. Batch-specific historians are gaining acceptance as the supply
chain community insists on better means of tracking operational functions, while the process
engineering community is finding ways of gaining process intelligence that is being used to
optimize the unit operations. As more of these technologies become pervasive, the logical
next step will be to optimize the entire plant and eventually to link plants together.
Future uses will see inter-company integration as well.

III. Integration of the Three Primary Technologies

As technical integration of automation technologies becomes feasible, the world is searching
for practical approaches to life-cycle implementation in the recipe-based industries.
Specialty chemicals, pharmaceuticals, biotechnology, consumer products, and food and
beverage all share the same product development and manufacturing structure, which is
essentially the model we call a recipe. With S88.01 a reality, we have the perfect vehicle
to construct the model for taking a discovered compound from General Recipe form (largely a
combination of a Formula with crude Procedures), through the stages of process development,
into manufacturing as a Master Recipe and Control Recipe, and on to the analysis stage,
where it can create the foundation for better accounting and process understanding. With
understanding established, the intelligence can loop back to better design and operations.
Figure 2 shows the recipe model evolving from the General form to the Master form in
manufacturing; at each stage it grows in specificity, both procedurally and in equipment
capability.

[Figure: The recipe (Ver. 2.3) moves from the chemist's laboratory as a General Recipe (formula, procedure, safety requirements), through Process Development, to Manufacturing as a Master Recipe (formula, procedure, equipment, safety requirements, compliance).]
Figure 2: Moving the Recipe model through its path from General to Master Form
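
To make this progression concrete, the sketch below shows one hypothetical way a General Recipe might be represented and then enriched into a Master Recipe by binding equipment and scaling the formula. It is an illustration only, not the S88.01 data model or any vendor's schema; every class, field, and value is invented.

```python
# A minimal, hypothetical sketch of the General -> Master recipe progression
# described above. It is not the S88.01 data model or any vendor's schema;
# every name and value here is invented for illustration.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FormulaItem:
    material: str
    amount: float          # per-batch quantity
    unit: str              # e.g. "kg", "L"

@dataclass
class ProcedureStep:
    action: str                            # e.g. "Charge", "React", "Dry"
    parameters: dict = field(default_factory=dict)
    equipment: Optional[str] = None        # unbound in a General Recipe

@dataclass
class Recipe:
    name: str
    version: str
    level: str                             # "General" or "Master"
    formula: List[FormulaItem]
    procedure: List[ProcedureStep]
    safety_requirements: List[str] = field(default_factory=list)

    def to_master(self, equipment_map: dict, scale: float) -> "Recipe":
        """Bind procedure steps to site equipment and scale the formula."""
        formula = [FormulaItem(i.material, i.amount * scale, i.unit) for i in self.formula]
        procedure = [
            ProcedureStep(s.action, dict(s.parameters), equipment_map.get(s.action))
            for s in self.procedure
        ]
        return Recipe(self.name, self.version, "Master", formula, procedure,
                      list(self.safety_requirements))

# Example: a chemist's General Recipe becomes a plant Master Recipe at 100x scale.
general = Recipe(
    "Intermediate X", "2.3", "General",
    formula=[FormulaItem("Solvent A", 1.2, "kg"), FormulaItem("Reagent B", 0.3, "kg")],
    procedure=[ProcedureStep("Charge"), ProcedureStep("React", {"T_C": 80, "hours": 6})],
    safety_requirements=["Nitrogen blanket during charge"],
)
master = general.to_master({"Charge": "Reactor R-101", "React": "Reactor R-101"}, scale=100.0)
print(master.level, master.formula[0].amount, master.procedure[1].equipment)
```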

The use of a consistent model across the life cycle will have a dramatic effect on the
organization. In his recently published book The Development Factory, Gary Pisano argues
that high value exists in better integration of process development with manufacturing in
the pharmaceutical industry. Handoffs are reduced to simple, non-iterative communication
processes that do not require expertise developed over many years. Novice scientists and
engineers are able to understand each other using a common language learned either in their
formal education or during orientation. Concurrent engineering and development dramatically
reduces time to market and assures regulators that consistent process development means
fewer errors and requires fewer audits. The FDA, OSHA and the EPA can be involved early in
the process with less anxiety and more certainty as to the actual start date of manufacture.
This will not only accelerate the returns to the investment community but establish new
benchmarks in effectiveness and increase the number of good ideas that can reach the market.

The deep process knowledge that is embedded in the recipe model is a natural complement to
the Information System groups. They can concentrate on the transaction-based processes that
are being streamlined and depend on the Supply Chain to produce timely information on
inventories, costs and quality parameters.

Information flows can now extend to the R&D community's needs, as the loop-back from the
Manage tools of analysis can effectively feed the Design tools of development. Below (Figure
3) is a model of information flows as we bind the recipe development tools of today with the
execution tools and analysis tools. The tie-back to design can be either logical or
automatic, though we are seeing the first stages of batch process optimization as the loops
are closed logically.

[Figure: An Enterprise System exchanges raw-material costs, recipes, schedules, material properties, personnel data, materials consumed, and process history with three linked functions: Recipe Modeling in Discovery (off-line simulation, scale-up, rigorous process engineering modeling), Recipe Development and Execution in Process Development (control system, event recording), and Batch Information Analysis in Pilot/Manufacturing (material tracking, analysis engine, actual vs. design, debottlenecking, yield analysis, on/off-line modeling and simulation).]
Figure 3: Enterprise-wide batch Integration furthers the state of the art

The tie-back between the on-line models of Manage and the off-line models of Design allows a
logical check-and-adjust process that compares actual against design parameters. Supply
Chain parameters can be tracked more accurately and the APS (Advanced Planning and
Scheduling) systems updated with accurate information.

IV. A Detailed Look at Current State of Batch Modeling

Development of simulation tools for batch processes has paralleled the development of
tools for continuous processes, with a time lag of about 15 years. Simulation tools for
continuous processes have reached an advanced state of development. Such tools are
routinely used in the development and optimization of processes. The first available tools
modeled single unit operations. The models were custom-made to solve specific
problems. As more models became available, simple packages were developed that
contained standard unit operation blocks. An early example of this type of package is
FLOWTRAN. Later, larger general purpose simulators were developed. These packages
were able to simulate entire flowsheets, and contained physical property databanks,
standard unit operation blocks, and numerical algorithms to solve systems of
simultaneous nonlinear algebraic equations. Packages such as ASPEN PLUS, PRO/II
and Hysim are current examples.

Early batch simulation tools also modeled single unit operations. Modeling single batch
units is more complicated than modeling continuous ones because of their inherently
unsteady state: the models require the solution of a system of differential-algebraic
equations. Although the system of equations that describes batch processes was known,
general-purpose simulators were not available until the seventies. MULTIBATCH was
developed in 1974 by researchers Sparrow, Rippin and Forder at ETH to model the
behavior of an entire batch process. The models in this package were simple split-
fraction models or user-supplied FORTRAN subroutines. Examples of single unit
operation models include BATCHFRAC (rigorous batch distillation) and BatchCad
(batch reactions).
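
The following is a minimal, hedged sketch of what solving the unsteady-state balances of a single batch unit means in practice: a single exothermic batch reaction A -> B with a cooling jacket, integrated as an initial-value problem with SciPy. It is illustrative only; it is not BATCHFRAC, BatchCad, MULTIBATCH, or any other package named here, and every parameter value is hypothetical.

```python
# Minimal illustrative sketch (not any of the packages named above): the
# unsteady-state mass and energy balance of a single batch reactor, A -> B,
# solved as an initial-value problem. All parameter values are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical kinetics and thermal data
k0, Ea = 1.0e7, 60_000.0        # 1/s, J/mol (Arrhenius pre-exponential, activation energy)
R = 8.314                        # J/(mol K)
dHr = -80_000.0                  # J/mol, exothermic heat of reaction
V, rho, cp = 2.0, 900.0, 2500.0  # m3, kg/m3, J/(kg K)
UA, Tj = 500.0, 300.0            # W/K jacket heat transfer, jacket temperature (K)

def batch_reactor(t, y):
    """y = [C_A, T]: concentration of A (mol/m3) and reactor temperature (K)."""
    CA, T = y
    r = k0 * np.exp(-Ea / (R * T)) * CA                    # reaction rate, mol/(m3 s)
    dCA = -r                                               # component mass balance
    dT = (-dHr * r * V - UA * (T - Tj)) / (rho * cp * V)   # energy balance
    return [dCA, dT]

# Integrate one 4-hour batch starting at 1000 mol/m3 and 320 K
sol = solve_ivp(batch_reactor, (0.0, 4 * 3600.0), [1000.0, 320.0], max_step=60.0)
print(f"Final conversion: {1 - sol.y[0, -1] / 1000.0:.2%}, final T: {sol.y[1, -1]:.1f} K")
```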

The first commercial simulator designed for an entire batch process was BATCHES
(1984) by BPT, Inc. This simulator provides the user with a set of general process tasks
which can be combined into a process model. In the history of batch process simulation,
BATCHES might be the analog of FLOWTRAN as a groundbreaking simulator for batch
processes. Recently, Aspen Technology has introduced a new state-of-the-art simulator
called BATCH PLUS™, which enables users to easily model and simulate complex batch
recipes at laboratory-scale, pilot-scale, or full-scale.

A more integrated approach

During process development, a typical batch process goes through several phases. Early
in process development, chemists and process research scientists conduct small-scale
laboratory experiments to study process feasibility, conduct mass and energy balances,
evaluate reaction kinetics, and so on, in order to choose a process route that is likely
to be commercially viable. Later, pilot-plant engineers move the process to an intermediate
scale. They focus on problems of scale, safety, and yields, and have the additional goal of
coming up with a safe, efficient, and robust manufacturing process. Once the pilot-scale
process has been developed satisfactorily, and if the product economics continue to warrant
it, the process is transferred to a manufacturing site, where product is manufactured in
bulk for sale. Typical decisions at this stage include whether to manufacture in an existing
facility, build a new facility, or contract the process to a toll manufacturer. Throughout
process development, other groups may be involved at various stages. The pharmaceutical,
biotech, agricultural chemical, foods, and consumer products industries typically operate
in regulated environments (one or more of the FDA, USDA, and EPA), and safety and
environmental groups collect data necessary for regulatory approval.

Hence, the requirements for a successful batch process simulator are to:

- allow a standard, flexible Recipe representation of the process at any scale
- allow the Recipe and the corresponding model to grow in detail, complexity, and rigor as experience and know-how about the process increase during its lifecycle
- allow unit operation models of varying complexity
- conduct mass and energy balances
- calculate cycle times and equipment utilization (a minimal sketch follows this list)
- perform environmental and safety analyses
- compare alternative processes and routes
- scale the process (both scale-up and scale-down)
- conduct process-fit analysis (e.g. moving a process from one site to another)
- perform equipment selection and sizing
- simulate multi-batch, multi-product production plans
- share process descriptions among many users, with translation of the recipe into multiple languages
- access corporate equipment and materials properties data
- access commercial vendor equipment and materials properties data
- provide graphical equipment and block diagrams
- generate operating instructions automatically
- report results in a format that is easy to use and store
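
As a hedged illustration of the cycle-time and equipment-utilization requirement above, the sketch below assumes a simple occupancy model in which each operation holds one piece of equipment for a fixed duration and batches are run back-to-back. The operation names, equipment tags, and durations are invented for illustration and are not drawn from any product described in this paper.

```python
# A minimal, hypothetical sketch of the cycle-time and equipment-utilization
# calculation named in the list above: each operation occupies one piece of
# equipment for a fixed duration, batches are run back-to-back, and the
# limiting (bottleneck) equipment item sets the plant cycle time.
from collections import defaultdict

# (operation, equipment, duration in hours) -- illustrative values only
operations = [
    ("Charge",      "Reactor R-101",      1.0),
    ("React",       "Reactor R-101",      6.0),
    ("Transfer",    "Reactor R-101",      0.5),
    ("Crystallize", "Crystallizer C-201", 4.0),
    ("Filter",      "Filter F-301",       2.0),
    ("Dry",         "Dryer D-401",        8.0),
]

occupancy = defaultdict(float)
for _, equipment, hours in operations:
    occupancy[equipment] += hours

batch_time = sum(hours for _, _, hours in operations)   # one batch, end to end
cycle_time = max(occupancy.values())                    # bottleneck sets the batch rate

print(f"Batch (end-to-end) time: {batch_time:.1f} h")
print(f"Cycle time (bottleneck): {cycle_time:.1f} h")
for equipment, hours in occupancy.items():
    print(f"  {equipment}: utilization {hours / cycle_time:.0%}")
```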

The time is right

Four things have contributed to the current successful development of batch process
simulators. First, in today's global markets, the batch process industry, which typically
manufactures high-value-added, low-volume products, faces increased competition.
Margins are at historically low levels, and there is a drive to reduce costs, increase return
on assets, and reduce the time to market. Simulators help by automating the routine
aspects of process development, allowing engineers to focus on real problems.

Second, there is a growing number of chemical engineering graduates familiar with the
batch process design problem. Graduate programs at Carnegie-Mellon, Purdue, Imperial
College, MIT and ETH-Zurich have had focused research programs in batch processes
for over twenty years, and these graduates are now in industry. Batch process design is
also part of the chemical engineering curricula at various universities, increasing the
number of batch-savvy engineers in the workforce.

Third, the emergence of S88.01 means that people can study batch processes in a
single unifying framework. One of the barriers to the earlier development of a general-
purpose simulator was that different companies, and even different groups within the same
company, used different conceptual models and jargon when describing batch processes.
To a large extent this is still true in practice, but S88 is certainly a primary and
substantial driving force unifying the way companies around the world think about batch
processes. S88 defines a new standard for representing batch processes, which reduces the
barrier to information flow. We believe that it will enable the emergence and development
of a batch process modeling software industry as a viable commercial enterprise.

Fourth, developments in computer hardware and software mean that the computational
difficulty of simulating batch processes is minimized. Engineers have easy access to
fast, cheap computers. Object-oriented technology and languages like C++ and Java
mean that software can be developed rapidly and adapted quickly to the needs of the
users. The popularity of programs such as EXCEL and WORD, which are easy to use
and share a similar look and feel, means that information can be shared among the
various parties in a large organization.

A new batch process simulator from AspenTech, BATCH PLUS™, combines many of
the above features. The software uses a single model of the chemical process to perform
material and energy balances, calculation of streams and vessel contents, and design and
scale-up calculations. It applies S88 with SFC diagrams to represent the process. The
system contains an extensive library of engineering models for more than 60 common unit
operations. The user can simulate one batch of one process, or many batches of many
simultaneous processes. The system runs on Windows NT or 95, and is integrated with MS
EXCEL, ACCESS, and Visio. The modeling system has been used successfully at various
companies, and the next section highlights some of the successes of modeling technology.

V. Batch Modeling Tools in Today's World - Recent Industry Experiences

Over the last 12 months, a number of companies, including Merck, GlaxoWellcome, DuPont,
BASF, Abbott Labs, E. Merck GmbH, Hoffmann LaRoche, Astra, Janssen Pharma, Novartis,
Ciba Specialty Chemicals, Phillips Petroleum and DSM, among others, have been quick to apply
the recently available batch modeling technologies across the process lifecycle in research
and development, engineering, and operations. The following points illustrate their
experience and their expectations:

Batch Process Modeling is having a dramatic effect on the workflow management of process
engineering, scale-up and process analysis. Future uses will allow processes to be designed
more cost-effectively and more quickly. Additionally, the model will be used as a means
by which to communicate process information throughout the company.

Scale-up from lab-scale in a pilot facility was achieved in record time. It was strategically
important to the company to reduce time-to-market, and this team had a deadline which
was tighter than ever before. The model enabled faster conceptualization of the new process
and facilitated generation of operating instructions, raw materials ordering and hazard analysis.

Use as a means of communication. The model provided a form of representing the process that
everyone had in common. It spoke the language of both the chemist and the engineer,
accelerating its acceptance. Additional benefits were lower stress levels and the
elimination of non-value-added work.

Planning and design of a new multi-product facility. Historically, the process design
and scale-up of each product was handled individually. Today, several R&D product
teams and the project team are using models to communicate and design the facility. The
modeling system enables optimal equipment selection and overall design. The result is a
better facility at lower capital cost by avoiding over-design.

Evaluating whether to move a process into a different existing facility. Doubt existed as to
whether the facility had the capacity, equipment, and capability to handle the new process.
The model answered the questions quickly. Time was available to look at the deeper
technical issues, allowing rapid decision-making.

A new process development and scale-up project is 2 weeks ahead of schedule because of
the use of a model. The team was required to develop material and energy balances, perform
cycle time calculations, determine stream and vessel contents, perform utilities analysis,
develop the Gantt chart for the process, analyze wastes and air emissions, and develop an
estimated product cost at full-scale production.

A custom Excel program that took 2 man-months to develop for the scale-up of a process was
duplicated by a commercial modeling system with only a few hours of work. Additionally, the
custom program could only be applied to a single process, while the commercial system could
be applied across all the company's batch processes.

Reduction of 2 weeks of effort in the utilities design. The material and energy balances, as
well as the equipment diagrams, were part of the Design Package. Payback was achieved
within one year.

Reduction of frequent upsets in the operation of a waste treatment plant. The model is
used to simulate a week's production in advance, letting the treatment plant know what is
coming.

Calculation of air emission streams at a pharmaceutical company, to be reported to the
environmental agency as a best available technology. This avoids the capital expense of
installing measurement devices on all vents.

VI. A Look Into the Future

The future in many ways synthesizes the progress of the past. Ever higher levels of integration
lead to a more seamless design-operate-manage capability. A goal in the design of systems
is Single Point Configuration where access to information occurs on a functional basis,
without requiring knowledge of which database holds the information, or where the data
resides. For example, if I am an engineer who needs to find the last change made to a recipe
and track the reasons for the changes, I should be able to gain access from anywhere on
the system with the appropriate security level. The information would then be delivered to me
using straightforward language that I can readily understand. My access to the information
would be independent of my location on the globe or whether I access the information from a
control console, engineering workstation or accounting terminal.
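
To illustrate the idea of single point configuration and recipe genealogy in a hedged way, the sketch below answers the engineer's question ("what was the last change to this recipe, and why?") through one functional query, without the caller knowing where the record physically resides. The table, column, user, and recipe names are invented for illustration.

```python
# A minimal, hypothetical sketch of "single point configuration": one functional
# query answers "what was the last change to this recipe, and why?" without the
# caller knowing which database holds the record. All names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE recipe_changes (
    recipe_id TEXT, version TEXT, changed_by TEXT, changed_at TEXT, reason TEXT)""")
conn.executemany(
    "INSERT INTO recipe_changes VALUES (?, ?, ?, ?, ?)",
    [
        ("INTERMEDIATE-X", "2.2", "engineer_a", "1998-01-12T09:30:00", "Reduced solvent charge"),
        ("INTERMEDIATE-X", "2.3", "engineer_b", "1998-03-04T14:05:00", "Tightened drying endpoint"),
    ],
)

def last_change(recipe_id: str):
    """Return the most recent change record (version, who, when, why) for a recipe."""
    return conn.execute(
        """SELECT version, changed_by, changed_at, reason
           FROM recipe_changes
           WHERE recipe_id = ?
           ORDER BY changed_at DESC
           LIMIT 1""",
        (recipe_id,),
    ).fetchone()

print(last_change("INTERMEDIATE-X"))
```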

True batch optimization is a realization of the true potential of the batch industry's
operations. This means that at least four levels of optimization are working independently
and in concert with each other. These loops represent the need to optimize all process
units, the production line, the plant, and the links between companies. Each works off the
same mathematical and conceptual model so that it can be synchronized with the rest of the
operation. At the unit level (1), new ways of modeling the behavior of the unit will allow
differential-equation-based and statistically based models to optimize and control the
system. Statistical methods are already under development using multivariate statistical
process control. At the production line or process cell level (2), unit-to-unit
optimization is applied first in the design and debottlenecking of the train, and then in
the scheduling of the process, reassessing the choices on a real-time basis with batch
schedulers that are yet to be fully applied. At the plant level (3), the full supply chain
incorporates material management options that consider the logistical areas of procurement
and transportation. The plan needs to be close to real time, since these issues must take
into consideration factors such as maintenance and traffic. Ultimately, an intercompany or
interplant integration plan will help to optimize the constraints that reach beyond the
boundary of the site (4). The consistent use of models across company domains is perhaps
one of the greatest challenges of incorporating supply chain concepts into agreements
between various supplier and customer organizations. Figure 4 shows the three levels of
intra-plant integration, with interplant coordination representing the final level.
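
As a hedged illustration of the unit-level statistical approach mentioned above, the sketch below fits a principal component model to reference batch trajectories and flags a new batch whose Hotelling's T² exceeds a simple empirical control limit. The data, component count, and limit are synthetic and chosen only for illustration; this is not any vendor's monitoring product.

```python
# A minimal, hypothetical sketch of unit-level multivariate statistical process
# control: fit a PCA model to "good" batch trajectories, then flag a new batch
# whose Hotelling's T^2 exceeds a simple control limit. All data are synthetic.
import numpy as np
from numpy.linalg import svd

rng = np.random.default_rng(0)

# Each row is one completed batch, unfolded into a vector of process
# measurements (e.g., temperatures and pressures sampled over the batch).
good_batches = rng.normal(0.0, 1.0, size=(50, 20))

# Mean-center and fit a 3-component PCA model on the reference batches.
mean = good_batches.mean(axis=0)
Xc = good_batches - mean
U, S, Vt = svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T                     # projections onto principal components
score_var = scores.var(axis=0, ddof=1)     # variance captured by each component

def hotelling_t2(batch):
    """Hotelling's T^2 of one unfolded batch against the reference model."""
    t = (batch - mean) @ Vt[:k].T
    return float(np.sum(t**2 / score_var))

# Empirical 99th-percentile control limit from the reference set.
limit = np.percentile([hotelling_t2(b) for b in good_batches], 99)

new_batch = rng.normal(0.5, 1.5, size=20)  # a deliberately abnormal batch
t2 = hotelling_t2(new_batch)
print(f"T^2 = {t2:.1f}, limit = {limit:.1f}, abnormal = {t2 > limit}")
```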

[Figure: The recipe as the model (formula, procedure, equipment, safety requirements) at the center of four optimization loops spanning the supply chain from suppliers of raw materials to customers of finished goods: (1) the unit, (2) the process cell/production line, (3) the plant site, and (4) inter-company integration.]
Figure 4: Four Levels of Optimization in a Batch Operation, needed to achieve true potential through model-centricity

VII. Conclusions

As companies in the batch industries globalize their research, development, operations,
and strategies, a new look at the potential for overall optimization is required. A large
potential for improvement exists that is not being realized today. For example, today unit-
level batch optimization is generally performed off-line. However, in the future we
expect that technology and tools will be developed to perform such optimization on-line.
Work is already underway. The line of demarcation between off-line and on-line
technologies is blurring in the continuous process industries, and we expect the same to
occur in the batch industries in the near future as the potential for model-based
optimization is realized at all four levels. Technology exists today which can form the
core of the future overall solution, yet a great deal of creativity, perseverance, and
investment will be required on the part of technology providers and operating companies
to enable companies to reach their true potential.

References

1) Pisano, Gary P. The Development Factory. Boston, Mass.: HBS Press, 1997.

2) Vardy, Joel M. Integrating Manufacturing into the Corporate Reengineering Effort
for the Batch Process Industries. Paper presented at the Second World Batch
Forum, Philadelphia, Pennsylvania, May 1995.

3) Vardy, Joel M., with Tovell, Nicholas. Phased Automation: Managing the Human
Factors. Paper presented at the International Conference of ISA, Anaheim, California,
October 1987.

