



M.Pharmacy II Semester,

Department of Pharmaceutics,


• Concept of Optimization

• Optimization parameters

• Factorial design, Optimization technology & Screening design.

• Computers in Pharmaceutical Formulation

• Development of Pharmaceutical Emulsions & Microemulsion Drug Carriers

• Legal Protection of Innovative Uses of Computers in R&D

• Reference


Concept of Optimization

• The term optimize means “to make perfect”.

• Optimization is defined as choosing the best element from some set of available alternatives.

• It is an art, process, or methodology of making something (a design, system, or decision) as perfect, as functional, and as effective as possible.


Terms used

❑ FACTOR: An assigned variable such as concentration, temperature, etc.

▪ Quantitative: factors that have numerical values assigned to them.

Ex: Concentration – 1%, 2%, 3%, etc.

▪ Qualitative: factors that are not numerical.

Ex: Polymer grade, humidity condition, etc.

❑ LEVELS: Levels of a factor are the values or designations assigned to the factor.

  Factor          Levels
  Temperature     30°, 50°
  Concentration   1%, 2%


❑ RESPONSE: It is an outcome of the experiment; the effect we wish to evaluate.

Ex: Disintegration time, etc.

❑ EFFECT: It is the change in response caused by varying the levels of the factor.

▪ It gives the relationship between various factors & levels.

❑ INTERACTION: It gives the overall effect of two or more variables

Ex: Combined effect of lubricant and glidants on hardness of the tablet.


Optimization Parameters



Problem type: Constrained / Unconstrained

Variable: Dependent / Independent

Constrained – a restriction placed on the system by physical limitations or by simple practicality (e.g., time or economic considerations).

Ex: make the hardest tablet possible, but it must disintegrate in less than 15 minutes.

Unconstrained – no restrictions are placed; in practice, such problems are almost nonexistent.

Ex: make the hardest tablet possible.
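The constrained tablet example above can be sketched as a simple grid search. The two response functions below are invented purely for illustration; they are not real compaction models.

```python
# Hypothetical illustration of a constrained optimization problem:
# maximize tablet hardness subject to disintegration time < 15 min.
# Both response functions are made-up models for demonstration only.

def hardness(force):            # hardness grows with compression force (invented)
    return 2.0 * force ** 0.5

def disintegration_min(force):  # disintegration time also grows with force (invented)
    return 1.5 * force ** 0.8

# Candidate compression-force levels (arbitrary units)
candidates = [f / 10 for f in range(1, 201)]

# Unconstrained problem: simply take the hardest tablet.
best_unconstrained = max(candidates, key=hardness)

# Constrained problem: hardest tablet that still disintegrates in < 15 minutes.
feasible = [f for f in candidates if disintegration_min(f) < 15.0]
best_constrained = max(feasible, key=hardness)

print(best_unconstrained)   # 20.0: top of the force range
print(best_constrained)     # lower: limited by the disintegration constraint
```

The constraint shrinks the feasible region, so the constrained optimum is necessarily no better than the unconstrained one.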


Independent or primary Variables (Input variables):

⁎ Formulation and process variables directly under the control of the formulator. Ex., Level of
a given ingredient , Mixing time for a given process step.

Dependent or secondary Variables (Output variables):

⁎ Responses or the characteristics of the in-progress material or the resulting drug delivery

system. These are a direct result of any change in the formulation or process.


Example of dependent & independent variables in a tablet formulation

  Independent variables         Dependent variables
  X1  Diluent ratio             Y1  Disintegration time
  X2  Compressional force       Y2  Hardness
  X3  Disintegrant level        Y3  Dissolution
  X4  Binder level              Y4  Friability
  X5  Lubricant level           Y5  Weight uniformity





• Experimental design is a concept used to organize, conduct, and interpret the results of experiments in an efficient way, so that useful information is obtained by performing a small number of trials.

• Knowledge of product/process is defined by,

Design space

• It is the multidimensional combination and interaction of input variables and process parameters which have been demonstrated to provide assurance of quality.

Purposes of DoE application
• Screening studies to determine the most influential factors affecting the studied responses;

• Full or fractional designs to quantify factorial effects;

• Response surface studies particularly useful for optimization;

• Mixture designs


• The purpose of an experimental study is to find the relationships between independent variables (factors) and dependent variables (results, outcomes) within the experimental domain.
• Used to simultaneously study the effects of multiple independent variables (factors)
on response variable(s); therefore, it is a multivariate analysis technique.


Screening Designs

• Screening designs are used to identify the most influential factors.

• A huge number of factors, f , can be screened.

• Factors are varied on two levels in a relatively small number of experiments N ≥ f + 1.

• Typical two-level screening designs are fractional factorial or Plackett–Burman designs.

• When the number of factors, f , is small, then full factorial design can also be used for
screening purposes.

• Screening designs allow simultaneous investigation of both qualitative (discrete) and
quantitative (continuous) factors.


• When f factors are varied on two levels, all possible combinations of these
variations make up the two-level full factorial design.

• The number of experiments, N, in this design is N = L^f = 2^f, where L = 2 is the number of levels.

• The designs are usually denoted as L^f; for example, in a 2^3 design, f = 3 factors are varied on L = 2 levels.

• The factor levels are in coded values, the lower factor level is denoted as −1, and 1
stands for the upper factor level.
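In coded levels, a two-level full factorial design is simply every combination of −1 and +1 across the factors; a minimal Python sketch:

```python
# Sketch: enumerate a two-level full factorial design in coded levels (-1, +1).
from itertools import product

def full_factorial(f):
    """All 2**f combinations of f factors at coded levels -1 and +1."""
    return [list(run) for run in product([-1, 1], repeat=f)]

design = full_factorial(3)   # 2**3 = 8 experiments
print(len(design))           # 8
print(design[0])             # [-1, -1, -1]
```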


• The experiments are conducted and results are used to calculate factor effects.

• The factor effects demonstrate to what extent certain factors influence the output (i.e.

studied dependent variable).

• Factor effects are used to build the regression model:

y = β0 + β1x1 + β2x2 + ... + βf xf

where y is the response (dependent variable), β0 the intercept, and βi the regression coefficients (the regression coefficients stand for the factor effects).
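A small illustration of estimating these coefficients by least squares from a 2^2 factorial; the responses are invented example data, and the orthogonal coded design makes the fit exact.

```python
# Sketch: estimate effect coefficients from a 2**2 full factorial design.
import numpy as np
from itertools import product

X = np.array(list(product([-1, 1], repeat=2)), dtype=float)  # coded design
y = np.array([10.0, 14.0, 12.0, 20.0])                       # hypothetical responses

# Model matrix with intercept column: y = b0 + b1*x1 + b2*x2 + b12*x1*x2
M = np.column_stack([np.ones(4), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
beta, *_ = np.linalg.lstsq(M, y, rcond=None)

b0, b1, b2, b12 = beta
print(b0)   # 14.0: the mean response (intercept)
```

Because the coded columns are orthogonal, each coefficient equals half the difference between the average response at the +1 level and at the −1 level of that factor.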


• Full factorial designs allow identification of factor interactions.

• Independent variables, that is, factors can interact meaning that the influence of one
factor on the response is different at different levels of another factor(s).

• Fractional factorial designs are denoted as 2^(f-v), where 1/2^v represents the fraction of the full factorial design that is actually performed (v = 1, 2, 3, ...).

• An example of a fractional factorial design for 4 factors is the 2^(4-1) design. By fractioning the combinations of factor levels, some of the information is lost.
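A sketch of how a 2^(4-1) half-fraction can be built; the generator D = ABC used here is a common choice but is an assumption for illustration.

```python
# Sketch: a 2**(4-1) fractional factorial built with the generator D = A*B*C.
from itertools import product

half_fraction = []
for a, b, c in product([-1, 1], repeat=3):   # full 2**3 design in A, B, C
    d = a * b * c                            # generator: D confounded with ABC
    half_fraction.append((a, b, c, d))

print(len(half_fraction))   # 8 runs instead of 16
```

The confounding is visible in the data itself: the product A·B·C·D equals +1 in every run, so the effect of D cannot be separated from the ABC interaction.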


• Fractional factorial designs do not indicate all potential factor interactions, and if the design is highly fractionated, some factor effects are estimated together (the factors are confounded).

• The Plackett–Burman design allows estimation of factor effects for f = N - 1 factors, where N, the number of experiments, is a multiple of 4.

• These designs are especially useful for preliminary investigations of large numbers of potentially influential factors (e.g., a 2^(7-4) design screens 7 factors in 8 experiments).

• Other special kinds of screening designs are asymmetrical, supersaturated, or
mixed-level designs.

• D-optimal design can also be adapted for screening purposes.


Response surface designs
• Response surface designs are used to analyze effects of the most significant factors that

are recognized by screening studies.

• The number of factors is usually 2 or 3.

• Factors are varied on at least three levels.

• The main goal of response surface designs is optimization.

• Qualitative (discrete) factors cannot be used.

• Response surface designs are accompanied by visual representation of the factors’
influence on the studied response.


Response surface designs

Symmetrical designs:
– Three-level full factorial
– Central composite

Asymmetrical designs:
– D-optimal


Three-level full factorial

• Three-level full factorial design for three factors is represented in figure

• In order to determine the experimental error, the central point is often replicated
several (3–5) times.


Central composite design (CCD)

• Central composite design (CCD) is composed of

– a two-level full factorial design (2^f experiments),

– a star design (2f experiments), and

– a center point.

• N = 2^f + 2f + 1 experiments are needed to examine the f factors.

• The points of the full factorial design are situated at factor levels −1 and +1, those of the
star design at the factor levels 0, − α and + α , and the center point at factor level 0.

• Depending on the value of α , two types of designs exist,

• A face-centered CCD (FCCD) with | α | = 1, and

• A circumscribed CCD (CCCD) with | α | > 1

• In the case of FCCD and CCCD, factors are varied on three or five levels, respectively.
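The CCD point count N = 2^f + 2f + 1 can be verified by enumerating the three point groups directly:

```python
# Sketch: central composite design points for f factors in coded levels.
from itertools import product

def central_composite(f, alpha=1.0):
    factorial = [list(p) for p in product([-1, 1], repeat=f)]  # 2**f cube points
    star = []
    for i in range(f):                                         # 2*f axial points
        for s in (-alpha, alpha):
            point = [0.0] * f
            point[i] = s
            star.append(point)
    center = [[0.0] * f]                                       # 1 center point
    return factorial + star + center

design = central_composite(2, alpha=1.0)   # face-centered CCD (|alpha| = 1)
print(len(design))                         # 2**2 + 2*2 + 1 = 9
```

Passing `alpha` greater than 1 yields the circumscribed variant (CCCD), where the star points lie outside the factorial cube and each factor takes five levels.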


Box-Behnken design

• A BBD is described for a minimum of three factors

• Contains N = 2f(f - 1) + c0 experiments, where c0 is the number of center points

• Most common alternative to the CCD .

• BBDs are second-order designs based on three-level incomplete factorial designs.

• BBD can be presented in a simplified manner as follows


• When there are 5 or more factors, Box and Behnken recommended using all possible 2^3 designs, holding the other factors constant.

• An advantage of the BBD is that it does not contain combinations in which all factors are simultaneously at their highest or lowest levels.
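The BBD runs can be sketched as pairwise ±1 combinations with the remaining factors held at 0, which reproduces the count N = 2f(f - 1) + c0; for large f the published designs use incomplete block structures, so this pairwise construction is illustrative.

```python
# Sketch: Box-Behnken design points (edge midpoints of the cube plus centers);
# the run count follows N = 2*f*(f-1) + c0.
from itertools import combinations, product

def box_behnken(f, c0=1):
    runs = []
    for i, j in combinations(range(f), 2):      # every pair of factors ...
        for a, b in product([-1, 1], repeat=2): # ... varied at -1/+1
            point = [0] * f                     # remaining factors held at 0
            point[i], point[j] = a, b
            runs.append(point)
    runs += [[0] * f] * c0                      # center points
    return runs

design = box_behnken(3, c0=1)
print(len(design))   # 2*3*(3-1) + 1 = 13
```

Note that no run places all factors at ±1 simultaneously, which is exactly the advantage stated above.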


Doehlert (uniform shell) design

• A Doehlert (uniform shell) design has equal distances between all neighbouring experimental points.
• Factors are varied at different numbers of levels, in the same design.

• Enables to select the number of levels for each factor, depending on its nature and
desired experimental domain.

• The Doehlert design describes a spherical experimental domain and stresses
uniformity in space filling.

• For two variables, the design consists of one central point and six points forming a
regular hexagon, and therefore is situated on a circle.

• One variable is varied on five levels, whereas the other is varied on three levels.

• The efficiency of an experimental design is defined as the number of coefficients in the estimated model divided by the number of experiments.


• Asymmetrical designs are used for investigation in the asymmetrical experimental domain.

• Typical examples are D-optimal design.

D-Optimal design:

• D-optimal design is an efficient tool in experimental design.

• Detect the best subset of experiments from a set of candidate points.

• N experiments forming the D-optimal design are selected from the candidate points,
forming a grid over the asymmetrical domain.

• These experiments are the best subset of experiments selected from a candidate set.


The best subsets are selected by the criterion that the chosen design should maximize the determinant of the matrix X'X for a given regression model. These designs are referred to as D-optimal ('D' from determinant).

• In all of the above-described response surface designs, the regression model is defined as:

y = β0 + Σ βi xi + Σ βij xi xj + Σ βii xi^2

where y is the response, β0 the intercept, βi the main coefficients, βij the two-factor interaction coefficients, and βii the quadratic coefficients.
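A minimal least-squares fit of this quadratic model for two factors on a three-level full factorial; the nine response values are invented example data.

```python
# Sketch: fit the full quadratic response-surface model for two factors,
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1**2 + b22*x2**2,
# on a three-level full factorial (9 runs).
import numpy as np
from itertools import product

X = np.array(list(product([-1, 0, 1], repeat=2)), dtype=float)
y = np.array([5.0, 6.0, 8.0, 6.5, 8.0, 10.0, 7.0, 9.5, 13.0])  # hypothetical

x1, x2 = X[:, 0], X[:, 1]
M = np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(M, y, rcond=None)
print(beta.shape)   # six coefficients: b0, b1, b2, b12, b11, b22
```

With 9 runs and 6 coefficients there are 3 degrees of freedom left for assessing lack of fit, which is why three-level designs suit optimization studies.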


Mixture Designs

• Mixture designs are used to study mixture variables such as excipients in a formulation.

• All mixture components are examined in one design.

• The characteristic feature of a mixture is that the sum of all its components adds up to 100%; hence the components cannot be manipulated completely independently of one another.

• Data analysis is more complicated, since mixture factors are correlated.


• Simplex lattice mixture designs can be defined with three or six experiments
(experiments 1–6 in table).

• If experiment 7 is included, then it is a simplex lattice-centroid design and

• If all ten experiments are considered, then it is an augmented simplex lattice–centroid
mixture design.

Experimental points for the mixture design
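The ten candidate points described above can be listed explicitly; the interior axial blend coordinates (2/3, 1/6, 1/6) are the usual augmentation choice but are stated here as an assumption.

```python
# Sketch: candidate points of a three-component simplex design. Points 1-6
# form a {3,2} simplex lattice, point 7 adds the overall centroid, and
# points 8-10 augment it with interior axial blends.
lattice = [
    (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0),   # 1-3: pure components
    (0.5, 0.5, 0.0), (0.5, 0.0, 0.5), (0.0, 0.5, 0.5),   # 4-6: binary 50:50 blends
]
centroid = [(1/3, 1/3, 1/3)]                             # 7: ternary centroid
axial = [(2/3, 1/6, 1/6), (1/6, 2/3, 1/6), (1/6, 1/6, 2/3)]  # 8-10: interior blends

design = lattice + centroid + axial
print(len(design))                                       # 10 experiments
print(all(abs(sum(p) - 1.0) < 1e-9 for p in design))     # True: mixture constraint
```

Every row sums to 1 (i.e. 100%), which is the defining mixture constraint that makes the factors correlated.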


• The three most commonly used mixture designs support

Linear models

– Taken from the axial design.

– Used for screening (or) robustness testing.

Quadratic & Special cubic models

– Derived from simplex centroid designs.

– Used for Optimization purposes.

Three most commonly used mixture designs for three-component mixtures, supporting linear (left), quadratic (center), and special cubic (right) models.


• Mixture designs are of K - 1 dimensionality, where K is the number of factors (mixture components).
• The mixture regions of regular mixtures

– Two component mixtures-line,

– Three component mixtures-triangle,

– Four-component mixtures- tetrahedron.

Constraints are imposed due to limitations on the mixture components.

• Then constraints are defined (e.g. all three mixture components must be present, and
weight ratio of one of the components should not exceed a certain percentage, etc.).


• Domains of different shape within the triangle (tetrahedron, etc.) are then selected.

– Regular mixture designs no longer apply,

– Irregularity in experimental design is best handled with D-optimal design.

• D-optimal design maps the largest possible experimental design for selected model
(linear, quadratic, or special cubic).

• It is therefore necessary to carefully define the purpose of experimental study
(screening, optimization, or robustness testing) prior to selection of adequate
mixture design.




• Over the past decade a small number of visionary scientists have been experimenting with and developing advanced computing techniques.

• These include

– Expert and knowledge-based systems

– Neural computing

– Computer simulation

The idea behind this work is

– to assist the formulation of products with the added benefits of consistent decision
making, decreased timelines, and cost savings.


Expert and Knowledge-based Systems

• “An expert system is a knowledge-based system that emulates expert thought to solve
significant problems in a particular domain of expertise.”

• “An expert system is a computer program that draws upon the knowledge of human experts captured in a knowledge base to solve problems that normally require human expertise.”
• Expert systems comprise an interface allowing two-way communication between the user and the system, a knowledge base where all the knowledge pertaining to the domain is stored, and an inference engine where the knowledge is extracted and manipulated to solve the problem at hand.

• Inferencing strategies may be
– Forward chaining
– Backward chaining


• Other representation methodologies may include

– frames or templates; semantic networks; and decision trees or tables for
organizing knowledge in a tree or tabular format.

• Generally, multiple methods are used to express formulation knowledge.

• Expert Systems can be developed with conventional computer languages such as C
or more recently JAVA, with specialized languages such as LISP and PROLOG, or
with the assistance of development shells or toolkits.

• Shells and toolkits are sets of programs written in either conventional or specialized
languages that can form an expert system when loaded with knowledge.

• They compromise on applicability and flexibility but allow the rapid development
of unique systems.


• The technology is generally referred to as rule-based reasoning (RBR) because it
relates to the structuring and use of knowledge in the form of rules abstracted
during the acquisition process.

• Another technology used to develop expert systems is case-based reasoning (CBR):
To solve a problem, remember a similar problem you have solved in the past and
adapt the solution to solve the new problem [7]. CBR directly uses records of
previous solutions both successful and unsuccessful as its principal knowledge base.


Neural Computing

• The properties of a formulation are determined not only by the ratios in which the
ingredients are combined but also by the processing conditions.

• Formulators have tended to use statistical techniques such as a response surface
methodology to investigate the design space, but optimization by such a method can be
misleading, especially if the formulation is complex.

• Recent advances in mathematics and computer science have resulted in the development
of three techniques,

– Neural networks

– Genetic algorithms

– Fuzzy logic


Neural networks
• Like humans, neural networks learn directly from input data.

• The learning algorithms take two main forms,

– Unsupervised Learning:

– Supervised Learning

• The basic component of the neural network is the neuron, a simple mathematical
processing unit that takes one or more inputs and produces an output.

• The neuron simply computes the weighted sum of all its inputs and calculates an output.

• The output is then modified by means of a transformation function (the PERCEPTRON is such a simple processing unit: a feed-forward system from inputs to outputs only).
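A single neuron of this kind can be sketched in a few lines; the sigmoid transformation function and the weight values are assumptions for illustration.

```python
# Sketch of a single perceptron-style neuron: a weighted sum of the inputs
# passed through a transformation (activation) function. Weights are arbitrary.
import math

def neuron(inputs, weights, bias):
    s = sum(x * w for x, w in zip(inputs, weights)) + bias  # weighted sum
    return 1.0 / (1.0 + math.exp(-s))                       # sigmoid transform

out = neuron([0.5, 0.2], weights=[0.8, -0.4], bias=0.1)
print(0.0 < out < 1.0)   # True: the sigmoid output lies between 0 and 1
```

A multilayer perceptron (MLP) network is built by feeding the outputs of one layer of such neurons forward as the inputs of the next.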


• A neural network consists of many neurons organized into a structure called the network topology.
• Most popular and successful is the multilayer perceptron (MLP) network.

Genetic Algorithms

• An optimization technique based on concepts of biological evolution.

• Requires a concept of “fitness,” which is assessed by user-specified goals.

• Genetic algorithms work with a population of individuals, each of which is a candidate
solution to the problem.

• Each individual’s “fitness” is assessed, and if an optimum solution is not found, then a
further generation of possible solutions is produced by combining large chunks of the
fittest initial solutions by a crossover operation (mimicking mating and reproduction).

• After many generations, an optimum solution will be found.
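A toy genetic algorithm illustrating fitness ranking, crossover of large chunks of the fittest solutions, and mutation; the all-ones bit-string target is an invented fitness criterion used only to keep the example self-contained.

```python
# Minimal genetic-algorithm sketch: evolve bit strings toward all ones.
# "Fitness" is the number of ones; crossover splices chunks of two parents.
import random

random.seed(0)
GENES, POP, GENERATIONS = 16, 20, 60

def fitness(ind):
    return sum(ind)                             # user-specified goal (invented)

def crossover(p1, p2):
    cut = random.randrange(1, GENES)            # splice large chunks of parents
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.05):
    return [1 - g if random.random() < rate else g for g in ind]

# Initial population of random candidate solutions.
population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)  # assess each individual's fitness
    parents = population[:POP // 2]             # fittest half survives
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children             # next generation

best = max(population, key=fitness)
print(fitness(best))   # approaches GENES after many generations
```

Keeping the fittest parents unchanged (elitism) guarantees the best fitness never decreases from one generation to the next.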


Fuzzy logic

• In defining the concept of “fitness,” fuzzy logic provides a useful framework for
describing complex formulation goals.

• Fuzzy logic, as the name implies, blurs the clear-cut “true” and “false” of conventional
“crisp” logic by assigning a noninteger number that describes the “membership” in a
particular set as somewhere between 0 (false) and 1 (true)

• If the formulator is seeking a tablet with a disintegration time of less than 300 seconds, one with a disintegration time of (say) 310 seconds will not be rejected outright, but will be assigned a desirability of somewhat less than 100% according to its membership in the “low” set.
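The disintegration-time example can be expressed as a fuzzy membership (desirability) function; the 60-second tolerance band used below is an assumed value for illustration.

```python
# Sketch: fuzzy desirability for "disintegration time below 300 s". Instead of
# a crisp accept/reject cut-off, membership falls linearly from 1 to 0 over an
# assumed 300-360 s tolerance band.
def desirability(t, target=300.0, tolerance=60.0):
    if t <= target:
        return 1.0                          # fully acceptable (membership 1)
    if t >= target + tolerance:
        return 0.0                          # fully unacceptable (membership 0)
    return 1.0 - (t - target) / tolerance   # partial membership in between

print(desirability(290))   # 1.0
print(desirability(310))   # about 0.83: not rejected outright
print(desirability(400))   # 0.0
```

Such noninteger memberships can then serve directly as the “fitness” values consumed by a genetic algorithm.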



Computer Simulation

• Simulation is best described as the process of translating a real system into a
working model in order to run experiments.

• A simulation does not duplicate a system; rather it is an abstraction of reality using
mathematics to express cause-and-effect relationships that determine the behaviour
of the system.

• Software for computer simulation is often customized and based on that developed
in academia.

• There are not many commercial packages available for Pharmaceutical formulation.





• Concept of formulation development assisted by computer applications.

• Due to their complex composition, preparation, and stability issues, emulsions were selected to
showcase various computer-aided tools in pharmaceutical formulation development.

• Successful development of an emulsion formulation is dependent on both formulation
ingredients and processing parameters.

• The examples provided illustrate techniques

– Used to define a design space,

– Select the appropriate formulation ingredient,

– Optimize the formulation composition and process parameters, according to the quality-
by-design (QbD) concept.

• Importantly, methods that allow simultaneous optimization of multiple factors are also described.

• These examples provide a deeper insight into selected in silico methods.


Application of Computer- aided techniques in
development of Pharmaceutical Emulsions

• Emulsions are disperse systems made of two immiscible liquids.

• One liquid is dispersed into the other in the presence of surface-active agents (surfactants).
• The two immiscible liquids are usually oil and water.

• The main types of simple emulsions are oil- in-water (o/w) or water- in-oil (w/o).

• Emulsions have a great potential as vehicles for active ingredients for different routes of
administration (topical, parenteral, oral).


Application of different in silico techniques in process
and formulation optimization

• Factorial design methods were used to optimize the stability and suggest the required hydrophilic-lipophilic balance (HLB) of o/w emulsions prepared with sodium lauryl sulfate as the emulsifier.
• Influence of the processing variables on performance of o/w emulsion gels stabilized by a
polymeric emulsifier (Pemulen® TR-2NF). A two-factor three-level experimental design at
two sets was applied.

• Optimal preservative combination and concentration for preparing topical emulsions by using
a D-optimal experimental design (mixture design).

• Influence of different ratios of individual components on the viscoelastic behaviour of
semisolid lipophilic emulsion systems using ANN technique.

• Formulation optimization of the w/o/w multiple emulsion incorporating insulin was
performed, based on statistical methods such as the orthogonal experimental design and the
response surface evaluation.


Application of computer- aided techniques in
development of microemulsion drug carriers

• Microemulsions are thermodynamically stable and optically isotropic transparent colloidal
systems consisting of water, oil, and surfactant.

• Although they are clear, low-viscosity liquids, different types of microstructures can be
identified (i.e. w/o, o/w, and bicontinuous), all organized at a level below 100 nm.

• Microemulsions and self-microemulsifying drug delivery systems (SMEDDS) form only
in well-balanced mixtures of the selected excipients and within specific concentration
ranges of the constituents at given temperatures and pressures (i.e. within the
microemulsion region).
• ANN models were introduced as useful tools for accurate differentiation and prediction of
the microemulsion area.





Legal Protection of Innovative Uses of Computers in R&D

• The term “Intellectual Property Rights” is used to describe the legal instruments for
protecting innovation.

• Almost all countries recognize the basic types of laws governing intellectual property.

• Member states of the World Trade Organisation have all committed to introduce IPR.


Types of Intellectual Property Rights (IPR)

(Maximum lifetimes are general and may vary from country to country.)

• Patent: protects technical ideas. Maximum lifetime: 20 years from filing.

• Copyright: protects literary works, including computer programs. 70 years from the death of the author, or from the date of creation in the case of joint works.

• Database rights: protect collections of data (exist only in the European Union and some other countries; the US is discussing a proposal). 70 years from the date of creation.

• Trade secrets: protect secret, nondisclosed information. Unlimited, as long as access is limited to a select group.

• Design: protects aesthetic creations (generally not relevant in the pharmaceutical field). Varies from country to country: 25 years in the European Union from application; 14 years in the United States from grant.

• Trademarks: protect a brand name or sign. Unlimited, as long as the trademark remains in use.


Enforcement of Rights

• Obtaining IP protection is only the first step.

• The intellectual property rights obtained are only useful if they can be exploited
and, ultimately, if unauthorized users of the rights can be stopped from exploiting them.
• This presents a fairly unique problem in the computer science field.

• IP rights are essentially national rights.

• They are only valid in the country in which they are granted or registered.

• Any scientist using unauthorized patented computer software in pharmaceutical
research and development would be infringing the patent even if the computer (for
example, a database server) was located in another country.



• The use of computers in developing new pharmaceutical products is nowadays
commonplace, and a number of tools and databases have been developed to
improve their use.

• Although intellectual property rights have to date rarely been the subject of court
cases, protection is available and the courts are prepared to enforce these rights,
even in an international context.


References

• Sean Ekins (ed.), Computer Applications in Pharmaceutical Research and Development, John Wiley & Sons, 2006.

• Jelena Djuris (ed.), Computer-Aided Applications in Pharmaceutical Technology, 1st edition, Woodhead Publishing.

• Gilbert S. Banker and Christopher T. Rhodes (eds.), Modern Pharmaceutics.