TLBO codes

Upload: sinlaugh

Post on 13-Oct-2015

DESCRIPTION

This is MATLAB code for the TLBO algorithm.

TRANSCRIPT

  • This file contains information about the Teaching-Learning-Based Optimization (TLBO) algorithm and its codes, developed by Dr. R. Venkata Rao and his team at S.V. National Institute of Technology, Surat, India, for use by the interested research community across the globe.

    Teaching-learning-based optimization (TLBO) algorithm

    All evolutionary and swarm-intelligence-based algorithms require common controlling parameters such as population size and number of generations. Besides these common control parameters, different algorithms require their own algorithm-specific control parameters. For example, GA uses mutation rate and crossover rate; PSO uses inertia weight and the social and cognitive parameters; ABC uses the number of bees (employed, scout and onlooker) and the limit; HS requires the harmony memory consideration rate, the pitch adjusting rate and the number of improvisations; ACO requires exponent parameters, the pheromone evaporation rate and the reward factor; and so on. The difficulty of selecting algorithm-specific control parameters often increases with modifications and hybridization. Proper tuning of the algorithm-specific parameters is a crucial factor affecting the performance of optimization algorithms.

    Improper tuning of algorithm-specific parameters either increases the computational effort or yields a locally optimal solution. Considering this aspect, we recently introduced the Teaching-Learning-Based Optimization (TLBO) algorithm, which does not require any algorithm-specific control parameters. TLBO requires only common controlling parameters such as population size and number of generations (and elite size, if considered) for its working. Thus, TLBO can be called an algorithm-specific parameter-less algorithm.

    Since 2012, a number of researchers have published papers using the TLBO algorithm in various international journals and conference proceedings. One may refer to SCOPUS or any such database for more details. A list of papers published by our team in international journals is given below.

  • Parameter optimization of modern machining processes using teaching-learning-based optimization algorithm. Engineering Applications of Artificial Intelligence (Elsevier), Volume 26, Issue 1, January 2013, Pages 524-531. R. Venkata Rao, V.D. Kalyankar.

  • Multi-pass turning process parameters optimization using teaching-learning-based optimization algorithm. Scientia Iranica (Elsevier), available online 10 January 2013. R.V. Rao, V.D. Kalyankar.

  • Multi-objective optimization of heat exchangers using a modified teaching-learning-based optimization algorithm. Applied Mathematical Modelling (Elsevier), Volume 37, Issue 3, 1 February 2013, Pages 1147-1162. R. Venkata Rao, Vivek Patel.

  • Multi-objective optimization of two stage thermoelectric cooler using a modified teaching-learning-based optimization algorithm. Engineering Applications of Artificial Intelligence (Elsevier), Volume 26, Issue 1, January 2013, Pages 430-445. R. Venkata Rao, Vivek Patel.

  • Comments on "A note on teaching-learning-based optimization algorithm". Information Sciences (Elsevier), available online 27 November 2012. Gajanan Waghmare. (This paper is included in this list as it provides a better understanding of the TLBO algorithm.)

  • An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems. Scientia Iranica (Elsevier), available online 20 December 2012. R. Venkata Rao, Vivek Patel.

  • Teaching-Learning-Based Optimization: An optimization method for continuous non-linear large scale problems. Information Sciences (Elsevier), Volume 183, Issue 1, 15 January 2012, Pages 1-15. R.V. Rao, V.J. Savsani, D.P. Vakharia.

  • Parameter optimization of machining processes using teaching-learning-based optimization algorithm. International Journal of Advanced Manufacturing Technology (Springer), 2012, DOI: 10.1007/s00170-012-4524-2. P.J. Pawar, R. Venkata Rao.

  • Teaching-learning-based optimization algorithm for unconstrained and constrained real-parameter optimization problems. Engineering Optimization (Taylor & Francis), Volume 44, Issue 12, 2012, Pages 1447-1462. R.V. Rao, V.J. Savsani, J. Balic.

  • An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems. International Journal of Industrial Engineering Computations, Volume 3, Issue 4, 2012, Pages 535-560. R.V. Rao, V. Patel.

    This paper contains the elitist TLBO code, and the paper may be downloaded from

    www.growingscience.com/ijiec/Vol3/IJIEC_2012_37.pdf

  • Comparative performance of an elitist teaching-learning-based optimization algorithm for solving unconstrained optimization problems. International Journal of Industrial Engineering Computations, Volume 4, 2012, DOI: 10.5267/j.ijiec.2012.09.001. R.V. Rao, V. Patel.

    This paper contains the elitist TLBO code, and the paper may be downloaded from

    www.growingscience.com/ijiec/IJIEC_2012_62.pdf

  • Parameter optimization of machining processes using a new optimization method. Materials and Manufacturing Processes, Volume 27, Issue 9, 2012, Pages 978-985. R.V. Rao, V.D. Kalyankar.

  • Multi-objective multi-parameter optimization of the industrial LBW process using a new optimization algorithm. Journal of Engineering Manufacture, Volume 226, Issue 6, 2012, Pages 1018-1025. R.V. Rao, V.D. Kalyankar.

  • Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Computer-Aided Design, Volume 43, Issue 3, March 2011, Pages 303-315. R.V. Rao, V.J. Savsani, D.P. Vakharia. (This paper received the Elsevier CAD Most Cited Award 2013.)

    TLBO is an algorithm inspired by the teaching-learning process, based on the influence of a teacher on the output of the learners in a class. Teacher and learners are the two vital components of the algorithm, and they describe its two basic modes of learning: through the teacher (known as the teacher phase) and through interaction with the other learners (known as the learner phase). The output of the TLBO algorithm is considered in terms of the results or grades of the learners, which depend on the quality of the teacher. The teacher is therefore considered a highly learned person who trains the learners so that they can achieve better results in terms of their marks or grades. Moreover, learners also learn from interaction among themselves, which further helps improve their results.

    TLBO is a population-based method. In this optimization algorithm, a group of learners is considered as the population, the different design variables are treated as the different subjects offered to the learners, and a learner's result is analogous to the fitness value of the optimization problem. The best solution in the entire population is considered as the teacher. The working of TLBO is divided into two parts, the teacher phase and the learner phase, explained below.

    Teacher phase

    This phase of the algorithm simulates the learning of the students (i.e. learners) through the teacher. During this phase the teacher conveys knowledge to the learners and puts in effort to increase the mean result of the class. Suppose there are m subjects (i.e. design variables) offered to n learners (i.e. population size, k = 1, 2, ..., n). At any sequential teaching-learning cycle i, let Mj,i be the mean result of the learners in a particular subject j (j = 1, 2, ..., m). Since a teacher is the most experienced and knowledgeable person on a subject, the best learner in the entire population is considered as the teacher in the algorithm. Let Xtotal-kbest,i be the result of the best learner considering all the subjects, who is identified as the teacher for that cycle. The teacher will put in maximum effort to increase the knowledge level of the whole class, but learners gain knowledge according to the quality of the teaching delivered and the quality of the learners

    present in the class. Considering this fact, the difference between the result of the teacher and the mean result of the learners in each subject is expressed as

    Difference_Meanj,k,i = ri (Xj,kbest,i - TF Mj,i)   (1)

    where Xj,kbest,i is the result of the teacher (i.e. the best learner) in subject j, TF is the teaching factor which decides the value of the mean to be changed, and ri is a random number in the range [0, 1]. The value of TF can be either 1 or 2, and is decided randomly as

    TF = round[1 + rand(0,1){2 - 1}]   (2)

    TF is not a parameter of the TLBO algorithm. Its value is not given as an input to the algorithm; it is decided randomly by the algorithm using Eq. (2).

    Based on Difference_Meanj,k,i, the existing solution is updated in the teacher phase according to the following expression:

    X'j,k,i = Xj,k,i + Difference_Meanj,k,i   (3)

    where X'j,k,i is the updated value of Xj,k,i. X'j,k,i is accepted if it gives a better function value. All the accepted function values at the end of the teacher phase are maintained, and these values become the input to the learner phase.

    Learner phase

    This phase of the algorithm simulates the learning of the students through interaction among themselves. Students can also gain knowledge by discussing and interacting with other students. A learner learns new information if another learner has more knowledge than him or her. The learning phenomenon of this phase is expressed below.

    Two learners P and Q are selected at random such that X'total-P,i ≠ X'total-Q,i (where X'total-P,i and X'total-Q,i are the updated values of Xtotal-P,i and Xtotal-Q,i, respectively, at the end of the teacher phase):

    X''j,P,i = X'j,P,i + ri (X'j,P,i - X'j,Q,i), if X'total-P,i > X'total-Q,i   (4)

    X''j,P,i = X'j,P,i + ri (X'j,Q,i - X'j,P,i), if X'total-Q,i > X'total-P,i   (5)

    (The above equations are for a maximization problem; the reverse holds for a minimization problem.)

    X''j,P,i is accepted if it gives a better function value. The flow chart of the TLBO algorithm is shown in Fig. 1.
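The two phases can be sketched end-to-end as follows. This is a minimal Python translation for a minimization problem (the function name `tlbo` and all parameter names are illustrative, not taken from the MATLAB listings later in this document; random numbers are drawn per dimension, and each phase accepts a candidate only if it improves the objective):

```python
import random

def tlbo(objective, lb, ub, pop_size=20, iterations=100, rng=None):
    """Minimal TLBO sketch for minimization: teacher phase then learner phase."""
    rng = rng or random.Random()
    dim = len(lb)
    pop = [[rng.uniform(lb[k], ub[k]) for k in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]

    def clamp(x):
        return [min(max(x[k], lb[k]), ub[k]) for k in range(dim)]

    for _ in range(iterations):
        # teacher phase: the best learner pulls the class toward itself
        teacher = pop[fit.index(min(fit))]
        mean = [sum(x[k] for x in pop) / pop_size for k in range(dim)]
        for i in range(pop_size):
            TF = rng.randint(1, 2)  # teaching factor, Eq. (2)
            cand = clamp([pop[i][k] + rng.random() * (teacher[k] - TF * mean[k])
                          for k in range(dim)])
            f = objective(cand)
            if f < fit[i]:          # greedy acceptance
                pop[i], fit[i] = cand, f
        # learner phase: each learner interacts with a random partner
        for i in range(pop_size):
            j = rng.randrange(pop_size)
            if j == i:
                continue
            if fit[i] < fit[j]:     # i is better: move away from j
                step = [pop[i][k] - pop[j][k] for k in range(dim)]
            else:                   # j is better: move toward j
                step = [pop[j][k] - pop[i][k] for k in range(dim)]
            cand = clamp([pop[i][k] + rng.random() * step[k] for k in range(dim)])
            f = objective(cand)
            if f < fit[i]:
                pop[i], fit[i] = cand, f
    return min(fit)

# sphere function as a stand-in objective over 5 variables in [-10, 10]
best = tlbo(lambda x: sum(v * v for v in x), [-10.0] * 5, [10.0] * 5,
            rng=random.Random(0))
```

Because both phases use a greedy acceptance rule, the best objective value can never worsen from one cycle to the next; the only inputs are the common controlling parameters (population size and number of iterations).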

  • Fig. 1. Flowchart of the TLBO algorithm. (The figure itself is not reproduced here; its boxes read: initialize the population, design variables and termination criterion; evaluate the initial population; select the best solution; calculate the mean of each design variable; calculate the Difference_Mean and modify the solutions based on the best solution; if a new solution is better than the existing one accept it, otherwise keep the previous solution; select the solutions randomly and modify them by comparing with each other, again accepting only improvements; if the termination criterion is fulfilled, report the final value of the solution, otherwise repeat.)

  • Elitist TLBO

    The concept of elitism is utilized in most evolutionary and swarm intelligence algorithms: during every generation the worst solutions are replaced by the elite solutions. In the TLBO algorithm, after replacing the worst solutions with the elite solutions at the end of the learner phase, any duplicate solutions must be modified in order to avoid getting trapped in local optima. In the present work, duplicate solutions are modified by mutation on randomly selected dimensions before executing the next generation.
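The replacement-plus-deduplication step described above can be sketched as follows (a hypothetical Python helper, not taken from the MATLAB listings; duplicates are detected by comparing rounded coordinates, and a duplicate is repaired by redrawing one randomly chosen dimension within its bounds):

```python
import random

def apply_elitism(pop, fit, elite_pop, elite_fit, lb, ub, rng):
    """Replace the worst solutions with stored elites, then mutate any
    duplicate solution on one random dimension so the population stays diverse."""
    if elite_pop:
        order = sorted(range(len(pop)), key=lambda i: fit[i])  # best first
        for e, i in enumerate(order[-len(elite_pop):]):        # worst slots
            pop[i], fit[i] = list(elite_pop[e]), elite_fit[e]
    seen = set()
    for i, x in enumerate(pop):
        key = tuple(round(v, 12) for v in x)
        if key in seen:                        # duplicate: mutate one dimension
            k = rng.randrange(len(x))
            x[k] = rng.uniform(lb[k], ub[k])
            fit[i] = None                      # marks it for re-evaluation
        else:
            seen.add(key)
    return pop, fit

# two identical learners plus one poor one; a single elite replaces the worst
pop, fit = apply_elitism([[1.0, 1.0], [1.0, 1.0], [5.0, 5.0]],
                         [2.0, 2.0, 50.0],
                         [[0.0, 0.0]], [0.0],
                         [-10.0, -10.0], [10.0, 10.0],
                         random.Random(1))
```

Here `fit[i] = None` flags the mutated learner; in a full implementation the objective would be re-evaluated for flagged learners before the next generation begins.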

    The flow chart of the elitist TLBO algorithm is shown in Fig. 2.

  • Fig. 2. Flowchart of the elitist TLBO algorithm. (The figure itself is not reproduced here; it repeats the steps of Fig. 1 with the elitist additions: keep the elite solutions after selecting the best solution, replace the worst solutions with the elite solutions after the learner phase, and remove duplicate solutions before checking the termination criterion.)

  • The codes for unconstrained and constrained functions using the TLBO and elitist TLBO algorithms are given below. The user has to create separate MATLAB files in each case (all saved in a single folder), and can define the ranges, variables and objective function(s) as per his or her requirements. The user may contact Prof. R. Venkata Rao for more details ([email protected]).

    (i). TLBO Code for Unconstrained Functions

    %% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

function [ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = Define_range
format long;
ini_fun = @Define_rangeDefine_variables;
result_fun = @Define_rangeresult;
result_fun_new = @Define_rangeresult_new;
opti_fun = @Define_rangeopti;
opti_fun_new = @Define_rangeopti_new;
return;

function [upper_limit, lower_limit, Students, select] = Define_rangeDefine_variables(select)
global lower_limit upper_limit ll ul
Granularity = 1;
lower_limit = ll;
upper_limit = ul;

ll = [-10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10];
ul = [10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10];

lower_limit = ll;
upper_limit = ul;
for popindex = 1 : select.classsize
    for k = 1 : select.var_num
        mark(k) = ll(k) + (ul(k) - ll(k)) * rand;
    end
    Students(popindex).mark = mark;
end
select.OrderDependent = true;
return;

function [Students] = Define_rangeresult(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : classsize
    for k = 1 : select.var_num
        x(k) = Students(popindex).mark(k);
    end
    Students(popindex).result = objective(x);
end
return

function [Studentss] = Define_rangeresult_new(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : size(Students,1)
    for k = 1 : select.var_num
        x(k) = Students(popindex,k);
    end
    Studentss(popindex) = objective(x);
end
return

function [Students] = Define_rangeopti(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : select.classsize
    for k = 1 : select.var_num
        Students(i).mark(k) = max(Students(i).mark(k), ll(k));
        Students(i).mark(k) = min(Students(i).mark(k), upper_limit(k));
    end
end
return;

function [Students] = Define_rangeopti_new(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : size(Students,1)
    for k = 1 : select.var_num
        Students(i,k) = max(Students(i,k), ll(k));
        Students(i,k) = min(Students(i,k), upper_limit(k));
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun, RandSeed)
format long;
select.classsize = 20;
select.var_num = 30;
select.itration = 200;
if ~exist('RandSeed', 'var')
    rand_gen = round(sum(100*clock));
end
rand('state', rand_gen)
[ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = obj_fun();
[upper_limit, lower_limit, Students, select] = ini_fun(select);
Students = remove_duplicate(Students, upper_limit, lower_limit);
Students = result_fun(select, Students);
Students = sortstudents(Students);
average_result = result_avg(Students);
min_result = [Students(1).result];
avg_result = [average_result];
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function yy = objective(x)
% Rastrigin function in D = 30 dimensions (6.28 approximates 2*pi)
D = 30;
for ikl = 1 : D
    p(ikl) = x(ikl);
end
sum1 = 0;
for ikl = 1 : D
    z1 = (p(ikl))^2;
    z2 = -10*(cos(6.28*p(ikl)));
    sum1 = sum1 + z1 + z2 + 10;
end
z4 = sum1;
yy = z4;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students] = remove_duplicate(Students, upper_limit, lower_limit)
format long;
global ll ul
for i = 1 : length(Students)
    Mark_1 = sort(Students(i).mark);
    for k = i+1 : length(Students)
        Mark_2 = sort(Students(k).mark);
        if isequal(Mark_1, Mark_2)
            m_new = floor(1 + (length(Students(k).mark)-1)*rand);
            if length(upper_limit) == 1
                Students(k).mark(m_new) = lower_limit + (upper_limit - lower_limit) * rand;
            else
                Students(k).mark(m_new) = ll(m_new) + (upper_limit(m_new) - ll(m_new)) * rand;
            end
        end
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [result_av, within_bound] = result_avg(Students)
format long;
Result = [];
within_bound = 0;
for i = 1 : length(Students)
    if Students(i).result < inf
        Result = [Result Students(i).result];
        within_bound = within_bound + 1;
    end
end
result_av = mean(Result);
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function run_TLBO()
clear all;
clc;
run = 1;
format long;
for i = 1 : run
    TLBO(@Define_range);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, indices] = sortstudents(Students)
classsize = length(Students);
Result = zeros(1, classsize);
indices = zeros(1, classsize);
for i = 1 : classsize
    Result(i) = Students(i).result;
end
[Result, indices] = sort(Result, 2, 'ascend');
Marks = zeros(classsize, length(Students(1).mark));
for i = 1 : classsize
    Marks(i, :) = Students(indices(i)).mark;
end
for i = 1 : classsize
    Students(i).mark = Marks(i,:);
    Students(i).result = Result(i);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function TLBO(obj_fun, note1, note2)
format long;
global ll
if ~exist('note1', 'var')
    note1 = true;
end
if ~exist('note2', 'var')
    note2 = true;
end
[Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun);
ul = upper_limit;
ll = lower_limit;
for COMP = 1 : select.itration
    for i = 1 : length(Students)
        cs(i,:) = Students(i).mark;
        cs_result(i) = Students(i).result;
    end
    cs_new = cs;
    Stdev_cs = std(cs);
    Mean_cs = mean(cs);
    % teacher phase
    for i = 1 : length(Students)
        mean_result = mean(cs);
        Stdev_cs = std(cs);
        TF = round(1 + rand*(1));
        [r1 r2] = sort(cs_result);
        best = cs(r2(1),:);
        for k = 1 : select.var_num
            Stdev_cs(i,k) = std(cs(i,k));
            cs_new(i,k) = cs(i,k) + ((best(1,k) - TF*mean_result(k))*rand) - (Stdev_cs(1,k) - Stdev_cs(i,k))*rand;
        end
        cs_new(i,:) = opti_fun_new(select, cs_new(i,:));
        cs_new_result(i) = result_fun_new(select, cs_new(i,:));
        if cs_new_result(i) < cs_result(i)  % condition restored from the parallel constrained listing
            cs(i,:) = cs_new(i,:);
            Students(i).result = cs_new_result(i);
        end
    end
    n = length(Students);
    Students = opti_fun(select, Students);
    Students = result_fun(select, Students);
    Students = sortstudents(Students);
    % if rand ... (the learner phase and the remainder of this function are
    % truncated in the source transcript)
end

(ii). Elitist TLBO (ETLBO) Code for Unconstrained Functions

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
% (the opening lines of Define_range are truncated in the source transcript)
function [upper_limit, lower_limit, Students, select] = Define_rangeDefine_variables(select)
global lower_limit upper_limit ll ul
Granularity = 1;
lower_limit = ll;
upper_limit = ul;

ll = [-10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10 -10];
ul = [10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10 10];

lower_limit = ll;
upper_limit = ul;
for popindex = 1 : select.classsize
    for k = 1 : select.var_num
        mark(k) = ll(k) + (ul(k) - ll(k)) * rand;
    end
    Students(popindex).mark = mark;
end
select.OrderDependent = true;
return;

function [Students] = Define_rangeresult(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : classsize
    for k = 1 : select.var_num
        x(k) = Students(popindex).mark(k);
    end
    Students(popindex).result = objective(x);
end
return

function [Studentss] = Define_rangeresult_new(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : size(Students,1)
    for k = 1 : select.var_num
        x(k) = Students(popindex,k);
    end
    Studentss(popindex) = objective(x);
end
return

function [Students] = Define_rangeopti(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : select.classsize
    for k = 1 : select.var_num
        Students(i).mark(k) = max(Students(i).mark(k), ll(k));
        Students(i).mark(k) = min(Students(i).mark(k), upper_limit(k));
    end
end
return;

function [Students] = Define_rangeopti_new(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : size(Students,1)
    for k = 1 : select.var_num
        Students(i,k) = max(Students(i,k), ll(k));
        Students(i,k) = min(Students(i,k), upper_limit(k));
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun, RandSeed)
format long;
select.classsize = 20;
select.var_num = 30;
select.itration = 200;
if ~exist('RandSeed', 'var')
    rand_gen = round(sum(100*clock));
end
rand('state', rand_gen)
[ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = obj_fun();
[upper_limit, lower_limit, Students, select] = ini_fun(select);
Students = remove_duplicate(Students, upper_limit, lower_limit);
Students = result_fun(select, Students);
Students = sortstudents(Students);
average_result = result_avg(Students);
min_result = [Students(1).result];
avg_result = [average_result];
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function ETLBO(obj_fun, note1, note2)
format long;
global ll
if ~exist('note1', 'var')
    note1 = true;
end
if ~exist('note2', 'var')
    note2 = true;
end
[Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun);
ul = upper_limit;
ll = lower_limit;
elite = 4;  % number of elite solutions kept each generation
for COMP = 1 : select.itration
    for i = 1 : elite
        markelite(i,:) = Students(i).mark;
        resultelite(i) = Students(i).result;
    end
    for i = 1 : length(Students)
        cs(i,:) = Students(i).mark;
        cs_result(i) = Students(i).result;
    end
    cs_new = cs;
    Stdev_cs = std(cs);
    Mean_cs = mean(cs);
    % teacher phase
    for i = 1 : length(Students)
        mean_result = mean(cs);
        Stdev_cs = std(cs);
        TF = round(1 + rand*(1));
        [r1 r2] = sort(cs_result);
        best = cs(r2(1),:);
        for k = 1 : select.var_num
            Stdev_cs(i,k) = std(cs(i,k));
            cs_new(i,k) = cs(i,k) + ((best(1,k) - TF*mean_result(k))*rand) - (Stdev_cs(1,k) - Stdev_cs(i,k))*rand;
        end
        cs_new(i,:) = opti_fun_new(select, cs_new(i,:));
        cs_new_result(i) = result_fun_new(select, cs_new(i,:));
        if cs_new_result(i) < cs_result(i)  % condition restored from the parallel constrained listing
            cs(i,:) = cs_new(i,:);
            Students(i).result = cs_new_result(i);
        end
    end
    % (the learner phase and the elite replacement are truncated in the
    % source transcript; only fragments of them survive)
end

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function yy = objective(x)
% Rastrigin function in D = 30 dimensions (6.28 approximates 2*pi)
D = 30;
for ikl = 1 : D
    p(ikl) = x(ikl);
end
sum1 = 0;
for ikl = 1 : D
    z1 = (p(ikl))^2;
    z2 = -10*(cos(6.28*p(ikl)));
    sum1 = sum1 + z1 + z2 + 10;
end
z4 = sum1;
yy = z4;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students] = remove_duplicate(Students, upper_limit, lower_limit)
format long;
global ll ul
for i = 1 : length(Students)
    Mark_1 = sort(Students(i).mark);
    for k = i+1 : length(Students)
        Mark_2 = sort(Students(k).mark);
        if isequal(Mark_1, Mark_2)
            m_new = floor(1 + (length(Students(k).mark)-1)*rand);
            if length(upper_limit) == 1
                Students(k).mark(m_new) = lower_limit + (upper_limit - lower_limit) * rand;
            else
                Students(k).mark(m_new) = ll(m_new) + (upper_limit(m_new) - ll(m_new)) * rand;
            end
        end
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [result_av, within_bound] = result_avg(Students)
format long;
Result = [];
within_bound = 0;
for i = 1 : length(Students)
    if Students(i).result < inf
        Result = [Result Students(i).result];
        within_bound = within_bound + 1;
    end
end
result_av = mean(Result);
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function run_ETLBO()
clear all;
clc;
run = 1;
format long;
for i = 1 : run
    ETLBO(@Define_range);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, indices] = sortstudents(Students)
classsize = length(Students);
Result = zeros(1, classsize);
indices = zeros(1, classsize);
for i = 1 : classsize
    Result(i) = Students(i).result;
end
[Result, indices] = sort(Result, 2, 'ascend');
Marks = zeros(classsize, length(Students(1).mark));
for i = 1 : classsize
    Marks(i, :) = Students(indices(i)).mark;
end
for i = 1 : classsize
    Students(i).mark = Marks(i,:);
    Students(i).result = Result(i);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

    (iii). TLBO Code for Constrained Functions

    %% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

function [ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = Define_range
format long;
ini_fun = @Define_rangeDefine_variables;
result_fun = @Define_rangeresult;
result_fun_new = @Define_rangeresult_new;
opti_fun = @Define_rangeopti;
opti_fun_new = @Define_rangeopti_new;
return;

function [upper_limit, lower_limit, Students, select] = Define_rangeDefine_variables(select)
global lower_limit upper_limit ll ul
Granularity = 1;
lower_limit = ll;
upper_limit = ul;

ll = [0 0 0 0 0 0 0 0 0 0 0 0 0];
ul = [1 1 1 1 1 1 1 1 1 100 100 100 1];

lower_limit = ll;
upper_limit = ul;
for popindex = 1 : select.classsize
    for k = 1 : select.var_num
        mark(k) = ll(k) + (ul(k) - ll(k)) * rand;
    end
    Students(popindex).mark = mark;
end
select.OrderDependent = true;
return;

function [Students] = Define_rangeresult(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : classsize
    for k = 1 : select.var_num
        x(k) = Students(popindex).mark(k);
    end
    Students(popindex).result = objective(x);
end
return

function [Studentss] = Define_rangeresult_new(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : size(Students,1)
    for k = 1 : select.var_num
        x(k) = Students(popindex,k);
    end
    Studentss(popindex) = objective(x);
end
return

function [Students] = Define_rangeopti(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : select.classsize
    for k = 1 : select.var_num
        Students(i).mark(k) = max(Students(i).mark(k), ll(k));
        Students(i).mark(k) = min(Students(i).mark(k), upper_limit(k));
    end
end
return;

function [Students] = Define_rangeopti_new(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : size(Students,1)
    for k = 1 : select.var_num
        Students(i,k) = max(Students(i,k), ll(k));
        Students(i,k) = min(Students(i,k), upper_limit(k));
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun, RandSeed)
format long;
select.classsize = 50;
select.var_num = 13;
select.itration = 500;
if ~exist('RandSeed', 'var')
    rand_gen = round(sum(100*clock));
end
rand('state', rand_gen)
[ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = obj_fun();
[upper_limit, lower_limit, Students, select] = ini_fun(select);
Students = remove_duplicate(Students, upper_limit, lower_limit);
Students = result_fun(select, Students);
Students = sortstudents(Students);
average_result = result_avg(Students);
min_result = [Students(1).result];
avg_result = [average_result];
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function yy = objective(x)
% Constrained benchmark function: 13 variables, 9 inequality constraints
for ikl = 1 : 13
    p(ikl) = x(ikl);
end
sum1 = 0; sum2 = 0; sum3 = 0;
for ikl = 1 : 4
    z1 = p(ikl);
    z2 = (p(ikl))^2;
    sum1 = sum1 + z1;
    sum2 = sum2 + z2;
end
for ikl = 5 : 13
    z3 = p(ikl);
    sum3 = sum3 + z3;
end
z4 = 5*sum1;
z5 = 5*sum2;
z6 = z4 - z5 - sum3;
% constraint expressions
t1 = (2*p(1)) + (2*p(2)) + p(10) + p(11);
t2 = (2*p(1)) + (2*p(3)) + p(10) + p(12);
t3 = (2*p(2)) + (2*p(3)) + p(11) + p(12);
t4 = (-8*p(1)) + p(10);
t5 = (-8*p(2)) + p(11);
t6 = (-8*p(3)) + p(12);
t7 = (-2*p(4)) - p(5) + p(10);
t8 = (-2*p(6)) - p(7) + p(11);
t9 = (-2*p(8)) - p(9) + p(12);
nc = 9;  % number of constraints, g(i) <= 0
g1(1) = t1 - 10;
g1(2) = t2 - 10;
g1(3) = t3 - 10;
g1(4) = t4;
g1(5) = t5;
g1(6) = t6;
g1(7) = t7;
g1(8) = t8;
g1(9) = t9;
% static penalty for violated constraints
fun = 0; vio = 0;
for io = 1 : nc
    if g1(io) > 0
        fun = fun + g1(io)^2;
        vio = vio + 1;
    end
end
yy = z6 + (1e5*fun) + (vio*1e3);
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
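The listing above handles constraints with a static penalty: each violated constraint of the form g(x) <= 0 contributes its squared violation weighted by 1e5, plus a flat charge of 1e3 per violated constraint. A minimal sketch of the same scheme (a hypothetical Python helper mirroring the weights in the listing):

```python
def penalized(raw_value, constraint_values):
    """Static penalty for constraints of the form g_i(x) <= 0.
    Mirrors yy = z6 + 1e5*fun + 1e3*vio from the MATLAB listing above."""
    squared_violation = sum(g * g for g in constraint_values if g > 0)
    violations = sum(1 for g in constraint_values if g > 0)
    return raw_value + 1e5 * squared_violation + 1e3 * violations

feasible = penalized(3.0, [-1.0, -0.5])   # no constraint violated
infeasible = penalized(3.0, [0.1, -0.5])  # one constraint violated by 0.1
```

Because the penalty weights dwarf typical objective values, infeasible solutions lose the greedy acceptance comparisons and the search is driven back toward the feasible region.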

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students] = remove_duplicate(Students, upper_limit, lower_limit)
format long;
global ll ul
for i = 1 : length(Students)
    Mark_1 = sort(Students(i).mark);
    for k = i+1 : length(Students)
        Mark_2 = sort(Students(k).mark);
        if isequal(Mark_1, Mark_2)
            m_new = floor(1 + (length(Students(k).mark)-1)*rand);
            if length(upper_limit) == 1
                Students(k).mark(m_new) = lower_limit + (upper_limit - lower_limit) * rand;
            else
                Students(k).mark(m_new) = ll(m_new) + (upper_limit(m_new) - ll(m_new)) * rand;
            end
        end
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [result_av, within_bound] = result_avg(Students)
format long;
Result = [];
within_bound = 0;
for i = 1 : length(Students)
    if Students(i).result < inf
        Result = [Result Students(i).result];
        within_bound = within_bound + 1;
    end
end
result_av = mean(Result);
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function run_TLBO()
clear all;
clc;
run = 1;
format long;
for i = 1 : run
    TLBO(@Define_range);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, indices] = sortstudents(Students)
classsize = length(Students);
Result = zeros(1, classsize);
indices = zeros(1, classsize);
for i = 1 : classsize
    Result(i) = Students(i).result;
end
[Result, indices] = sort(Result, 2, 'ascend');
Marks = zeros(classsize, length(Students(1).mark));
for i = 1 : classsize
    Marks(i, :) = Students(indices(i)).mark;
end
for i = 1 : classsize
    Students(i).mark = Marks(i,:);
    Students(i).result = Result(i);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

    %% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %% function TLBO(obj_fun, note1, note2) format long; global ll if ~exist('note1', 'var') note1 = true; end if ~exist('note2', 'var') note2 = true; end [Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun); ul=upper_limit; ll=lower_limit;

    for COMP = 1 : select.itration for i=1:length(Students) cs(i,:)=Students(i).mark; cs_result(i)=Students(i).result; end cs_new=cs; cs; cs_result; Stdev_cs=std(cs); Mean_cs=mean(cs);

  • for i = 1 : length(Students) mean_result=mean(cs); Stdev_cs=std(cs); TF=round(1+rand*(1)); [r1 r2]=sort(cs_result); best=cs(r2(1),:); for k = 1 : select.var_num Stdev_cs(i,k)=std(cs(i,k)); cs_new(i,k)=cs(i,k)+((best(1,k)-TF*mean_result(k))*rand)-(Stdev_cs(1,k)- Stdev_cs(i,k))*rand;%((1)*((cs(hh,k)-(LF)*cs(i,k)))*(rand)); end cs_new(i,:) = opti_fun_new(select, cs_new(i,:)); cs_new_result(i) = result_fun_new(select, cs_new(i,:)); if cs_new_result(i)

  • cs(i,:)=cs_new(i,:); Students(i).result=cs_new_result(i); end end

    n = length(Students);
    Students = opti_fun(select, Students);
    Students = result_fun(select, Students);
    Students = sortstudents(Students);
    if rand % (the learner phase and the remainder of the TLBO loop are truncated in the source)

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = Define_range()
% Header and first three handle assignments reconstructed from the call site
% in Define_variables; the subfunction names all appear below.
ini_fun = @Define_rangeDefine_variables;
result_fun = @Define_rangeresult;
result_fun_new = @Define_rangeresult_new;
opti_fun = @Define_rangeopti;
opti_fun_new = @Define_rangeopti_new;
return;

function [upper_limit, lower_limit, Students, select] = Define_rangeDefine_variables(select)
global lower_limit upper_limit ll ul
Granularity = 1;
lower_limit = ll;
upper_limit = ul;

ll = [0 0 0 0 0 0 0 0 0 0 0 0 0];
ul = [1 1 1 1 1 1 1 1 1 100 100 100 1];

lower_limit = ll;
upper_limit = ul;
for popindex = 1 : select.classsize
    for k = 1 : select.var_num
        mark(k) = (ll(k)) + ((ul(k) - ll(k)) * rand);
    end
    Students(popindex).mark = mark;
end
select.OrderDependent = true;
return;

function [Students] = Define_rangeresult(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : classsize
    for k = 1 : select.var_num
        x(k) = Students(popindex).mark(k);
    end
    Students(popindex).result = objective(x);
end
return

function [Studentss] = Define_rangeresult_new(select, Students)
global lower_limit upper_limit
classsize = select.classsize;
for popindex = 1 : size(Students,1)
    for k = 1 : select.var_num
        x(k) = Students(popindex,k);
    end
    Studentss(popindex) = objective(x);
end
return

function [Students] = Define_rangeopti(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : select.classsize
    for k = 1 : select.var_num
        Students(i).mark(k) = max(Students(i).mark(k), ll(k));
        Students(i).mark(k) = min(Students(i).mark(k), upper_limit(k));
    end
end
return;

function [Students] = Define_rangeopti_new(select, Students)
global lower_limit upper_limit ll ul
for i = 1 : size(Students,1)
    for k = 1 : select.var_num
        Students(i,k) = max(Students(i,k), ll(k));
        Students(i,k) = min(Students(i,k), upper_limit(k));
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
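For reference, the teacher-phase update implemented in the TLBO loop above can be written as the equation below, where $X^{\mathrm{best}}$ is the current best learner (the teacher), $M$ is the class mean, $T_F \in \{1,2\}$ is the teaching factor, and $r, r_1, r_2$ are uniform random numbers in $[0,1]$. The trailing standard-deviation difference term is specific to this code; the canonical TLBO teacher phase uses only the first difference term:

```latex
X_{i,k}^{\mathrm{new}} = X_{i,k}
  + r_1\left(X_k^{\mathrm{best}} - T_F\, M_k\right)
  - r_2\left(\sigma_{1,k} - \sigma_{i,k}\right),
\qquad T_F = \mathrm{round}(1 + r)
```

A new mark vector is then clipped to the bounds, re-evaluated, and accepted only if it improves the penalized objective value (greedy selection).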

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun, RandSeed)
format long;
select.classsize = 50;    % population size
select.var_num = 13;      % number of design variables
select.itration = 500;    % number of generations
if ~exist('RandSeed', 'var')
    rand_gen = round(sum(100*clock));
else
    rand_gen = RandSeed;  % use the supplied seed
end

rand('state', rand_gen)
[ini_fun, result_fun, result_fun_new, opti_fun, opti_fun_new] = obj_fun();
[upper_limit, lower_limit, Students, select] = ini_fun(select);
Students = remove_duplicate(Students, upper_limit, lower_limit);
Students = result_fun(select, Students);
Students = sortstudents(Students);
average_result = result_avg(Students);
min_result = [Students(1).result];
avg_result = [average_result];
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function ETLBO(obj_fun, note1, note2)
% Elitist TLBO: the best 'elite' solutions are saved each generation.
format long;
global ll
if ~exist('note1', 'var')
    note1 = true;
end
if ~exist('note2', 'var')
    note2 = true;
end
[Students, select, upper_limit, lower_limit, ini_fun, min_result, avg_result, result_fun, opti_fun, result_fun_new, opti_fun_new] = Define_variables(note1, obj_fun);
ul = upper_limit;
ll = lower_limit;
elite = 4;   % elite size
for COMP = 1 : select.itration
    % Store the current elite solutions.
    for i = 1 : elite
        markelite(i,:) = Students(i).mark;
        resultelite(i) = Students(i).result;
    end
    for i = 1:length(Students)
        cs(i,:) = Students(i).mark;
        cs_result(i) = Students(i).result;
    end
    cs_new = cs;
    Stdev_cs = std(cs);
    Mean_cs = mean(cs);
    % Teacher phase (same as in TLBO).
    for i = 1 : length(Students)
        mean_result = mean(cs);
        Stdev_cs = std(cs);
        TF = round(1 + rand*(1));
        [r1, r2] = sort(cs_result);
        best = cs(r2(1),:);
        for k = 1 : select.var_num
            Stdev_cs(i,k) = std(cs(i,k));
            cs_new(i,k) = cs(i,k) + ((best(1,k) - TF*mean_result(k))*rand) - (Stdev_cs(1,k) - Stdev_cs(i,k))*rand; %((1)*((cs(hh,k)-(LF)*cs(i,k)))*(rand));
        end
        cs_new(i,:) = opti_fun_new(select, cs_new(i,:));
        cs_new_result(i) = result_fun_new(select, cs_new(i,:));
        if cs_new_result(i) < cs_result(i)
            % (acceptance body truncated in the source; by analogy with TLBO:)
            cs(i,:) = cs_new(i,:);
            Students(i).result = cs_new_result(i);
        end
        % (learner-phase update truncated in the source)
        cs_new(i,:) = opti_fun_new(select, cs_new(i,:));
        cs_new_result(i) = result_fun_new(select, cs_new(i,:));
        if cs_new_result(i) % (the remainder of the ETLBO listing is truncated in the source)

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function yy = objective(x)
% Header reconstructed: the name 'objective' and return value 'yy' follow
% from the calls in Define_rangeresult and the final assignment below.
%clc;

for ikl = 1 : 13
    p(ikl) = x(ikl);
end
sum1 = 0; sum2 = 0; sum3 = 0;
for ikl = 1 : 4
    z1 = p(ikl);
    z2 = (p(ikl))^2;
    sum1 = sum1 + z1;
    sum2 = sum2 + z2;
end
for ikl = 5 : 13
    z3 = p(ikl);
    sum3 = sum3 + z3;
end
z4 = 5*sum1;
z5 = 5*sum2;
z6 = z4 - z5 - sum3;
t1 = (2*p(1)) + (2*p(2)) + p(10) + p(11);
t2 = (2*p(1)) + (2*p(3)) + p(10) + p(12);
t3 = (2*p(2)) + (2*p(3)) + p(11) + p(12);
t4 = (-8*p(1)) + p(10);
t5 = (-8*p(2)) + p(11);
t6 = (-8*p(3)) + p(12);
t7 = (-2*p(4)) - p(5) + p(10);
t8 = (-2*p(6)) - p(7) + p(11);
t9 = (-2*p(8)) - p(9) + p(12);

nc = 9;   % number of inequality constraints g(x) <= 0
g1(1) = t1 - 10;
g1(2) = t2 - 10;
g1(3) = t3 - 10;
g1(4) = t4;
g1(5) = t5;
g1(6) = t6;
g1(7) = t7;
g1(8) = t8;
g1(9) = t9;

% Static penalty: add a large cost for each violated constraint.
fun = 0;
vio = 0;
for io = 1:nc
    if g1(io) > 0
        fun = fun + g1(io)^2;
        vio = vio + 1;
    end
end

yy = (z6) + (1e5*fun) + (vio*1e3);
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
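The objective function above appears to be the classical 13-variable constrained benchmark commonly labelled g01 in the constrained-optimization literature (the variable bounds set in Define_range match it as well). In the notation of the code, z6 is the raw objective and yy the statically penalized value:

```latex
\min\ f(x) = 5\sum_{i=1}^{4} x_i \;-\; 5\sum_{i=1}^{4} x_i^{2} \;-\; \sum_{i=5}^{13} x_i,
\qquad g_j(x) \le 0,\ j = 1,\dots,9

yy = f(x) \;+\; 10^{5}\!\!\sum_{j:\,g_j(x)>0}\!\! g_j(x)^{2}
        \;+\; 10^{3}\,\bigl|\{\, j : g_j(x) > 0 \,\}\bigr|
```

Its global optimum is usually reported as f(x*) = -15 at x* = (1,1,1,1,1,1,1,1,1,3,3,3,1), which makes it a convenient check that the algorithm is converging correctly.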

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students] = remove_duplicate(Students, upper_limit, lower_limit)
format long;
global ll ul
for i = 1 : length(Students)
    Mark_1 = sort(Students(i).mark);
    for k = i+1 : length(Students)
        Mark_2 = sort(Students(k).mark);
        if isequal(Mark_1, Mark_2)
            m_new = floor(1 + (length(Students(k).mark)-1)*(rand));
            if length(upper_limit) == 1
                Students(k).mark(m_new) = (lower_limit + (upper_limit - lower_limit) * rand);
            else
                Students(k).mark(m_new) = (ll(m_new) + (upper_limit(m_new) - ll(m_new)) * rand);
            end
        end
    end
end
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [result_av, within_bound] = result_avg(Students)
format long;
Result = [];
within_bound = 0;
for i = 1 : length(Students)
    if Students(i).result < inf
        Result = [Result Students(i).result];
        within_bound = within_bound + 1;
    end
end
result_av = mean(Result);
return;
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function run_ETLBO()
clear all;
clc;
run = 1;
format long;
for i = 1:run
    ETLBO(@Define_range);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%

%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%
function [Students, indices] = sortstudents(Students)
% Sort the class in ascending order of result (best solution first).
classsize = length(Students);
Result = zeros(1, classsize);
indices = zeros(1, classsize);
for i = 1 : classsize
    Result(i) = Students(i).result;
end
[Result, indices] = sort(Result, 2, 'ascend');
Marks = zeros(classsize, length(Students(1).mark));
for i = 1 : classsize
    Marks(i, :) = Students(indices(i)).mark;
end
for i = 1 : classsize
    Students(i).mark = Marks(i,:);
    Students(i).result = Result(i);
end
%% Prepared by Dr. R. VENKATA RAO and Dr. VIVEK PATEL %%