Dr Mehdi Toloo


Reader in Business Analytics
BSc, MSc, PhD, docent
+44 (0)1483 686353
10 MS 01

Publications

Majid Azadi, Mehdi Toloo, Fahimeh Ramezani, Reza Farzipoor Saen, Farookh Khadeer Hussain, Hajar Farnoudkia (2023) Evaluating efficiency of cloud service providers in era of digital technologies, In: Annals of Operations Research, Springer Nature

The rapid growth of advanced technologies such as cloud computing in the Industry 4.0 era has provided numerous advantages. Cloud computing is one of the most significant technologies of Industry 4.0 for sustainable development. Numerous providers have developed various new services, which have become a crucial ingredient of information systems in many organizations. One of the challenges for cloud computing customers is evaluating potential providers. To date, considerable research has been undertaken to solve the problem of evaluating the efficiency of cloud service providers (CSPs). However, no study addresses the efficiency of providers in the context of an entire supply chain, where multiple services interact to achieve a business objective or goal. Data envelopment analysis (DEA) is a powerful method for efficiency measurement problems. However, the current models ignore undesirable outputs, integer-valued, and stochastic data which can lead to inaccurate results. As such, the primary objective of this paper is to design a decision support system that accurately evaluates the efficiency of multiple CSPs in a supply chain. The current study incorporates undesirable outputs, integer-valued, and stochastic data in a network DEA model for the efficiency measurement of service providers. The results from a case study illustrate the applicability of our new system. The results also show how taking undesirable outputs, integer-valued, and stochastic data into account changes the efficiency of service providers. The system is also able to provide the optimal composition of CSPs to suit a customer's priorities and requirements.
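
For readers new to DEA, a minimal, self-contained sketch of the basic input-oriented CCR multiplier model is shown below (standard textbook DEA, not the paper's network model with undesirable outputs, integer-valued, or stochastic data); the toy data and the use of scipy.optimize.linprog are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

# Toy data: 5 DMUs, 2 inputs (X) and 2 outputs (Y); values are illustrative only.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 2.0], [4.0, 4.0]])
Y = np.array([[1.0, 2.0], [2.0, 1.0], [2.0, 2.0], [3.0, 1.0], [2.0, 3.0]])

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o via the multiplier LP:
    max u'y_o  s.t.  v'x_o = 1,  u'y_j - v'x_j <= 0 for all j,  u, v >= 0."""
    n, m = X.shape
    _, s = Y.shape
    # Decision vector z = [u (s output weights), v (m input weights)]; linprog minimises, so negate.
    c = np.concatenate([-Y[o], np.zeros(m)])
    A_ub = np.hstack([Y, -X])                                   # u'y_j - v'x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)   # v'x_o = 1
    b_eq = np.array([1.0])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return -res.fun

for o in range(len(X)):
    print(f"DMU {o + 1}: efficiency = {ccr_efficiency(o, X, Y):.3f}")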

Mahdi Mahdiloo, Amir E. Andargoli, Mehdi Toloo, Charles Harvie, Thach-Thao Duong (2023) Measuring the digital divide: A modified benefit-of-the-doubt approach, In: Knowledge-Based Systems 261, 110191, Elsevier

In this paper, a modified composite index is developed to measure digital inclusion for a group of cities and regions. The developed model, in contrast to the existing benefit-of-the-doubt (BoD) composite index literature, considers the subindexes as non-compensatory. This new way of modeling results in three important properties: (i) all subindexes are taken into account when assessing the digital inclusion of regions and are not removed (substituted) from the composite index, (ii) in addition to an overall composite index (aggregation of the subindexes), partial indexes (aggregated scores for each subindex) are also provided so that weak performances can be detected more effectively than when only the overall index is measured, and (iii) compared with current BoD models, the developed model has improved discriminatory power. To demonstrate the developed model, we use the Australian digital inclusion index as a real-world example.
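
For context, the standard (fully compensatory) benefit-of-the-doubt composite index that this paper modifies can be written, for a region r with normalized subindexes y_{1r}, ..., y_{qr}, as the following textbook linear program (not the authors' non-compensatory model):

\[ CI_r = \max_{w \ge 0} \sum_{i=1}^{q} w_i\, y_{ir} \quad \text{s.t.} \quad \sum_{i=1}^{q} w_i\, y_{ij} \le 1, \qquad j = 1, \dots, n. \]

Each region thus receives the weights most favourable to it, which is precisely the compensatory behaviour the modified model restricts.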

Mehdi Toloo, Mohammadreza Taghizadeh-Yazdi, Abdolkarim Mohammadi-Balani (2022) Multi-objective centralization-decentralization trade-off analysis for multi-source renewable electricity generation expansion planning: A case study of Iran, In: Computers & Industrial Engineering 164, 107870, Elsevier

Countries need robust long-term plans to keep up with the global pace of transitioning from polluting fossil fuels towards clean, renewable energies. Renewable energy generation expansion plans can be either centralized, decentralized, or a combination of the two. This paper presents a novel approach to obtain an optimal multi-period plan for generating each type of renewable energy (solar, wind, hydro, geothermal, and biomass) via multi-objective mathematical modeling. The proposed model is integrated with the Autoregressive Integrated Moving Average (ARIMA) econometric method to forecast the country’s demand during the planning horizon. The optimal energy mix based on several socio-economic aspects of renewable sources was obtained using the Passive and Active Compensability Multicriteria ANalysis (PACMAN) multi-attribute decision-making method. The model is solved with the Non-dominated Sorting Genetic Algorithm II (NSGA-II) metaheuristic. Each solution in the Pareto front contains a plan for each electricity generation region under a certain combination of centralization and decentralization strategies.
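
As a small illustration of the non-dominated sorting at the heart of NSGA-II, the following sketch extracts the Pareto front from a set of candidate plans evaluated on two minimisation objectives; the objectives and numbers are purely hypothetical.

import numpy as np

def pareto_front(costs):
    """Indices of non-dominated points for a minimisation problem: point j dominates
    point i if it is no worse in every objective and strictly better in at least one."""
    costs = np.asarray(costs, dtype=float)
    keep = []
    for i in range(len(costs)):
        dominated = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
        if not dominated.any():
            keep.append(i)
    return keep

# Hypothetical objective vectors, e.g. (total cost, emissions) for candidate expansion plans.
plans = [(10.0, 4.0), (8.0, 6.0), (12.0, 3.0), (9.0, 5.0), (11.0, 5.5)]
print("Non-dominated plans:", pareto_front(plans))   # -> [0, 1, 2, 3]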

•Our approach addresses the risk-sharing problem between government and investors.
•It includes constraints that limit the pollution effects on population centers.
•It considers social responsibility, economic factors, and the benefits of waste recycling.

The public–private partnership (PPP) is a practical and standard model that has been at the center of attention over the past two decades. Sharing risk between government and investors has been a challenging issue over recent years. This study formulates a model that aims to capture the investors' expectations and allocate risks to the government within a logical range. Besides, in some real-world conditions, foreign investors with lower cost, higher quality, and better technology than domestic investors partner with the government. Under this condition, it is essential to consider the disruption risks caused by sanctions and currency price fluctuations. Furthermore, the limited budget of the government for investing in infrastructure projects is also considered. In this paper, the government's disruption risks and limited budget are added to the risk-sharing ratio model for the first time in the literature. Moreover, Pythagorean fuzzy sets (PFSs) are applied to cope with the uncertainty of real-world conditions. The PFSs are more powerful than classical and intuitionistic fuzzy sets (IFSs) in dealing with uncertainty. The PFSs provide the membership, non-membership, and hesitancy degrees for experts to better address the uncertainty derived from real-world conditions. Also, compared with the IFSs, PFSs provide a larger space and consequently more freedom to address the uncertainty. Finally, a case study is presented to illustrate the applicability and sensitivity of the suggested model. As disruption risks increase, the general utility degree, government utility, and investor's effort decrease, and the guarantee risk ratio borne by the government increases. Note that the investor's effort decreases because the government is forced to give the unfinished project to the domestic investor; consequently, exclusive terms arise for the domestic investor.

Ali Haddadi, Mohammad Reza Nikoo, Banafsheh Nematollahi, Ghazi Al-Rawas, Malik Al-Wardy, Mehdi Toloo, Amir H Gandomi (2023) Entropy-based air quality monitoring network optimization using NINP and Bayesian maximum entropy, In: Environmental Science and Pollution Research International 30(35), pp. 84110-84125

Effectual air quality monitoring network (AQMN) design plays a prominent role in environmental engineering. An optimal AQMN design should consider stations' mutual information and system uncertainties for effectiveness. This study develops a novel optimization model using a non-dominated sorting genetic algorithm II (NSGA-II). The Bayesian maximum entropy (BME) method generates potential stations as the input of a framework based on the transinformation entropy (TE) method to maximize the coverage and minimize the probability of selecting stations. Also, the fuzzy degree of membership and the nonlinear interval number programming (NINP) approaches are used to survey the uncertainty of the joint information. To obtain the best Pareto optimal solution of the AQMN characterization, a robust ranking technique, called the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), is utilized to select the most appropriate AQMN properties. This methodology is applied to Los Angeles, Long Beach, and Anaheim in California, USA. Results suggest using 4, 4, and 5 stations to monitor CO, NO2, and ozone, respectively; however, implementing this recommendation reduces coverage by 3.75, 3.75, and 3 times for CO, NO2, and ozone, respectively. On the positive side, this substantially decreases TE for CO, NO2, and ozone concentrations by 8.25, 5.86, and 4.75 times, respectively.
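
The transinformation entropy used here is essentially the mutual information shared by candidate monitoring stations; a rough, assumption-laden sketch of a histogram-based estimate on synthetic series is given below (it is not the BME/NINP machinery of the paper).

import numpy as np

def transinformation(x, y, bins=10):
    """Histogram-based estimate of the mutual (trans)information, in nats,
    between two pollutant time series measured at candidate stations."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
a = rng.normal(size=1000)
b = 0.8 * a + 0.2 * rng.normal(size=1000)   # correlated "neighbouring station"
c = rng.normal(size=1000)                    # independent "distant station"
print("TE(a, b) =", round(transinformation(a, b), 3))
print("TE(a, c) =", round(transinformation(a, c), 3))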

Fariba Goodarzian, Hassan Hoseini-Nasab, Mehdi Toloo, Mohammad Bagher Fakhrzad (2021) Designing a new medicine supply chain network considering production technology policy using two novel heuristic algorithms, In: RAIRO - Recherche Opérationnelle 55(2), pp. 1015-1042, EDP Sciences

The role of medicines in health systems is increasing day by day. The medicine supply chain is a part of the health system; if it is not properly addressed, the concept of health in that community is unlikely to experience significant growth. To fill gaps and address the challenges in the medicine supply chain network (MSCN), the present paper proposes a location-production-distribution-transportation-inventory holding problem for a multi-echelon multi-product multi-period bi-objective MSCN under a production technology policy. To design the network, a mixed-integer linear programming (MILP) model capable of minimizing the total costs of the network and the total transportation time is developed. As the developed model is NP-hard, several meta-heuristic algorithms are used and two heuristic algorithms, namely the Improved Ant Colony Optimization (IACO) and Improved Harmony Search (IHS) algorithms, are developed to solve the MSCN model on different problems. Then, some experiments were designed and solved by an optimization solver, GAMS (CPLEX), and the presented algorithms to validate the model and the effectiveness of the presented algorithms. Comparison of the results provided by the presented algorithms and the exact solution indicates the high efficiency and performance of the proposed algorithms in finding a near-optimal solution within reasonable computational time. Hence, the results of the suggested algorithms are compared with the commercial solver (GAMS) on the small-sized problems, and then the results of the proposed meta-heuristic algorithms and the heuristic methods are compared with each other on the large-sized problems. To tune and control the parameters of the proposed algorithms, the Taguchi method is utilized. To validate the proposed algorithms and the MSCN model, assessment metrics are used and a few sensitivity analyses are presented, respectively. The results demonstrate the high quality of the proposed IACO algorithm.

Mehdi Toloo, Soroosh Nalchigar (2011) On Ranking Discovered Rules of Data Mining by Data Envelopment Analysis: Some Models with Wider Applications, In: New Fundamental Technologies in Data Mining, pp. 425-442, InTech

The convergence of computing and communication has resulted in a society that feeds on information. There is an exponentially increasing amount of information locked up in databases—information that is potentially important but has not yet been discovered or articulated (Witten & Frank, 2005). Data mining, the extraction of implicit, previously unknown, and potentially useful information from data, can be viewed as a result of the natural evolution of Information Technology (IT). An evolutionary path has been traversed in the database field, from data collection and database creation to data management, data analysis, and understanding. According to Han & Kamber (2001), the major reason that data mining has attracted a great deal of attention in the information industry in recent years is the wide availability of huge amounts of data and the imminent need for turning such data into useful information and knowledge. The information and knowledge gained can be used for applications ranging from business management, production control, and market analysis, to engineering design and science exploration. In other words, in today’s business environment, it is essential to mine vast volumes of data for extracting patterns in order to support superior decision-making. Therefore, the importance of data mining is becoming increasingly obvious. Many data mining techniques have also been presented in various applications, such as association rule mining, sequential pattern mining, classification, clustering, and other statistical methods (Chen & Weng, 2008).

Emmanuel Kwasi Mensah, Esmaeil Keshavarz, Mehdi Toloo (2022) 10 - Finding efficient solutions of the multicriteria assignment problem, In: Multi-Objective Combinatorial Optimization Problems and Solution Methods, pp. 193-211, Elsevier Inc

The assignment problem (AP) is one of the well-known and most studied combinatorial optimization problems. The single-objective AP is an integer programming problem that can be solved with efficient algorithms such as the Hungarian or the successive shortest paths methods. On the other hand, finding and classifying all efficient assignments for a Multicriteria AP (MCAP) remains a controversial issue in Multicriteria Decision Making (MCDM) problems. In this chapter, we tackle the issue by using data envelopment analysis (DEA) models. Importantly, we focus on identifying the efficiency status of assignments using a two-phase algorithm. In Phase I, a mixed-integer linear programming (MILP) model based on the Free Disposal Hull (FDH) model is used to determine the minimal complete set (MCS) of efficient assignments. In Phase II, the DEA-BCC model is used to classify efficient assignments as supported or nonsupported. A numerical example is provided to illustrate the presented approach.
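
For the single-objective AP that the chapter builds on, SciPy's Hungarian-style solver gives an exact optimal assignment in a few lines; the cost matrix below is a made-up example, and the chapter's multicriteria DEA/FDH classification is not reproduced here.

import numpy as np
from scipy.optimize import linear_sum_assignment

# Toy single-objective assignment: cost[i, j] = cost of giving task j to worker i.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
rows, cols = linear_sum_assignment(cost)   # Hungarian-type algorithm
print(list(zip(rows, cols)), "total cost =", cost[rows, cols].sum())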

M Toloo, S Ghorbani, Z Molaee (2011) Ranking DMUs on the benchmark line with equal shadow prices, In: Proceedings of the 40th International Conference on Computers & Industrial Engineering (CIE-40), pp. 1-4, Institute of Electrical and Electronics Engineers (IEEE)

Data envelopment analysis (DEA) assesses the relative efficiency of each decision making unit (DMU) under its most favorable conditions and divides a homogeneous group of DMUs into two categories, efficient and inefficient, but traditional DEA models cannot rank efficient DMUs. Although some models have been introduced for ranking efficient DMUs, Liu and Peng (2008) proposed a common weights analysis (CWA) approach for ranking them. These DMUs are ranked according to the efficiency score weighted by the common set of weights and shadow prices. This study shows that there are cases in which the shadow prices of efficient DMUs are equal, and hence this method is not applicable for ranking them. Next, we propose a new method for ranking units with equal shadow prices.

Mehdi Toloo (2020) Welcome, In: 2020 7th International Conference on Control, Decision and Information Technologies (CoDIT) 1, pp. i-i, IEEE

Presents the introductory welcome message from the conference proceedings. May include the conference officers' congratulations to all involved with the conference event and publication of the proceedings record.

Mehdi Toloo, Iman Rahimi, Siamak Talatahari (2022) Multi-Objective Combinatorial Optimization Problems and Solution Methods, Academic Press

Multi-Objective Combinatorial Optimization Problems and Solution Methods discusses the results of recent multi-objective combinatorial optimization achievements that considered metaheuristic, mathematical programming, heuristic, hyper heuristic and hybrid approaches. In other words, the book presents various multi-objective combinatorial optimization issues that may benefit from different methods in theory and practice. Combinatorial optimization problems appear in a wide range of applications in operations research, engineering, biological sciences and computer science, hence many optimization approaches have been developed that link the discrete universe to the continuous universe through geometric, analytic and algebraic techniques. This book covers this important topic, as computational optimization has become increasingly popular and design optimization and its applications in engineering and industry have become ever more important due to more stringent design requirements in modern engineering practice.

Mehdi Toloo, Siamak Talatahari, Amir H. Gandomi, Iman Rahimi (2022) 1 - Multiobjective combinatorial optimization problems: social, keywords, and journal maps, In: Multi-Objective Combinatorial Optimization Problems and Solution Methods, pp. 1-9, Elsevier Inc

Multiobjective combinatorial optimization problems appear in a wide range of applications, including operations research/management, engineering, biological sciences, and computer science. This work presents a brief analysis of the main concepts and studies of solution approaches applied to multiobjective combinatorial optimization problems. A detailed scientometric analysis, an influential tool for bibliometric studies, is performed on multiobjective combinatorial optimization problems and solution-approach data from the Scopus database. To this end, we address social, keywords, and subject areas by employing two well-known tools: VOSviewer and Mendeley. Finally, the conclusion and discussion are provided with a couple of directions for future research.

Mehdi Toloo, Samaneh Joshaghani (2012) User guide: GAMS with DEA models
Mehdi Toloo, Zahra Molaei (2012) Operations Research II
Mehdi Toloo, Amir Hossein Akhavan Rahnama (2013) Introduction to Scientific Computing, Scholars' Press

Computational science (or scientific computing) is concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. In practical use, it is typically the application of computer simulation and other forms of computation from numerical analysis and theoretical computer science to problems in various scientific disciplines. Pascal is an influential imperative and procedural programming language, designed in 1968–1969 and published in 1970 by Niklaus Wirth as a small and efficient language intended to encourage good programming practices using structured programming and data structuring. An object-oriented derivative of it, known as Object Pascal, was developed in 1985. Today, Pascal has largely been abandoned by industry and scientific teams; however, it has influenced both the syntax and data structures of the Java programming language, one of the most expressive languages for scientific computing to date.

Adel Hatami-Marbini, Aliasghar Arabmaldar, Mehdi Toloo, Ali Mahmoodi Nehrani (2022) Robust non-radial data envelopment analysis models under data uncertainty, In: Expert Systems with Applications 207, 118023, Elsevier

Russell measure (RM) and enhanced Russell measure (ERM) are popular non-radial measures for efficiency assessment of decision-making units (DMUs) in data envelopment analysis (DEA). Input and output data of both original RM and ERM are assumed to be deterministic. However, this assumption may not be valid in some situations because of data uncertainty arising from measurement errors, data staleness, and multiple repeated measurements. Interval DEA (IDEA) has been proposed to measure the interval efficiencies from the optimistic and pessimistic viewpoints, while the robustness of the assessment is questionable. This paper draws on a class of robust optimisation models to surmount uncertainty with a high degree of robustness in the RM and ERM models. The contribution of this paper is fivefold: (1) we develop new robust non-radial DEA models to measure the robust efficiency of DMUs under data uncertainty, which are adjustable based upon conservatism levels, (2) we use Monte-Carlo simulation in an attempt to identify an appropriate range for the budget of uncertainty in terms of the highest conformity of ranking results, (3) we introduce the concept of the price of robustness to scrutinise the effectiveness and robustness of the proposed models, (4) we compare the developed robust models in this paper with other existing approaches, both radial and non-radial models, and (5) we explore an application to assess the efficiency of the Master of Business Administration (MBA) programmes where data uncertainties influence the quality and reliability of results.

Mehdi Toloo, Kaoru Tone, Mohammad Izadikhah (2023) Selecting slacks-based data envelopment analysis models, In: European Journal of Operational Research 308(3), pp. 1302-1318, Elsevier

Data envelopment analysis (DEA) is a well-known data-driven mathematical modeling approach that aims at evaluating the relative efficiency of a set of comparable decision making units (DMUs) with multiple inputs and multiple outputs. The number of inputs and outputs (performance factors) plays a vital role for successful applications of DEA. There is a statistical and empirical rule in DEA that if the number of performance factors is high in comparison with the number of DMUs, then a large percentage of the units will be determined as efficient, which is questionable and unacceptable in the performance evaluation context. However, in some real-world applications, the number of performance factors is relatively larger than the number of DMUs. To cope with this issue, selecting models have been developed to select a subset of performance factors that lead to acceptable results. In this paper, we extend a pair of optimistic and pessimistic approaches, involving two alternative individual and summative selecting models, based on the slacks-based model. We mathematically validate the proposed models with some theorems and lemmas and illustrate the applicability of our models using 18 active auto part companies in the largest stock exchange in Iran.
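
The "statistical and empirical rule" referred to here is commonly stated as requiring the number of DMUs to be at least max{m·s, 3(m+s)} for m inputs and s outputs (the Cooper-Seiford-Tone rule of thumb); a tiny illustrative check is sketched below, where the 4-input/3-output split is an assumption rather than the paper's actual factor counts.

def dea_sample_size_ok(n_dmus, m_inputs, s_outputs):
    """Check the common DEA rule of thumb: n >= max{m*s, 3*(m+s)}."""
    required = max(m_inputs * s_outputs, 3 * (m_inputs + s_outputs))
    return n_dmus >= required, required

ok, required = dea_sample_size_ok(n_dmus=18, m_inputs=4, s_outputs=3)
print(f"need at least {required} DMUs, have 18 ->",
      "fine" if ok else "too few: select a subset of performance factors")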

In microeconomics, a production function is a mathematical function that transforms all combinations of inputs of an entity, firm or organization into output. Given the set of all technically feasible combinations of outputs and inputs, only the combinations encompassing a maximum output for a specified set of inputs constitute the production function. Data Envelopment Analysis (DEA), originated by Charnes, Cooper and Rhodes in 1978, is a well-known non-parametric mathematical method for estimating the production function. In fact, DEA evaluates the relative performance of a set of homogeneous decision making units with multiple inputs and multiple outputs.

This book covers some basic DEA models and disregards more complicated ones, such as network DEA, and mainly stresses the importance of weights in DEA and some of their applications. As a result, the book mainly considers the multiplier form of DEA models to extend some new approaches; however, the envelopment forms are introduced where possible. The book also deals with some innovative uses of binary variables in extended DEA model formulations. These auxiliary variables enable us to formulate Mixed Integer Programming (MIP) DEA models for addressing the problems of finding a single efficient DMU and ranking efficient DMUs. In some cases, the status of an input or output measure is unknown, and binary variables are utilized to accommodate these flexible measures. Furthermore, the binary variables approach tackles the problem of selecting input or output measures.

The book also stresses the mathematical aspects of selected DEA models and their extensions so as to illustrate their potential uses with applications to different contexts, such as the banking industry in the Czech Republic, a financing decision problem, a technology selection problem, a facility layout design problem, and selecting the best tennis player. In addition, the majority of the extended models in this book can be extended to some other DEA models, such as slacks-based measures, hybrid, non-discretionary, and fuzzy DEA, which are applicable in some other contexts.

This research-based book contains six chapters as follows. The first chapter (General Discussion) starts with a simple numerical example to explain the concept of relative efficiency and to clarify the importance of input and output weights in measuring the efficiency score. Then these basic concepts are extended to some more complex cases. Efficient frontiers and projection points are illustrated by means of some constructive and insightful graphs.

The second chapter (Basic DEA Models) presents both envelopment and multiplier forms of the DEA models in the presence of multiple inputs and multiple outputs; however, the book mainly focuses on the multiplier form. In addition, this chapter illustrates the role of each axiom in constructing the production possibility set (PPS). It is also concerned with some DEA models that deal with pure input data as well as with pure output data sets. Apart from basic input- and output-oriented DEA models with different returns to scale, the chapter includes a model that combines both orientations. Three case studies involving the banking industry, technology selection, and asset financing are provided in this section.

In Chapter 3 (GAMS Software), we briefly introduce the General Algebraic Modeling System (GAMS) software, a modelling system for linear, nonlinear and mixed integer optimization problems, for solving DEA models. Chapter 4 (Weights in DEA) treats the weights in DEA and their importance, along with various weight restrictions and common set of weights (CSW) approaches. The chapter includes the Assurance Region (AR) and Assurance Region Global (ARG) methods to restrict weight flexibility in DEA. Two DEA models with different types of efficiency, i.e. minsum and minimax, together with their integrated versions, are introduced in this chapter. The evaluation of a facility layout design problem is addressed as a numerical example.

Chapter 5 (Best Efficient Unit) considers CSW and binary variable approaches as the main tools for developing models that can find the most efficient DMU and also rank DMUs. We cover WEI/WEO data sets along with multiple input and multiple output data sets. Some epsilon-free DEA models are introduced to overcome the problem of finding a set of positive weights. The problem of finding the most cost-efficient unit under certain and uncertain input prices is also discussed. Two real data sets involving professional tennis players and a Turkish automotive company are used to validate the approaches in this chapter.

Chapter 6 (Data Selection in DEA) closes the book by considering the data selection problem in DEA and presenting some modifications of the standard DEA models to accommodate flexible and selective measures. To deal with these problems, multiplier and envelopment DEA models are developed, where each model contains two alternative approaches: individual and integrated models. The individual approach classifies flexible measures and identifies selective measures for each DMU, and the aggregate approach accommodates these measures using integrated DEA models. We present three case studies to examine and validate the approaches in this chapter.

Evidently, my deepest gratitude and love go to my family, Laleh and Arad, for supporting me in writing this book. Ronak Azizi saved me a lot of trouble by tackling all formatting issues in Microsoft. Last, but certainly not least, I would like to extend my thanks to my friend, Dr Adel Hatami-Marbini, for helping me with editing the book and for invaluable ideas and comments. This publication has been elaborated in the framework of the project “Support research and development in the Moravian-Silesian Region 2013 DT 1 - International research teams” (02613/2013/RRC), financed from the budget of the Moravian-Silesian Region.

Kristiaan Kerstens, Jafar Sadeghi, Mehdi Toloo, Ignace Van de Woestyne (2022) Procedures for ranking technical and cost efficient units: With a focus on nonconvexity, In: European Journal of Operational Research 300(1), pp. 269-281, Elsevier B.V.

•Infeasibility under the super-efficiency problem aggravates under nonconvexity.
•A new super-efficiency cost frontier is feasible under constant returns to scale.
•The super-efficiency cost frontier may be infeasible under variable returns to scale.
•The super-efficiency decomposition is new in the literature.
•A new cost super-efficiency model under incomplete price data is proposed.

This contribution extends the literature on super-efficiency by focusing on ranking cost-efficient observations. To the best of our knowledge, the focus has always been on technical super-efficiency, and this focus on ranking cost-efficient observations may well open up a new topic. Furthermore, since the convexity axiom has an impact on both technical and cost efficiency, we pay particular attention to the effect of nonconvexity on both super-efficiency notions. Apart from a numerical example, we use a secondary data set guaranteeing replication to illustrate these efficiency and super-efficiency concepts. Two empirical conclusions emerge. First, the cost super-efficiency notion ranks differently from the technical super-efficiency concept. Second, both cost and technical super-efficiency notions rank differently under convex and nonconvex technologies.

Mehdi Toloo, Mahnaz Mirbolouki (2019) A new project selection method using data envelopment analysis, In: Computers & Industrial Engineering 138, Elsevier Ltd

•This paper studies the project selection problem.
•We develop a new project selection method under resource limitations.
•The advantage of the new method is that it accomplishes both individual evaluation and selection.
•A case study of information system projects in the Iran e-commerce development center validates the new method.

The project selection problem plays a vital role in an organization's ability to successfully attain its competitive advantages and corporate strategies. The problem is further exacerbated and compounded if the decision-maker takes the limitation of resources into consideration. As a matter of fact, the project selection problem deals with choosing a set of the best feasible proposals from a large pool of proposals while making the best use of available resources. It is assumed that each proposal employs various resources, such as personnel, capital, equipment, and facilities. Each subset of feasible proposals constitutes a single, composite project that utilizes a set of available but limited resources to produce various outputs. It is desired to select the best subset of proposals with the aim of using the available resources as much as possible. Data envelopment analysis (DEA) is commonly used as a prioritization method to evaluate each feasible composite project. This paper develops a new project selection method based on the performance of each contained proposal by solving a single linear DEA model. Finally, we provide a real dataset containing 21 information system proposals at the Iran e-commerce development center to illustrate the potential application of our suggested method.

Adel Hatami-Marbini, Mehdi Toloo, Mohamad Reza Amini, Adel Azar (2022) Extending a fuzzy network data envelopment analysis model to measure maturity levels of a performance-based budgeting system: A case study, In: Expert Systems with Applications 200, 116884, Elsevier Ltd

•Propose a framework for measuring the maturity level of performance-based budgeting.
•Develop a parallel network data envelopment analysis model.
•Consider the hierarchical configuration of performance indicators.
•Use fuzzy sets theory to deal with vagueness and ambiguity.
•Present a case study to demonstrate the applicability of the developed framework.

Performance-based budgeting (PBB) aims to formulate and manage public budgetary resources to improve managerial decisions based on actual performance measures of agencies. Although the PBB system has been overwhelmingly applied by various agencies, the progress and maturity of its implementation process are not satisfactory at large. Therefore, there is a need to find, evaluate and improve the performance of organisations in relation to implementing a PBB system. To do so, composite indicators (CIs) have been proposed to aggregate multiple indicators associated with the PBB system, but their employment is contentious as they often lean on ad-hoc and troublesome assumptions. Data envelopment analysis (DEA) methods, as a powerful and established tool, help to contend with key limitations of CIs. Although the original DEA method ignores the internal production process, knowledge of the internal structure of the PBB systems and indicators is important for providing further insights when assessing the performance of PBB systems. In this paper, we present a budget assessment framework by breaking a PBB system into two parallel stages, operations performance (OP) and financial performance enhancement (FPE), to open up the black-box structure of the system and consider the indicator hierarchy configuration of each stage. In situations of hierarchical configuration of indicators, we develop a multilayer parallel network DEA-based CIs model to measure the PBB maturity levels of the system and its stages. It is shown that the discrimination power of the proposed multilayer model is better than that of the existing models with one layer, and in situations with a relatively small number of DMUs the model developed in this paper can be a good solution to the dimension reduction of indicators. Moreover, this research leverages fuzzy logic to surmount the subjective information that is often present in collecting indicators of the PBB systems. The major contribution of this research is to examine a case study of a PBB maturity award in Iran, as a developing country with a myriad of financial challenges, to adopt a PBB maturity model as well as to point towards the efficacy and applicability of the proposed framework in practice.

Mehdi Toloo, Emmanuel Kwasi Mensah, Maziar Salahi (2022) Robust optimization and its duality in data envelopment analysis, In: Omega (Oxford) 108, 102583, Elsevier Ltd

•We develop robust equivalents for fractional DEA models.
•The proposed models give a proper interpretation of robust efficiency.
•The superiority of our models over the existing ones is investigated.
•The duality relation in robust DEA is established according to the “primal worst equals dual best” theorem in robust optimization.
•We show an equivalence relation between robust input- and output-oriented models.
•We illustrate our proposed models with a study of the largest airports in Europe.

Robust Data Envelopment Analysis (RDEA) is a DEA-based conservative approach used for modeling uncertainties in the input and output data of Decision-Making Units (DMUs) to guarantee stable and reliable performance evaluation. The RDEA models proposed in the literature apply robust optimization techniques to the linear and conventional DEA models, which leads to the difficulty of obtaining a robust efficient DMU. To overcome this difficulty, this paper tackles uncertainty in DMUs from the original fractional DEA model. We propose a robust fractional DEA (RFDEA) model in both input and output orientations, which enables us to overcome the deficiency of existing RDEA models. The linearized models of the fractional DEA are further used to establish duality relations from a pessimistic and optimistic view of the data. We show that the primal worst of the multiplier model is equivalent to the dual best of the envelopment model. Furthermore, we show that robust efficiency in the input- and output-oriented DEA models is still equivalent in the new approach, which is not the case in conventional RDEA models. We finally present a study of the largest airports in Europe to illustrate the efficacy of the proposed models. The proposed RDEA is found to provide an effective management evaluation strategy under uncertain environments.

Mehdi Toloo, Rouhollah Khodabandelou, Amar Oukil (2022) A Comprehensive Bibliometric Analysis of Fractional Programming (1965–2020), In: Mathematics (Basel) 10(11), 1796

Fractional programming (FP) refers to a family of optimization problems whose objective function is a ratio of two functions. FP has been studied extensively in economics, management science, information theory, optics and graph theory, communication, and computer science, among other fields. This paper presents a bibliometric review of FP-related publications over the past five decades in order to track research outputs and scholarly trends in the field. The reviews are conducted through the Science Citation Index Expanded (SCI-EXPANDED) database of the Web of Science Core Collection (Clarivate Analytics). Based on the bibliometric analysis of 1811 documents, various theme-related research indicators are described, such as the most prominent authors and the most commonly cited papers, journals, institutions, and countries. Three research directions emerged, including Electrical and Electronic Engineering, Telecommunications, and Applied Mathematics.
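
A central device in this literature is the Charnes-Cooper transformation, which turns a linear fractional program into an ordinary LP; in standard textbook form (assuming the denominator is positive over the feasible set):

\[ \max_{x \ge 0,\ Ax \le b}\ \frac{c^{\top}x + \alpha}{d^{\top}x + \beta} \;\;\Longleftrightarrow\;\; \max_{y,\,t}\ \bigl\{\, c^{\top}y + \alpha t \ :\ Ay - bt \le 0,\ \ d^{\top}y + \beta t = 1,\ \ y \ge 0,\ t \ge 0 \,\bigr\}, \]

obtained with the change of variables \( t = 1/(d^{\top}x + \beta) \) and \( y = t\,x \).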

Gholam R. Amin, M. Toloo (2007) Finding the most efficient DMUs in DEA: An improved integrated model, In: Computers & Industrial Engineering 52(1), pp. 71-77, Elsevier

In many applications of DEA, finding the most efficient DMUs is desirable. This paper presents an improved integrated DEA model to detect the most efficient DMUs. The proposed integrated DEA model does not use the trial and error method in the objective function. Also, it is able to find the most efficient DMUs without solving the model n times (one linear program (LP) for each DMU) and therefore allows the user to get faster results. It is shown that the improved integrated DEA model is always feasible and capable of ranking the most efficient one. To illustrate the model's capability, the proposed methodology is applied to a real data set consisting of 19 facility layout alternatives.

Mehdi Toloo, Maryam Allahyar, Jana Hančlová (2018) A non-radial directional distance method on classifying inputs and outputs in DEA: Application to banking industry, In: Expert Systems with Applications 92, pp. 495-506, Elsevier Ltd

•A non-radial non-oriented method is developed to deal with flexible measures.
•Two optimistic and pessimistic approaches are proposed.
•Each approach contains two individual and integrated models.
•A case study of 61 banks in the Visegrad Four region validates the new models.

The original Data Envelopment Analysis (DEA) models require the assumption that the status of all inputs and outputs is known exactly, whereas we may face cases with some flexible performance measures whose status is unknown. Some classifier approaches have been proposed to deal with flexible measures. This contribution develops a new classifier non-radial directional distance method with the aim of taking into account input contraction and output expansion simultaneously in the presence of flexible measures. To make the most appropriate decision for flexible measures, we suggest two pessimistic and optimistic approaches from both individual and summative points of view. Finally, a real numerical example from the banking system in the countries of the Visegrad Four (i.e. Czech Republic, Hungary, Poland, and Slovakia) is presented to elaborate the applicability of the proposed method.
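
For orientation, the (radial) directional distance function that non-radial methods of this kind generalize is, for a technology set \(T\) and direction \(g = (g^x, g^y)\):

\[ \vec{D}(x_o, y_o; g^x, g^y) = \max\ \bigl\{\, \beta \ :\ \bigl(x_o - \beta g^x,\ y_o + \beta g^y\bigr) \in T \,\bigr\}, \]

whereas a non-radial variant typically allows a separate contraction/expansion factor for each input and output, which is what permits inputs and outputs (and, here, flexible measures) to be treated simultaneously.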

Amir Hossein Akhavan Rahnama, Mehdi Toloo, Nezer Jacob Zaidenberg (2018) An LP-based hyperparameter optimization model for language modeling, In: The Journal of Supercomputing 74(5), pp. 2151-2160, Springer Nature

In order to find hyperparameters for a machine learning model, algorithms such as grid search or random search are used over the space of possible values of the model's hyperparameters. These search algorithms select the solution that minimizes a specific cost function. In language models, perplexity is one of the most popular cost functions. In this study, we propose a fractional nonlinear programming model that finds the optimal perplexity value. The special structure of the model allows us to approximate it by a linear programming model that can be solved using the well-known simplex algorithm. To the best of our knowledge, this is the first attempt to use optimization techniques to find perplexity values in the language modeling literature. We apply our model to find hyperparameters of a language model and compare it to the grid search algorithm. Furthermore, we illustrate that it results in lower perplexity values. We perform this experiment on a real-world dataset from SwiftKey to validate our proposed approach.
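
Perplexity, the cost function being optimized, is just the exponential of the negative average per-token log-likelihood; a toy sketch with made-up log-probabilities (not the paper's fractional LP model) is:

import math

def perplexity(log_probs):
    """Perplexity of a held-out corpus from per-token natural-log probabilities:
    exp(-average log-likelihood)."""
    return math.exp(-sum(log_probs) / len(log_probs))

# Hypothetical per-token log-probabilities from two hyperparameter settings of a language model.
setting_a = [-2.1, -1.7, -2.4, -1.9, -2.0]
setting_b = [-2.6, -2.2, -2.9, -2.4, -2.5]
print("setting A:", round(perplexity(setting_a), 2))   # lower perplexity = better
print("setting B:", round(perplexity(setting_b), 2))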

Mehdi Toloo, Maryam Allahyar (2018) A simplification generalized returns to scale approach for selecting performance measures in data envelopment analysis, In: Measurement: Journal of the International Measurement Confederation 121, pp. 327-334, Elsevier Ltd

Toloo and Tichý (2015), with the aim of upholding the rule of thumb in data envelopment analysis, developed a pair of models which optimally choose some inputs and outputs among selective measures under the variable returns to scale assumption. Their approach involves a lower bound for the input and output weights in the multiplier model and a penalty term in the objective function of the envelopment model. These models involve an epsilon which, on the one hand, turns the selecting envelopment model non-linear and, on the other hand, increases the computational burden required to solve the selecting multiplier models. Selecting an improper value for the epsilon may cause infeasibility and unboundedness issues for the multiplier and envelopment models, respectively. This paper demonstrates that the method of Toloo and Tichý (2015) is valid even when the epsilon is excluded. The method is extended to a generalized returns to scale model which covers the other returns to scale assumptions, i.e. non-increasing, constant, and non-decreasing. The obtained results point out that the simplified approach is more stable and more reliable and substantially reduces the required calculations.

Mehdi Toloo, Esmaeil Keshavarz, Adel Hatami-Marbini (2018) Dual-role factors for imprecise data envelopment analysis, In: Omega (Oxford) 77, pp. 15-31, Elsevier

Efficiency analyses are crucial to managerial competency for evaluating the degree to which resources are consumed in the production process of gaining desired services or products. Among the vast available literature on performance analysis, Data Envelopment Analysis (DEA) has become a popular and practical approach for assessing the relative efficiency of Decision-Making Units (DMUs) which employ multiple inputs to produce multiple outputs. However, in addition to inputs and outputs, some situations might include certain factors that simultaneously play the role of both inputs and outputs. Contrary to conventional DEA models which account for precise values for inputs, outputs and dual-role factors, we develop a methodology for quantitatively handling imprecision and uncertainty where the degree of imprecision cannot trivially be ignored in efficiency analysis. In this regard, we first construct a pair of interval DEA models based on the pessimistic and optimistic standpoints to measure the interval efficiencies where some or all observed inputs, outputs and dual-role factors are assumed to be characterized by interval measures. The optimal multipliers associated with the dual-role factors are then used to determine whether a factor is designated as an output, an input, or is in equilibrium, even though the status of the dual-role factors may not be unique based upon the pessimistic and optimistic standpoints. To deal with the problem, we present a new model which integrates both pessimistic and optimistic models. The integrated model enables us to identify a unique status of each imprecise dual-role factor as well as to develop a structure for calculating an optimal reallocation model of each dual-role factor among the DMUs. As another method to investigate the role of dual-role factors, we introduce a fuzzy decision making model which evaluates all DMUs simultaneously. We finally present an application to a data set of 20 banks to showcase the applicability and efficacy of the proposed procedures and algorithm.

Mahdi Mahdiloo, Mehdi Toloo, Thach-Thao Duong, Reza Farzipoor Saen, Peter Tatham (2018) Integrated data envelopment analysis: Linear vs. nonlinear model, In: European Journal of Operational Research 268(1), pp. 255-267, Elsevier B.V.

•Two linear and nonlinear two-stage data envelopment analysis models are compared.
•A relationship between these two models is developed.
•It is shown that the linear model is more computationally efficient.
•The linear model excludes the estimation error of the nonlinear model.
•The linear and nonlinear models are compared with real and simulated data.

This paper develops a relationship between two linear and nonlinear data envelopment analysis (DEA) models which have previously been developed for the joint measurement of the efficiency and effectiveness of decision making units (DMUs). It will be shown that a DMU is overall efficient by the nonlinear model if and only if it is overall efficient by the linear model. We will compare these two models and demonstrate that the linear model is an efficient alternative algorithm for the nonlinear model. We will also show that the linear model is more computationally efficient than the nonlinear model, it does not have the potential estimation error of the heuristic search procedure used in the nonlinear model, and it determines global optimum solutions rather than the local optimum. Using 11 different data sets from published papers and also 1000 simulated sets of data, we will explore and compare these two models. Using the data set that is most frequently used in the published papers, it is shown that the nonlinear model with a step size equal to 0.00001 requires running 1,955,573 linear programs (LPs) to measure the efficiency of 24 DMUs compared to only 24 LPs required for the linear model. Similarly, for a very small data set which consists of only 5 DMUs, the nonlinear model requires running 7861 LPs with step size equal to 0.0001, whereas the linear model needs just 5 LPs.

Simona Alfiero, Alfredo Esposito, Emmanuel Kwasi Mensah, Mehdi Toloo (2018) A dataset of European banks in performance evaluation under uncertainty, In: Data in Brief 22, pp. 214-217, Elsevier

The dataset contains financial indicators from the financial statements of 250 banks operating in Europe, collated for the 2015 accounting year. First, the dataset is split into input and output measures. Then the preferred number of inputs and outputs in relation to the total number of observations is selected according to the rule of thumb in data envelopment analysis (DEA). The dataset is related to the research article entitled “Robust optimization with nonnegative decision variables: A DEA approach” (Toloo and Mensah, 2018) [1]. The dataset can be used to evaluate the performance of banks and bank efficiency under uncertainty.

Mehdi Toloo, Soroosh Nalchigar, Babak Sohrabi (2018) Selecting most efficient information system projects in presence of user subjective opinions: a DEA approach, In: Central European Journal of Operations Research 26(4), pp. 1027-1051, Springer Nature

Information System (IS) project selection is a critical decision making task that can significantly impact the operational excellence and competitive advantage of modern enterprises and can also involve them in a long-term commitment. This decision making is complicated due to the availability of numerous IS projects, their increasing complexity, the importance of timely decisions in a dynamic environment, as well as the existence of multiple qualitative and quantitative criteria. This paper proposes a Data Envelopment Analysis approach to find the most efficient IS projects while considering subjective opinions and intuitive senses of decision makers. The proposed approach is validated by a real world case study involving 41 IS projects at a large financial institution as well as 18 artificial projects which are defined by the decision makers.

Mehdi Toloo, Emmanuel Kwasi Mensah (2019) Robust optimization with nonnegative decision variables: A DEA approach, In: Computers & Industrial Engineering 127, pp. 313-325, Elsevier

Robust optimization has become the state-of-the-art approach for solving linear optimization problems with uncertain data. Though relatively young, the robust approach has proven to be essential in many real-world applications. Under this approach, robust counterparts to prescribed uncertainty sets are constructed for general solutions to corresponding uncertain linear programming problems. It is remarkable that in most practical problems, the variables represent physical quantities and must be nonnegative. In this paper, we propose alternative robust counterparts with nonnegative decision variables - a reduced robust approach which attempts to minimize model complexity. The new framework is extended to robust Data Envelopment Analysis (DEA) with the aim of reducing the computational burden. In the DEA methodology, we first deal with the equality in the normalization constraint and then propose a robust DEA model based on the reduced robust counterpart. The proposed model is examined with numerical data from 250 European banks operating across the globe. The results indicate that the proposed approach (i) reduces almost 50% of the computational burden required to solve DEA problems with nonnegative decision variables; (ii) retains only essential (non-redundant) constraints and decision variables without altering the optimal value.
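
One way to see why nonnegativity shrinks a robust counterpart, sketched here for the simplest box (Soyster-type) uncertainty rather than the paper's reduced budgeted counterpart: for an uncertain constraint \(\sum_j \tilde a_j x_j \le b\) with \(\tilde a_j \in [\bar a_j - \hat a_j,\ \bar a_j + \hat a_j]\), the general counterpart needs auxiliary terms for the absolute values, but with \(x \ge 0\) it collapses to a single linear constraint:

\[ \sum_j \bar a_j x_j + \sum_j \hat a_j\,|x_j| \;\le\; b \quad \xrightarrow{\;x \,\ge\, 0\;} \quad \sum_j \bigl(\bar a_j + \hat a_j\bigr)\,x_j \;\le\; b, \]

so no extra variables or constraints are needed to linearize the absolute values.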

Gholam R. Amin, Saeed Al-Muharrami, Mehdi Toloo (2019) A combined goal programming and inverse DEA method for target setting in mergers, In: Expert Systems with Applications 115, pp. 412-417, Elsevier Ltd

•A new method for target setting in mergers is proposed.
•The method combines goal programming and inverse data envelopment analysis.
•The method allows decision makers to save desired resources.
•An application in the banking sector is proposed.

This paper suggests a novel method to deal with target setting in mergers using goal programming (GP) and inverse data envelopment analysis (InvDEA). A conventional DEA model obtains the relative efficiency of decision making units (DMUs) given multiple inputs and multiple outputs for each DMU. However, InvDEA aims to identify the quantities of inputs and outputs when an efficiency score is given as a target. This study provides an effective method that allows decision makers to incorporate their preferences in the target setting of a merger for saving specific input(s) or producing certain output(s) as much as possible. The proposed method is validated through an illustrative application in the banking industry.

Esmaeil Keshavarz, Mehdi Toloo (2019) Selecting third-party reverse logistics providers under uncertainty, In: 2019 6th International Conference on Control, Decision and Information Technologies (CoDIT 2019), pp. 1528-1532, IEEE

All models in data envelopment analysis (DEA) are built on the foundation of performance factors. Performance factors in DEA are conventionally divided into input and output measures. In some situations, we confront dual-role factors, which can simultaneously play input and output roles. Traditionally, all performance factors are considered as precise values, while in some real-world problems they are characterized as imprecise values. In this paper, we evaluate the performance of 18 third-party reverse logistics (3PL) providers in the presence of dual-role factors and under uncertainty. We illustrate the superiority of the employed DEA approach over an approach suggested in the literature.

Dariush Khezrimotlagh, Joe Zhu, Wade D. Cook, Mehdi Toloo (2019) Data envelopment analysis and big data, In: European Journal of Operational Research 274(3), pp. 1047-1054, Elsevier

In the traditional data envelopment analysis (DEA) approach for a set of n Decision Making Units (DMUs), a standard DEA model is solved n times, one for each DMU. As the number of DMUs increases, the running-time to solve the standard model sharply rises. In this study, a new framework is proposed to significantly decrease the required DEA calculation time in comparison with the existing methodologies when a large set of DMUs (e.g., 20,000 DMUs or more) is present. The framework includes five steps: (i) selecting a subsample of DMUs using a proposed algorithm, (ii) finding the best-practice DMUs in the selected subsample, (iii) finding the exterior DMUs to the hull of the selected subsample, (iv) identifying the set of all efficient DMUs, and (v) measuring the performance scores of DMUs as those arising from the traditional DEA approach. The variable returns to scale technology is assumed and several simulation experiments are designed to estimate the running-time for applying the proposed method for big data. The obtained results in this study point out that the running-time is decreased by up to 99.9% in comparison with the existing techniques. In addition, we illustrate the essential computation time for applying the proposed method as a function of the number of DMUs (cardinality), number of inputs and outputs (dimension), and the proportion of efficient DMUs (density). The methods are also compared on a real data set consisting of 30,099 electric power plants in the United States from 1996 to 2016.

Adel Hatami-Marbini, Mehdi Toloo (2019) Data envelopment analysis models with ratio data: A revisit, In: Computers & Industrial Engineering 133, pp. 331-338, Elsevier Ltd

•We criticize some developed DEA models dealing with ratio data.
•We make modifications to explicitly overcome the flaws.
•We provide a case study in the education sector to validate our proposed approach.

The performance evaluation of for-profit and not-for-profit organisations is a unique tool to support the continuous improvement of processes. Data envelopment analysis (DEA) is widely known as a powerful technique for efficiency measurement. However, the lack of the ability to handle ratio measures is an ongoing challenge in DEA. The convexity axiom embedded in standard DEA models cannot be fully satisfied where the dataset includes ratio measures, and the results obtained from such models may not be correct and reliable. There is a typical approach to deal with the problem of ratio measures in DEA, in particular when the numerators and denominators of the ratio data are available. In this paper, we show that the current solutions may also fail to preserve the principal properties of DEA and may instigate some other flaws. We also make modifications to explicitly overcome the flaws and measure the performance of a set of operating units for the input and output orientations regardless of the assumed technology. Finally, a case study in the education sector is presented to illustrate the strengths and limitations of the proposed approach.

Josef Jablonsky, Ali Emrouznejad, Mehdi Toloo (2018) Editorial: Special issue on data envelopment analysis, In: Central European Journal of Operations Research 26(4), pp. 809-812, Springer Nature

Sepideh Abolghasem, Mehdi Toloo, Santiago Amezquita (2019) A dataset of healthcare systems for cross-efficiency evaluation in the presence of flexible measure, In: Data in Brief 25, 104239, Elsevier

This article presents the dataset of the healthcare systems indicators of 120 countries during 2010-2017, which is related to the research article "Cross-efficiency evaluation in the presence of flexible measures with an application to healthcare systems" [1]. The data is collected from the World Bank and selected for the 120 countries. Depending on their role in the performance of the healthcare systems, the indicators are categorized into input (I), output (O) and flexible measure (FM), where the FM measure can play either role of input or output in the healthcare system. The dataset can be used to perform efficiency as well as cross-efficiency analysis of the healthcare systems using methods such as data envelopment analysis (DEA) in the presence of flexible measure.

Maziar Salahi, Mehdi Toloo, Zeynab Hesabirad (2019) Robust Russell and enhanced Russell measures in DEA, In: The Journal of the Operational Research Society 70(8), pp. 1275-1283, Taylor & Francis

The Russell measure is among the non-radial measures for efficiency evaluation of decision making units in data envelopment analysis. Due to the nonlinearity of its objective function, an enhanced version has been proposed that can be linearized using the well-known Charnes-Cooper change of variables. In this article, we give equivalent formulations of the robust Russell measure and its enhanced models under interval and ellipsoidal uncertainties in their best- and worst-cases. We show that the built formulations stay convex for both best- and worst-cases under interval uncertainty as well as for the worst-case with ellipsoidal uncertainty. In other words, these formulations are nonconvex only for ellipsoidal uncertainty in the best-case. Some illustrative examples are provided to validate the new models.
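
For reference, the deterministic enhanced Russell measure that the robust formulations build on is the standard ratio model (textbook form, for DMU o with inputs \(x_{io}\) and outputs \(y_{ro}\)):

\[ \rho_o = \min \ \frac{\tfrac{1}{m}\sum_{i=1}^{m}\theta_i}{\tfrac{1}{s}\sum_{r=1}^{s}\varphi_r} \quad \text{s.t.} \quad \sum_{j=1}^{n}\lambda_j x_{ij} \le \theta_i x_{io},\ \ \sum_{j=1}^{n}\lambda_j y_{rj} \ge \varphi_r y_{ro},\ \ \theta_i \le 1,\ \ \varphi_r \ge 1,\ \ \lambda_j \ge 0, \]

which the Charnes-Cooper change of variables mentioned above converts into a linear program.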

Sepideh Abolghasem, Mehdi Toloo, Santiago Amezquita (2019)Cross-efficiency evaluation in the presence of flexible measures with an application to healthcare systems, In: Health care management science22(3)pp. 512-533 Springer Nature

In recent years, most countries around the world have struggled with the consequences of budget cuts in health expenditure, obliging them to utilize their resources efficiently. In this context, performance evaluation facilitates the decision-making process in improving the efficiency of the healthcare system. However, the performance evaluation of many sectors, including healthcare systems, is, on the one hand, a challenging issue and, on the other hand, a useful tool for decision-making with the aim of optimizing the use of resources. This study proposes a new methodology comprising two well-known analytical approaches: (i) data envelopment analysis (DEA) to measure the efficiencies and (ii) data science to complement the DEA model in providing insightful recommendations for strategic decision making on productivity enhancement. The suggested method is a first attempt to combine two DEA extensions: flexible measures and cross-efficiency. We develop a pair of benevolent and aggressive scenarios aiming at evaluating cross-efficiency in the presence of flexible measures. Next, we perform data mining cluster analysis to create groups of homogeneous countries. Organizing the data in similar groups facilitates identifying a set of benchmarks that perform similarly in terms of operating conditions. Comparing the benchmark set with poorly performing countries, we can obtain attainable goals for performance enhancement which will assist policymakers to act upon them strategically. A case study of healthcare systems in 120 countries is taken as an example to illustrate the potential application of our new method.
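As background to the cross-efficiency machinery used above, the sketch below computes plain CCR cross-efficiency scores (no flexible measures, no benevolent or aggressive secondary goals) on synthetic data. Optimal multiplier weights are generally non-unique, which is exactly why the paper's benevolent and aggressive scenarios matter; this sketch simply takes whatever weights the LP solver returns.

    # Simplified CCR cross-efficiency (no flexible measures, no secondary goals),
    # just to illustrate the mechanics; data are synthetic.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],    # m x n inputs
                  [3.0, 3.0, 1.0, 2.0, 4.0]])
    Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0],    # s x n outputs
                  [2.0, 1.0, 3.0, 2.0, 1.0]])
    m, n = X.shape
    s = Y.shape[0]

    def ccr_weights(d):
        """Multiplier CCR for DMU d: max u.y_d  s.t. v.x_d = 1, u.y_j - v.x_j <= 0, u,v >= 0."""
        c = np.r_[-Y[:, d], np.zeros(m)]                   # variables [u (s), v (m)]
        A_ub = np.c_[Y.T, -X.T]                            # u.y_j - v.x_j <= 0 for every j
        b_ub = np.zeros(n)
        A_eq = np.r_[np.zeros(s), X[:, d]].reshape(1, -1)  # v.x_d = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        return res.x[:s], res.x[s:]

    E = np.zeros((n, n))                                   # E[d, j]: efficiency of j under d's weights
    for d in range(n):
        u, v = ccr_weights(d)
        E[d, :] = (u @ Y) / (v @ X)

    print("cross-efficiency scores:", E.mean(axis=0).round(4))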

Francisco J. Santos Arteaga, Madjid Tavana, Debora Di Caprio, Mehdi Toloo (2019)A dynamic multi-stage slacks-based measure data envelopment analysis model with knowledge accumulation and technological evolution, In: European journal of operational research278(2)pp. 448-462 Elsevier

Dynamic data envelopment analysis (DEA) models are built on the idea that single period optimization is not fully appropriate to evaluate the performance of decision making units (DMUs) through time. As a result, these models provide a suitable framework to incorporate the different cumulative processes determining the evolution and strategic behavior of firms in the economics and business literatures. In the current paper, we incorporate two distinct complementary types of sequentially cumulative processes within a dynamic slacks-based measure DEA model. In particular, human capital and knowledge, constituting fundamental intangible inputs, exhibit a cumulative effect that goes beyond the corresponding factor endowment per period. At the same time, carry-over activities between consecutive periods will be used to define the pervasive effect that technology and infrastructures have on the productive capacity and efficiency of DMUs. The resulting dynamic DEA model accounts for the evolution of the knowledge accumulation and technological development processes of DMUs when evaluating both their overall and per period efficiency. Several numerical examples and a case study are included to demonstrate the applicability and efficacy of the proposed method. (C) 2018 Elsevier B.V. All rights reserved.

Mehdi Toloo, Madjid Tavana, Francisco J. Santos-Arteaga (2019)An integrated data envelopment analysis and mixed integer non-linear programming model for linearizing the common set of weights, In: Central European journal of operations research27(4)pp. 887-904 Springer Nature

The problem of ranking efficient decision making units (DMUs) is of interest from both theoretical and practical points of view. In this paper, we propose an integrated data envelopment analysis and mixed integer non-linear programming (MINLP) model to find the most efficient DMU using a common set of weights. We linearize the MINLP model to an equivalent mixed integer linear programming (MILP) model by eliminating the non-linear constraints in which the products of variables are incorporated. The formulated MILP model is simpler and computationally more efficient. In addition, we introduce a model for finding the value of epsilon, since the improper choice of the non-Archimedean epsilon may result in infeasible conditions. We use a real-life facility layout problem to demonstrate the applicability and exhibit the efficacy of the proposed model.
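The linearization step described above hinges on eliminating products of variables. A standard device for a product of a binary and a bounded continuous variable is the big-M reformulation sketched below; this is the textbook trick only, not necessarily the exact reformulation used in the paper, and M, the variable names, and the brute-force check are illustrative.

    # Generic big-M linearization of z = d * w with d binary and 0 <= w <= M.
    # Not the paper's exact reformulation; just the standard device it relies on.
    import random

    M = 10.0  # known upper bound on w

    def linearized_feasible(z, d, w, tol=1e-9):
        """The four linear constraints that replace z = d * w."""
        return (z <= M * d + tol and
                z <= w + tol and
                z >= w - M * (1 - d) - tol and
                z >= -tol)

    # Brute-force check: for every binary d and sampled w, the only feasible z is d*w.
    random.seed(0)
    for d in (0, 1):
        for _ in range(1000):
            w = random.uniform(0.0, M)
            assert linearized_feasible(d * w, d, w)             # z = d*w is feasible
            assert not linearized_feasible(d * w + 0.01, d, w)  # larger z is cut off
            assert not linearized_feasible(d * w - 0.01, d, w)  # smaller z is cut off
    print("big-M linearization check passed")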

Reza Babazadeh, Mohammad Khalili, Mehdi Toloo (2020)A Data Envelopment Analysis Method for Location Optimization of Microalgae Cultivation: A Case Study, In: Waste and biomass valorization11(1)pp. 173-186 Springer Nature

Environmental issues and the depletion of fossil energy resources have prompted researchers and practitioners to seek ways of substituting fossil energy resources with renewable ones. Biodiesel is a green fuel which is produced from different oleaginous biomass. Nevertheless, producing biodiesel from edible feedstock is strongly criticized by the Food and Agriculture Organization. Recently, microalgae have been identified as a source that can both purify wastewater and serve as an appropriate feedstock for biodiesel production. The high potential of microalgae to produce biodiesel and their low-cost recovery in large-scale production encourage investors to utilize microalgae for biodiesel production. Accordingly, selecting the best locations for microalgae cultivation has a great impact on the economic viability of biodiesel production from microalgae. This paper studies the application of a data envelopment analysis (DEA) approach to selecting the best locations for microalgae cultivation based on ecological and economic factors. The DEA method is applied to a real case in Iran. Moreover, the well-known principal component analysis and numerical taxonomy methods are used for verification and validation of the applied DEA approach. The results confirm the applicability of the DEA approach in selecting suitable locations for microalgae cultivation areas.

Bohlool Ebrahimi, Mehdi Toloo (2020)Efficiency bounds and efficiency classifications in imprecise DEA: An extension, In: The Journal of the Operational Research Society71(3)pp. 491-504 Taylor & Francis

Park proposed a pair of mathematical data envelopment analysis (DEA) models to estimate the lower and upper bound of efficiency scores in the presence of imprecise data. This article illustrates that his approach suffers from some drawbacks: (i) it may convert weak ordinal data into an incorrect set of precise data; (ii) it utilizes various production frontiers to obtain an interval efficiency score for each decision making unit (DMU); (iii) in the absence of exact output data (pure ordinal output data), the approach leads to a meaningless model; (iv) the built model is infeasible with pure ordinal input data; (v) it may include free or unlimited production output which results in unreliable and suspicious results. Moreover, the utilized models by Park involve a positive lower bound (non-Archimedean epsilon) for the weights to deter them from being zero. However, the author ignored the requirement of determining a suitable value for the epsilon. This study constructs two new DEA models with a fixed and unified production frontier (the same constraint set) to compute the upper and lower bounds of efficiency. It is demonstrated that the suggested models can successfully capture the aforementioned shortcomings. Although these models are also epsilon-based, a new model is developed to obtain a suitable epsilon value for the proposed models. It is proved that the suggested approach effectively eliminates all the weaknesses. Additionally, a case study of Iranian Space Agency (ISA) industry is taken as an example to illustrate the superiority of the new approach over the previous ones.

Mehdi Toloo, Jana Hanclova (2020)Multi-valued measures in DEA in the presence of undesirable outputs, In: Omega (Oxford)94 Elsevier

Data envelopment analysis (DEA) evaluates the relative efficiency of a set of comparable decision making units (DMUs) with multiple performance measures (inputs and outputs). Classical DEA models rely on the assumption that each DMU can improve its performance by increasing its current output levels and decreasing its current input levels. However, undesirable outputs (like wastes and pollutants) may often be produced together with desirable outputs in final products and have to be minimized. On the other hand, in some real-world situations, we may encounter specific performance measures with more than one value, measured by various standards. In this study, we refer to such measures as multi-valued measures, of which only one value should be selected. For instance, the unemployment rate is a multi-valued measure in economic applications since there are several definitions or standards to measure it. As a result, selecting a suitable value for a multi-valued measure is a challenging issue and is crucial for successful application of DEA. The aim of this study is to accommodate multi-valued measures in the presence of undesirable outputs. In doing so, we formulate two individual and summative selecting directional distance models and develop a pair of multiplier- and envelopment-based selecting approaches. Finally, we illustrate the applicability of the proposed method using real data on 183 NUTS 2 regions in 23 selected EU-28 countries. (C) 2019 Elsevier Ltd. All rights reserved.

The concept of sustainability consists of three main dimensions: environmental, techno-economic, and social. Measuring the sustainability status of a system or technology is a significant challenge, especially when it needs to consider a large number of attributes in each dimension of sustainability. In this study, we first propose a hybrid approach, involving data envelopment analysis (DEA) and multi-attribute decision making (MADM) methodologies, for computing an index for each dimension of sustainability, and then we define the overall sustainability index as the mean of the three measured indexes. Towards this end, we define new concepts of efficiency and cross-efficiency of order (p, q), where p and q are the number of inputs and outputs, respectively. For a given (p, q), we address the problem of finding the efficiency of order (p, q) by developing a novel DEA-based selecting method. Finally, we define the sustainability index as a weighted sum of all possible cross-efficiencies of order (p, q). From a computational viewpoint, the proposed selecting model significantly decreases the computational burden in comparison with the successive solving of traditional DEA models. A case study of the electricity-generation technologies in the United Kingdom is taken as a real-world example to illustrate the potential application of our method.

Mohammad Izadikhah, Elnaz Azadi, Majid Azadi, Reza Farzipoor Saen, Mehdi Toloo (2020)Developing a new chance constrained NDEA model to measure performance of sustainable supply chains, In: Annals of operations research316(2)pp. 1319-1347 Springer Nature

Owing to the increasing importance of sustainable supply chain management (SSCM), it has received much attention from both corporate and academic communities over the past decade. SSCM performance evaluation plays a crucial role in organizational success. One of the practical techniques that can be used for SSCM performance assessment is network data envelopment analysis (NDEA). This paper develops a new NDEA model for the performance evaluation of SSCM in the presence of stochastic data. The proposed model can evaluate the efficiency of SSCM under uncertain conditions. A case study in the soft drinks industry is presented to demonstrate the efficacy of the proposed method.

Lina P. Navas, Felipe Montes, Sepideh Abolghasem, Ricardo J. Salas, Mehdi Toloo, Roberto Zarama (2020)Colombian higher education institutions evaluation, In: Socio-economic planning sciences71pp. 1-11 Elsevier

Over the last twenty years, access to higher education has grown extraordinarily in Latin America. Higher education systems have been challenged to improve their efficiency while strengthening quality assurance processes. In Colombia, the government and the researchers developed models to assess the performance of Higher Education Institutions (HEIs). Nevertheless, the current scholarship does not have a model that allows the system to measure multiple efficiencies in a diverse environment. In this study, we address the challenge of evaluating the efficiency of HEIs taking into account different goals of the Colombian education system. To this aim, we extend a cross-efficiency data envelopment analysis (DEA) approach to evaluate the efficiency of Colombian HEIs in the presence of flexible measures. While some HEIs are efficient in terms of teaching or employment, others are efficient in terms of research. Therefore, the model suggests broader policies to achieve the efficiency of the institutions under multiple goals.

Bohlool Ebrahimi, Madjid Tavana, Mehdi Toloo, Vincent Charles (2020)A novel mixed binary linear DEA model for ranking decision-making units with preference information, In: Computers & industrial engineering149 Elsevier

Several mixed binary linear programming models have been proposed in the literature to rank decision-making units (DMUs) in data envelopment analysis (DEA). However, some of these models fail to consider the decision-makers' preferences. We propose a new mixed binary linear DEA model for finding the most efficient DMU by considering the decision-makers' preferences. The model proposed in this study is motivated by the approach introduced by Toloo and Salahi (2018). We extend their model by introducing additional assurance region type I (ARI) weight restrictions (WRs) based on the decision-makers' preferences. We show that direct addition of assurance region type II (ARII) and absolute WRs in traditional DEA models leads to infeasibility and free production problems, and we prove ARI eliminates these problems. We also show our epsilon-free model is less complicated and requires less effort to determine the best efficient unit compared with the existing epsilon-based models in the literature. We provide two real-life applications to show the applicability and exhibit the efficacy of our model.
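To illustrate how assurance region type I (ARI) restrictions enter a multiplier DEA model as ordinary linear rows, the sketch below adds a hypothetical bound L <= v1/v2 <= U on two input weights to the plain CCR multiplier model. The data and bounds are illustrative, and the sketch is not the paper's epsilon-free selection model.

    # Multiplier CCR model with an assurance-region-type-I (ARI) restriction on the
    # two input weights: L <= v1/v2 <= U, written as two extra linear rows.
    # Data and bounds are illustrative, not those of the paper's applications.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],
                  [3.0, 3.0, 1.0, 2.0, 4.0]])     # m x n
    Y = np.array([[1.0, 1.0, 1.0, 1.0, 1.0],
                  [2.0, 1.0, 3.0, 2.0, 1.0]])     # s x n
    m, n = X.shape
    s = Y.shape[0]
    L, U = 0.5, 2.0                               # hypothetical ARI bounds on v1/v2

    def ccr_ari(o):
        # variables: [u_1..u_s, v_1..v_m]; maximize u.y_o (minimize its negative)
        c = np.r_[-Y[:, o], np.zeros(m)]
        rows = [np.r_[Y[:, j], -X[:, j]] for j in range(n)]   # u.y_j - v.x_j <= 0
        rows.append(np.r_[np.zeros(s), -1.0,  L])             # L*v2 - v1 <= 0
        rows.append(np.r_[np.zeros(s),  1.0, -U])             # v1 - U*v2 <= 0
        A_ub = np.array(rows)
        b_ub = np.zeros(n + 2)
        A_eq = np.r_[np.zeros(s), X[:, o]].reshape(1, -1)     # v.x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        return -res.fun

    for o in range(n):
        print(f"DMU {o + 1}: ARI-restricted efficiency = {ccr_ari(o):.4f}")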

Kaoru Tone, Mehdi Toloo, Mohammad Izadikhah (2020)A modified slacks-based measure of efficiency in data envelopment analysis, In: European journal of operational research287(2)pp. 560-571 Elsevier

The slacks-based measure (SBM) model can divide the set of observations into two mutually exclusive and collectively exhaustive sets: efficient and inefficient. However, it fails to provide more details about efficient DMUs, which reveals the lack of discrimination power in the SBM model. With the aim of addressing this issue, the super SBM (SupSBM) model has been suggested, which can rank the SBM-efficient DMUs without providing any useful information about SBM-inefficient DMUs. As a result, in order to fully rank both efficient and inefficient DMUs, one needs to run both the SBM and SupSBM models, which leads to a significant increase in the number of required computations. This paper tackles this problem and modifies the SBM model so that it measures the SBM-efficiency score for inefficient DMUs and the SupSBM-efficiency score for strongly efficient DMUs simultaneously. Finally, a simulation study is presented to illustrate the superiority of our proposed model over the existing models with various problem sizes. (C) 2020 Elsevier B.V. All rights reserved.
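For context, the sketch below solves the classical SBM model, the formulation the modified model starts from, through its standard Charnes-Cooper-style linearization; the data are illustrative and the paper's modified, simultaneously ranking model is not reproduced here.

    # Classical SBM linearization (Charnes-Cooper style t-transformation) that the
    # modified model in the paper starts from; illustrative data only.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[4.0, 6.0, 8.0, 8.0],     # m x n inputs (strictly positive)
                  [3.0, 2.0, 4.0, 6.0]])
    Y = np.array([[2.0, 3.0, 6.0, 4.0]])    # s x n outputs (strictly positive)
    m, n = X.shape
    s = Y.shape[0]

    def sbm(o):
        # variables: [t, Lam_1..Lam_n, Sminus_1..m, Splus_1..s]
        nv = 1 + n + m + s
        c = np.zeros(nv)
        c[0] = 1.0
        c[1 + n:1 + n + m] = -1.0 / (m * X[:, o])            # t - (1/m) sum Sminus_i / x_io
        A_eq, b_eq = [], []
        row = np.zeros(nv); row[0] = 1.0
        row[1 + n + m:] = 1.0 / (s * Y[:, o])                # t + (1/s) sum Splus_r / y_ro = 1
        A_eq.append(row); b_eq.append(1.0)
        for i in range(m):                                   # X@Lam + Sminus = t * x_o
            row = np.zeros(nv)
            row[0] = -X[i, o]; row[1:1 + n] = X[i, :]; row[1 + n + i] = 1.0
            A_eq.append(row); b_eq.append(0.0)
        for r in range(s):                                   # Y@Lam - Splus = t * y_o
            row = np.zeros(nv)
            row[0] = -Y[r, o]; row[1:1 + n] = Y[r, :]; row[1 + n + m + r] = -1.0
            A_eq.append(row); b_eq.append(0.0)
        res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=[(0, None)] * nv)
        return res.fun                                       # rho* = tau*

    for o in range(n):
        print(f"DMU {o + 1}: SBM efficiency = {sbm(o):.4f}")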

Mohammad Nemati, Reza Kazemi Matin, Mehdi Toloo (2020)A two-stage DEA model with partial impacts between inputs and outputs: application in refinery industries, In: Annals of operations research295(1)pp. 285-312 Springer Nature

Conventional data envelopment analysis (DEA) methods are useful for estimating the performance measure of decision making units (DMUs) in which each DMU uses multiple inputs to produce multiple outputs, without considering any partial impacts between inputs and outputs. Nevertheless, there are some real-world situations where DMUs may possess several production lines with a two-stage network structure in which each production line uses inputs according to its needs. The current paper extends the recent work by Ma (Expert Syst Appl Int J 42:4339-4347, 2015) to consider partial impacts between inputs and outputs for two-stage network production systems. Toward this end, we consider several input-output bundles in each stage for production lines. We formulate a couple of new mathematical programming models in the DEA framework with the aim of considering partial impacts between inputs and outputs for calculating aggregate, overall, and subunit efficiencies along with resource usage by production lines for a two-stage production system. Finally, an application in refinery industries is provided as an example to illustrate the potential application of the proposed method.

Seddigheh Babaee, Mehdi Toloo, Elke Hermans, Yongjun Shen (2021)A new approach for index construction: The case of the road user behavior index, In: Computers & industrial engineering152 Elsevier Ltd

•An index is constructed using a common weight approach representing the hierarchical structure of indicators.•The suggested model successfully addresses the reality, complexity, discrimination power, and fairness issues.•A case study of a road user behavior index for 13 European countries is provided. In recent years, composite indicators have become increasingly recognized as a useful tool for performance evaluation, benchmarking, and decision-making by summarizing complex and multidimensional issues. In this study, we focus on the application of data envelopment analysis (DEA) on index construction in the context of road safety and highlight the shortcomings of using the classical DEA models. The DEA method assigns a weight to each indicator by selecting the best set of weights for the unit under evaluation. The flexibility in selecting the weights in the classical DEA approach may lead to two interrelated problems: compensability and unfairness. These shortcomings are, respectively, overcome traditionally by imposing weight restrictions and applying a common weights approach. However, the problem of evaluating a layered hierarchy of indicators with a common set of weights (CSW) has not been addressed in the literature. To fill this gap, we propose a new approach for index construction to determine an optimal CSW to assess all units simultaneously while reflecting the hierarchical structure of the indicators in the model. The applicability of the suggested common-weight approach is illustrated by a case study on constructing a road user behavior index for a set of European countries. From a theoretical point of view, our approach provides a fair and identical basis for evaluation and comparison of countries in terms of driver’s behaviors and, from a practical point of view, it significantly reduces the required computational burden for solving the formulated model. The obtained results clarify the sharper discrimination power of our model compared to the other methods in the literature.

Mehdi Toloo, Esmaeil Keshavarz, Adel Hatami-Marbini (2021)An interval efficiency analysis with dual-role factors, In: OR SPECTRUM43(1)pp. 255-287 Springer Nature

Data envelopment analysis (DEA) is a data-driven and benchmarking tool for evaluating the relative efficiency of production units with multiple outputs and inputs. Conventional DEA models are based on a production system by converting inputs to outputs using input-transformation-output processes. However, in some situations, it is inescapable to think of some assessment factors, referred to as dual-role factors, which can play simultaneously input and output roles in DEA. The observed data are often assumed to be precise although it needs to consider uncertainty as an inherent part of most real-world applications. Dealing with imprecise data is a perpetual challenge in DEA that can be treated by presenting the interval data. This paper develops an imprecise DEA approach with dual-role factors based on revised production possibility sets. The resulting models are a pair of mixed binary linear programming problems that yield the possible relative efficiencies in the form of intervals. In addition, a procedure is presented to assign the optimal designation to a dual-role factor and specify whether the dual-role factor is a nondiscretionary input or output. Given the interval efficiencies, the production units are categorized into the efficient and inefficient sets. Beyond the dichotomized classification, a practical ranking approach is also adopted to achieve incremental discrimination through evaluation analysis. Finally, an application to third-party reverse logistics providers is studied to illustrate the efficacy and applicability of the proposed approach.

Sajad Kazemi, Madjid Tavana, Mehdi Toloo, Nikolay A. Zenkevich (2021)A common weights model for investigating efficiency-based leadership in the russian banking industry, In: R.A.I.R.O. Recherche opérationnelle55(1)pp. 213-229 Edp Sciences S A

In this race for productivity, the most successful leaders in the banking industry are those with high-efficiency and a competitive edge. Data envelopment analysis is one of the most widely used methods for measuring efficiency in organizations. In this study, we use the ideal point concept and propose a common weights model with fuzzy data and non-discretionary inputs. The proposed model considers environmental criteria with uncertain data to produce a full ranking of homogenous decision-making units. We use the proposed model to investigate the efficiency-based leaders in the Russian banking industry. The results show that the unidimensional and unilateral assessment of leading organizations solely according to corporate size is insufficient to characterize industry leaders effectively. In response, we recommend a multilevel, multicomponent, and multidisciplinary evaluation framework for a more reliable and realistic investigation of leadership at the network level of analysis.

Madjid Tavana, Mehdi Toloo, Nazila Aghayi, Aliasghar Arabmaldar (2021)A robust cross-efficiency data envelopment analysis model with undesirable outputs, In: Expert systems with applications167 Elsevier Ltd

•Data envelopment analysis (DEA) is challenged by imprecise, uncertain, and stochastic data.•Two DEA adaptations (interval and robust) are developed with uncertain data and undesirable outputs.•An epsilon-based robust interval cross-efficiency model is extended.•An example and a real-world application are presented to compare our method with an interval method.•The ability of our method to improve discernibility among DMUs is demonstrated. Degenerate optimal weights and uncertain data are two challenging problems in conventional data envelopment analysis (DEA). Cross-efficiency and robust optimization are commonly used to handle such problems. We develop two DEA adaptations to rank decision-making units (DMUs) characterized by uncertain data and undesirable outputs. The first adaptation is an interval approach, where we propose lower- and upper-bounds for the efficiency scores and apply a robust cross-efficiency model to avoid problems of non-unique optimal weights and uncertain data. We initially use the proposed interval approach and categorize DMUs into fully efficient, efficient, and inefficient groups. The second adaptation is a robust approach, where we rank the DMUs, with a measure of cross-efficiency that extends the traditional classification of efficient and inefficient units. Results show that we can obtain higher discriminatory power and higher-ranking stability compared with the interval models. We present an example from the literature and a real-world application in the banking industry to demonstrate this capability.

Mehdi Toloo, Esmaeil Keshavarz, Adel Hatami-Marbini (2021)Selecting data envelopment analysis models: A data-driven application to EU countries, In: Omega (Oxford)101 Elsevier

Data envelopment analysis (DEA) is a non-parametric data-driven approach for evaluating the efficiency of a set of homogeneous decision-making units (DMUs) with multiple inputs and multiple outputs. The number of performance factors (inputs and outputs) plays a crucial role when applying DEA to real-world applications. In other words, if the number of performance factors is significantly greater than the number of DMUs, it is highly possible to arrive at a large portion of efficient DMUs, which practically may become problematic due to the lack of ample discrimination among DMUs. The current research aims to develop an array of selecting DEA models to narrow down the performance factors based upon a rule of thumb. To this end, we show that the input-and output-oriented selecting DEA models may select different factors and then present the integrated models to identify a set of common factors for both orientations. In addition to efficiency evaluation at the individual level, we study structural efficiency with a single production unit at the industry level. Finally, a case study on the EU countries is presented to give insight into business innovation, social economy and growth with regard to the efficiency of the EU countries and entire EU. (C) 2020 Elsevier Ltd. All rights reserved.

Maziar Salahi, Mehdi Toloo, Narges Torabi (2021)A new robust optimization approach to common weights formulation in DEA, In: The Journal of the Operational Research Society72(6)pp. 1390-1402 Taylor & Francis

Flexibility in selecting the weights of inputs and outputs in data envelopment analysis models and uncertainty associated with the data might lead to unreliable efficiency scores. In this paper, to avoid these problems, we first discuss the robust Charnes, Cooper and Rhodes (CCR) model under the Bertsimas and Sim approach. Then, the robust CCR solutions are used to find a robust common set of weights under the norm-1 and Bertsimas and Sim approaches. Finally, on two numerical real-world examples, the performance of the proposed approach is compared with a similar recent approach from the literature to show the advantages of the new method and its applicability.

Linear fractional programming has been an important planning tool for the past four decades. The main contribution of this study is to show that, under some assumptions, a linear programming problem has two different dual problems (one a linear program and the other a linear fractional functional program) that are equivalent. In other words, we formulate a linear programming problem that is equivalent to the general linear fractional functional programming problem. These equivalent models have some interesting properties which help us to prove the related duality theorems in an easy manner. A traditional data envelopment analysis (DEA) model is taken, as an instance, to illustrate the applicability of the proposed approach.
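The classical Charnes-Cooper transformation is the standard bridge between linear fractional programs and linear programs of the kind discussed above. The toy instance below solves a small fractional program through its equivalent LP and recovers the fractional solution; it is a generic illustration, not the particular dual construction of the paper.

    # Charnes-Cooper transformation: solve a small linear fractional program by the
    # equivalent LP (illustrative instance, not one from the paper).
    import numpy as np
    from scipy.optimize import linprog

    # maximize (c.x + alpha) / (d.x + beta)  s.t.  A x <= b, x >= 0
    c = np.array([2.0, 1.0]); alpha = 0.0
    d = np.array([1.0, 3.0]); beta = 4.0          # denominator positive on the feasible set
    A = np.array([[1.0, 1.0],
                  [2.0, 1.0]])
    b = np.array([4.0, 6.0])

    # LP in (y, t):  max c.y + alpha*t  s.t.  A y - b t <= 0,  d.y + beta*t = 1,  y, t >= 0
    obj = np.r_[-c, -alpha]                               # linprog minimizes
    A_ub = np.c_[A, -b]
    b_ub = np.zeros(A.shape[0])
    A_eq = np.r_[d, beta].reshape(1, -1)
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * 3)

    y, t = res.x[:2], res.x[2]
    x = y / t                                             # recover the fractional-program solution
    ratio = (c @ x + alpha) / (d @ x + beta)
    print("x* =", x.round(4), " objective ratio =", round(ratio, 4),
          " LP value =", round(-res.fun, 4))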

Madjid Tavana, Mohammad Izadikhah, Mehdi Toloo, Razieh Roostaee (2021)A new non-radial directional distance model for data envelopment analysis problems with negative and flexible measures, In: Omega (Oxford)102 Elsevier Ltd

•Traditional data envelopment analysis measures technical (radial) efficiency.•The input and output status of each performance measure is assumed known.•The data associated with each performance measure is assumed non-negative.•A new non-radial directional distance model is proposed to relax these assumptions.•A case study in the automotive industry demonstrates the efficacy of this approach. Data envelopment analysis (DEA) is a mathematical approach for evaluating the efficiency of decision-making units that convert multiple inputs into multiple outputs. Traditional DEA models measure technical (radial) efficiencies by assuming the input and output status of each performance measure is known, and the data associated with the performance measures are non-negative. These assumptions are restrictive and limit the applications of DEA to real-world problems. We propose a new extended non-radial directional distance model, which is a variant of the weighted additive model, to cope with negative data. We then extend our model and use flexible measures, which play the role of both inputs and outputs, to cope with the unknown status of the performance measures. We also present a case study in the automotive industry to exhibit the efficacy of the models proposed in this study.

Mehdi Toloo, Bohlool Ebrahimi, Gholam R. Amin (2021)New data envelopment analysis models for classifying flexible measures: The role of non-Archimedean epsilon, In: European journal of operational research292(3)pp. 1037-1050 Elsevier B.V

•The role of epsilon in classifier data envelopment analysis models has been investigated.•An epsilon-finder model in the envelopment form has been formulated.•A pair of epsilon-based multiplier and envelopment classifier models have been developed.•An approach for finding a suitable epsilon value for our developed classifier models has been extended.•A case study of the Iranian Space Research Center has been employed to illustrate the applicability of our new epsilon-based approach. Some input-output classifier data envelopment analysis (DEA) models in multiplier and envelopment forms were developed to designate the status of flexible measures, playing either input or output roles. These models ignore the role of non-Archimedean epsilon in the input-output classification process. We show that these epsilon-free models may ignore some flexible measures in the performance evaluation process and hence the status of such flexible measure(s) can be randomly and inappropriately identified. To fill this gap, we develop a pair of epsilon-based multiplier and envelopment classifier models. We also develop an approach to find a suitable epsilon value for our developed classifier models. A case study of the supplier selection problem in the Iranian Space Research Center (ISRC) is provided to illustrate the potential application of our new epsilon-based approach.

Siamak Talatahari, Mahdi Azizi, Mehdi Toloo (2021)Fuzzy Adaptive Charged System Search for global optimization, In: Applied soft computing109 Elsevier B.V

This study proposes a new fuzzy adaptive Charged System Search (CSS) for global optimization. The suggested algorithm includes a parameter tuning process based on fuzzy logic with the aim of improving its performance. In this regard, four linguistic variables are defined which configure a fuzzy system for parameter identification of the standard CSS algorithm. This process focuses the algorithm on global search in the initial iterations, while local search is emphasized in the final iterations. Twenty mathematical benchmark functions, the CEC 2020 benchmark suite from the Competition on Evolutionary Computation (CEC), three well-known constrained problems, and two engineering problems are utilized to validate the new algorithm. Moreover, the performance of the new algorithm is compared and contrasted with other metaheuristic algorithms. The obtained results reveal the superiority of the proposed approach in dealing with different unconstrained, constrained, and engineering design problems. •Fuzzy Adaptive Charged System Search (FACSS) algorithm is presented.•FACSS is investigated through mathematical and engineering problems.•Statistical analysis proves the superiority of the FACSS algorithm.

Xianhua Wu, Zhiyong Ji, Yeming Gong, Yufeng Chen, Mehdi Toloo (2021)Haze emission efficiency assessment and governance for sustainable development based on an improved network data envelopment analysis method, In: Journal of cleaner production317 Elsevier Ltd

Accurate evaluation of emission governance efficiency provides a foundation for developing haze control strategies towards sustainable development. Based on the features of haze, we view the haze formation stage as the first sub-process and the haze control stage as the second sub-process. This paper proposes an additive aggregation network data envelopment analysis (DEA) model with undesirable intermediate measures and undesirable outputs, which have not been thoroughly studied in previous literature. We found that the newly developed network DEA model is nonlinear and cannot be converted into a linear program, so we developed an improved second-order cone programming approach to solve this problem. After analyzing the data of haze control in China, we drew the following conclusions. Firstly, different preference weights for the two sub-processes can lead to variation in the overall efficiency. Under different preference weights, although the efficiency of haze formation changes very little in some provinces, the efficiency of haze control changes considerably. Secondly, decision makers can achieve the goal of reducing haze by adjusting their preferences regarding the haze formation and haze control stages, which is helpful for policy making in haze control strategy and sustainable development.

Aliasghar Arabmaldar, Emmanuel Kwasi Mensah, Mehdi Toloo (2021)Robust worst-practice interval DEA with non-discretionary factors, In: Expert systems with applications182 Elsevier

Traditionally, data envelopment analysis (DEA) evaluates the performance of decision-making units (DMUs) with the most favorable weights on the best practice frontier. In this regard, less emphasis is placed on non-performing or distressed DMUs. To identify the worst performers in risk-taking industries, the worst-practice frontier (WPF) DEA model has been proposed. However, the model does not consider evaluation under an uncertain environment. In this paper, we examine WPF-DEA from its basics and further propose novel robust WPF-DEA models in the presence of interval data uncertainty and non-discretionary factors. The proposed approach is based on robust optimization, where uncertain input and output data are constrained in an uncertainty set. We first discuss the applicability of worst-practice DEA models to a broad range of application domains and then consider the selection of the worst-performing suppliers in supply chain decision analysis, where some factors are unknown and not under the discretion of management. Using Monte Carlo simulation, we compute the conformity of the rankings in the interval efficiency as well as determine the price of robustness for selecting the worst-performing suppliers.

Mohammad Izadikhah, Majid Azadi, Mehdi Toloo, Farookh Khadeer Hussain (2021)Sustainably resilient supply chains evaluation in public transport: A fuzzy chance-constrained two-stage DEA approach, In: Applied soft computing113 Elsevier B.V

Owing to today’s highly competitive market environments, substantial attention has been focused on sustainably resilient supply chains (SCs) over the last few years. Nevertheless, very few studies have focused on the efficiency evaluation analysis of the sustainability and resilience of SCs as an inevitable essential in any profitable business. This study aims to address this issue by proposing a novel fuzzy chance-constrained two-stage data envelopment analysis (DEA) model as an advanced and rigorous approach in the performance evaluation of sustainably resilient SCs. To the best of our knowledge, the current study is pioneering as it introduces a new fuzzy chance-constrained two-stage method that can be used to undertake the deterministic non-fuzzy programming of the proposed model. The proposed approach is validated and applied to evaluate a real case study including 21 major public transport providers in three megacities. The results demonstrate the advantages of the proposed approach in comparison to the existing approaches in the literature. •A novel fuzzy chance-constrained two-stage data envelopment analysis model is developed.•The sustainably resilient supply chains of 21 major public transport providers in three megacities are investigated.•The results illustrate the superiority of our proposed model over the black-box model.

Mehdi Toloo, Maziar Salahi (2018)A powerful discriminative approach for selecting the most efficient unit in DEA, In: Computers and Industrial Engineering115pp. 269-277 Elsevier

Data envelopment analysis (DEA) is a mathematical approach that deals with the performance evaluation problem. Traditional DEA models partition the set of units into two distinct sets: efficient and inefficient. These models fail to provide further information about efficient units, whereas in some applications, known as selection-based problems, the concern is selecting only a single efficient unit. To address the problem, several mixed integer linear/nonlinear programming models have been developed in the literature using DEA. The aim of all these approaches is formulating a model with more discriminating power. This paper presents a new nonlinear mixed integer programming model with significantly higher discriminating power than the existing ones in the literature. The suggested model lets the efficiency score of only a single unit be strictly greater than one. It is observed that the discrimination power of the model is high enough for fully ranking all units. More importantly, a linearization technique is used to formulate an equivalent mixed integer linear programming model which significantly decreases the computational burden. Finally, to validate the proposed model and also compare it with some recent approaches, two numerical examples are utilized from the literature. Our findings point out the superiority of our model over all the previously suggested models from both theoretical and practical standpoints.

Mehdi Toloo, Madjid Tavana (2017)A novel method for selecting a single efficient unit in data envelopment analysis without explicit inputs/outputs, In: Annals of Operations Research253(1)pp. 657-681 Springer Nature

Data Envelopment Analysis (DEA) is a non-parametric technique for evaluating a set of homogeneous decision-making units (DMUs) with multiple inputs and multiple outputs. Various DEA methods have been proposed to rank all the DMUs or to select a single efficient DMU with a single constant input and multiple outputs [i.e., without explicit inputs (WEI)] as well as multiple inputs and a single constant output [i.e., without explicit outputs (WEO)]. However, the majority of these methods are computationally complex and difficult to use. This study proposes an efficient method for finding a single efficient DMU, known as the most efficient DMU, under WEI and WEO conditions. Two compact forms are introduced to determine the most efficient DMU without solving an optimization model under the DEA-WEI and DEA-WEO conditions. A comparative analysis shows a significant reduction in the computational complexity of the proposed method over previous studies. Four numerical examples from different contexts are presented to demonstrate the applicability and exhibit the effectiveness of the proposed compact forms.
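The DEA-WEI setting above (a single constant input) reduces to a small multiplier LP per DMU, sketched below on illustrative data; the paper's contribution is a compact closed form that avoids solving these LPs altogether, which is not reproduced here.

    # DEA-WEI (single constant input) multiplier model solved as an LP per DMU.
    # The paper derives a compact form that avoids these LPs; this sketch only
    # shows the underlying model, on illustrative output data.
    import numpy as np
    from scipy.optimize import linprog

    Y = np.array([[1.0, 2.0, 3.0, 2.5],      # s x n outputs, every DMU has the same unit input
                  [4.0, 3.0, 1.0, 2.0]])
    s, n = Y.shape

    def wei_efficiency(o):
        # max u.y_o  s.t.  u.y_j <= 1 for all j,  u >= 0
        res = linprog(-Y[:, o], A_ub=Y.T, b_ub=np.ones(n), bounds=[(0, None)] * s)
        return -res.fun

    for o in range(n):
        print(f"DMU {o + 1}: DEA-WEI efficiency = {wei_efficiency(o):.4f}")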

Adel Hatami-Marbini, Mehdi Toloo (2017)An extended multiple criteria data envelopment analysis model, In: Expert Systems with Applications73pp. 201-219 Elsevier

Several researchers have adapted the data envelopment analysis (DEA) models to deal with two inter-related problems: weak discriminating power and unrealistic weight distribution. The former problem arises as an application of DEA in situations where decision-makers seek to reach a complete ranking of units, and the latter problem refers to situations in which the basic DEA model simply rates units 100% efficient on account of irrational input and/or output weights and an insufficient number of degrees of freedom. Improving discrimination power and yielding more reasonable dispersion of input and output weights simultaneously remain a challenge for DEA and multiple criteria DEA (MCDEA) models. This paper puts emphasis on weight restrictions to boost discriminating power as well as to generate true weight dispersion of MCDEA when a priori information about the weights is not available. To this end, we modify a very recent MCDEA model in the literature by determining an optimum lower bound for input and output weights. The contribution of this paper is sevenfold: first, we show that a larger value for the lower bound on weights often leads to improved discriminating power and more realistic weights in MCDEA models due to imposing more weight restrictions; second, a procedure for sensitivity analysis is designed to define stability for the weights of each evaluation criterion; third, we extend a weighted MCDEA model to three evaluation criteria based on the maximum lower bound for input and output weights; fourth, we develop a super-efficiency model for efficient units under the proposed MCDEA model in this paper; fifth, we extend an epsilon-based minsum BCC-DEA model to pursue our research objectives under variable returns to scale (VRS); sixth, we present a simulation study to statistically analyze weight dispersion and rankings among five different methods using non-parametric tests; and seventh, we demonstrate the applicability of the proposed models with an application to European Union member countries.

Mehdi Toloo, Ali Emrouznejad, Placido Moreno (2017)A linear relational DEA model to evaluate two-stage processes with shared inputs, In: Computational and Applied Mathematics36(1)pp. 45-61 Springer

Two-stage data envelopment analysis (DEA) efficiency models identify the efficient frontier of a two-stage production process. In some two-stage processes, the inputs to the first stage are shared by the second stage, known as shared inputs. This paper proposes a new relational linear DEA model for dealing with measuring the efficiency score of two-stage processes with shared inputs under constant returns-to-scale assumption. Two case studies of banking industry and university operations are taken as two examples to illustrate the potential applications of the proposed approach.

Maziar Salahi, Mehdi Toloo (2017)In the determination of the most efficient decision making unit in data envelopment analysis: A comment, In: Computers and Industrial Engineering104pp. 216-218 Elsevier

In order to deal with the problem of finding the most efficient unit, Lam (2015) recently built a new integrated mixed integer linear programming model which is closely related to the super-efficiency model. The suggested model involves a non-Archimedean epsilon as the lower bound for the input and output weights. Selecting a suitable value for epsilon is a challenging issue in DEA (Data Envelopment Analysis). Lam (2015) suggested a value for epsilon which guarantees the feasibility of his model; however, this paper illustrates that the model may fail to find the most efficient unit due to an unsuitably selected value for epsilon. To cope with this issue, a new model is formulated which provides the maximum epsilon value for the model of Lam (2015). The built model guarantees that when epsilon is at its maximum, Lam's model gives exactly one DMU (Decision Making Unit) as the most efficient unit, with the maximum discrimination distance from the other DMUs.

Ales Kresta, Tomas Tichy, Mehdi Toloo (2017)Examination of Market Risk Estimation Models via DEA Approach Modelling, In: Politická Ekonomie65(2)pp. 161-178 Vysoka Skola Ekonomicka

Measuring and managing financial risks is an essential part of the management of financial institutions. Appropriate risk management should lead to an efficient allocation of available funds. Approaches based on the Value at Risk measure have been used as a means of measuring market risk since the late 20th century, although regulators now suggest applying the more complex Expected Shortfall method. While evaluating models for market risk estimation based on Value at Risk is relatively simple and involves a so-called backtesting procedure, in the case of Expected Shortfall we cannot apply a similar procedure. In this article we therefore focus on an alternative method for the comprehensive evaluation of VaR models at various significance levels by means of data envelopment analysis (DEA). This approach should lead to the adoption of a model which is also suitable in terms of the Expected Shortfall criterion. Based on the illustrative results from the US stock market, we conclude that the NIG model and historical simulation should be preferred to the normal distribution and GARCH models. We also recommend estimating the parameters over a period slightly shorter than two years.

Mehdi Toloo, Maryam Allahyar (2017)A selecting model under constant returns to scale in data envelopment analysis, In: R Nemec, L Chytilova (eds.), PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON STRATEGIC MANAGEMENT AND ITS SUPPORT BY INFORMATION SYSTEMS (SMSIS)pp. 350-357 Vsb-Tech Univ Ostrava

Data envelopment analysis (DEA) is a data-oriented mathematical programming approach that evaluates a set of peer decision making units (DMUs) dealing directly with the observed inputs and outputs (performance measures). Empirically, in order to have a logical assessment, there should be a balance between the number of performance measures and the number of DMUs. Accordingly, applying an appropriate method so that one can select some performance measures is very crucial for successful applications. In this paper, we suggest the envelopment form of selecting model under constant returns to scale (CRS) from both individual and aggregate points of view. We also show that applying these selecting models leads to the maximum discrimination between efficient units.

Emmanuel Kwasi Mensah, Mehdi Toloo (2017)Robust DEA for banks' efficiency under uncertainty, In: R Nemec, L Chytilova (eds.), PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON STRATEGIC MANAGEMENT AND ITS SUPPORT BY INFORMATION SYSTEMS (SMSIS)pp. 304-311 Vsb-Tech Univ Ostrava

Data envelopment analysis (DEA) has been the benchmark model for measuring the efficiency of banks over the years. However, inherent noise and uncertainties in the data are rarely considered, so the resulting efficiency scores may not be robust. The disadvantage is that a small perturbation in the uncertain parameters can render the efficient solutions highly infeasible. This paper introduces a robust DEA approach to the measurement of banks' efficiency. The proposed robust approach is based on the robust counterpart optimization of Ben-Tal & Nemirovski (2000), and it is implemented in the traditional DEA models germane to the performance measurement of banks. A preliminary result from data on banks in the Czech Republic indicates that efficiency scores measured with the robust DEA model provide a more reliable and stable performance measure than the standard DEA model.

Behrouz Arabi, Susila Munisamy, Ali Emrouznejad, Mehdi Toloo, Mohammad Sadegh Ghazizadeh (2016)Eco-efficiency considering the issue of heterogeneity among power plants, In: Energy111pp. 722-735 Elsevier

One of the main objectives in restructuring the power industry is enhancing the efficiency of power facilities. However, the power generation industry, which plays a key role in the power sector, has a noticeable share of emissions among all emission-generating sectors. In this study, we have developed some new Data Envelopment Analysis models to find efficient power plants based on less fuel consumption, combusting less polluting fuel types, and incorporating emission factors in order to measure the ecological efficiency trend. We then applied these models to measuring eco-efficiency during an eight-year period of power industry restructuring in Iran. Results reveal that there has been a significant improvement in the eco-efficiency, cost efficiency and allocative efficiency of the power plants during the restructuring period. It is also shown that, although the hydro power plants look eco-efficient, the combined cycle ones have been more allocatively efficient than the other power generation technologies used in Iran.

The strategic vendor selection problem (VSP) has been investigated in the purchasing literature during the last two decades. Indeed, senior purchasing managers always deal with such crucial decisions. Manufacturing managers in the global market are faced with challenging and complex tasks very similar to the VSP. Increasing outsourcing and the opportunities provided by the automotive industry to worldwide markets make these decisions even more complex. Various methodologies, from simple weighted scoring methods to complex mathematical programming models, have been introduced to tackle the VSP. Data envelopment analysis (DEA) is a non-parametric method in operations research and economics for evaluating the productive efficiency of decision-making units (DMUs). This study utilizes the approach proposed in Toloo and Ertay (2014) to develop a method for finding the most cost-efficient DMU when the prices are fixed and known. A case study of an automotive company located in Turkey is adapted from the literature to illustrate the potential application of the suggested approach.

Atefeh Masoumzadeh, Mehdi Toloo, Alireza Amirteimoori (2016)Performance assessment in production systems without explicit inputs: an application to basketball players, In: IMA Journal of Management Mathematics27(2)pp. 143-156 Oxford University Press

A new research issue in the context of production theory is production without explicit inputs. In such systems, input consumption is not important to the decision-maker and the focus is on output production. In the presence of desirable and undesirable outputs, modelling undesirable outputs is an important problem. This paper discusses the problem of weak disposability in the absence of explicit inputs. A linear production technology is constructed axiomatically to handle desirable and undesirable outputs in production systems without explicit inputs. A simple linear formulation of weak disposability in such systems is proposed that enables us to reduce undesirable production outputs.

Mehdi Toloo, Rahele Jalili (2016)LU Decomposition in DEA with an Application to Hospitals, In: Computational Economics47(3)pp. 473-488 Springer

A fundamental problem that usually appears in linear algebra is to find a vector x satisfying the linear system Ax = b. This linear system is encountered in many research applications and, more importantly, it is required to be solved in many contexts in applied mathematics. The LU decomposition method, based on Gaussian elimination, is particularly well suited for sparse and large-scale problems. Linear programming (LP) is a mathematical method for obtaining optimal solutions of a linear system and has been increasingly considered in various fields of study in recent decades. The simplex algorithm is one of the most widely used mathematical techniques for solving LP problems. Data envelopment analysis (DEA) is a non-parametric approach based on linear programming to evaluate the relative efficiency of decision making units (DMUs). The number of LP models that have to be solved in DEA is at least the same as the number of DMUs. Toloo et al. (Comput Econ 45(2):323-326, 2015) proposed an initial basic feasible solution for DEA models which in practice reduces the overall computations by at least 50%. The main contribution of this paper is in utilizing this solution to implement the LU decomposition technique on the basic DEA models, which is more accurate and numerically stable. It is shown that the number of computations in applying the Gaussian elimination method is noticeably reduced due to the special structure of basic DEA models. Potential uses are illustrated with an application to a hospital data set.
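The computational pattern the paper exploits, factorizing a basis matrix once and reusing the factors for many right-hand sides, is sketched below on a generic 3x3 system with SciPy; the hospital data and the DEA-specific basis structure are not reproduced.

    # LU factorization reused across several right-hand sides, the numerical
    # building block discussed in the paper (generic example, not the hospital data).
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[4.0, 3.0, 0.0],
                  [6.0, 3.0, 1.0],
                  [0.0, 2.0, 5.0]])
    lu, piv = lu_factor(A)                     # one O(n^3) factorization

    # In simplex-type computations the same basis matrix is solved against many
    # right-hand sides; each solve below is only O(n^2).
    for b in (np.array([1.0, 0.0, 0.0]),
              np.array([7.0, 10.0, 7.0]),
              np.array([0.0, 1.0, 2.0])):
        x = lu_solve((lu, piv), b)
        assert np.allclose(A @ x, b)
        print(x.round(6))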

Mehdi Toloo, Seddigheh Babaee (2015)On variable reductions in data envelopment analysis with an illustrative application to a gas company, In: Applied Mathematics and Computation270pp. 527-533 Elsevier

Data envelopment analysis (DEA) is a non-parametric data-oriented method for evaluating the relative efficiency of a number of decision making units (DMUs) based on pre-selected inputs and outputs. In some real DEA applications, a large number of inputs and outputs, in comparison with the number of DMUs, is a pitfall that could have a major influence on the efficiency scores. Recently, an approach was introduced which aggregates collected inputs and outputs in order to reduce the number of inputs and outputs iteratively. The purpose of this paper is to show that this approach has three drawbacks: instability due to the existence of an infinitesimal epsilon, an iterative procedure that can be improved to just one iteration, and the provision of non-radial inputs and outputs that then need to be captured. In order to illustrate the applicability of the improved approach, a real data set involving 14 large branches of the National Iranian Gas Company (NIGC) is utilized.

Amir Ebrahimi Zade, Sasan Barak, Hamidreza Maghsoudlou, Mehdi Toloo (2016)Multi-objective optimization for periodic preventive maintenance, In: Proceedings of the 2015 International Conference on Industrial Engineering and Systems Management (IESM 2015)pp. 173-182 Institute of Electrical and Electronics Engineers (IEEE)

This article investigates a JIT single machine scheduling problem with periodic preventive maintenance. In addition, to maintain product quality, there is a limit on the maximum number of allowable jobs in each period. The proposed bi-objective mixed integer model minimizes total earliness-tardiness and makespan simultaneously. Due to the computational complexity of the problem, a multi-objective particle swarm optimization (MOPSO) algorithm is implemented. In addition to MOPSO, two other optimization algorithms are used to compare the results. Finally, the Taguchi method with metrics analysis is presented to tune the algorithms' parameters, and a multiple criteria decision making (MCDM) technique based on the technique for order of preference by similarity to ideal solution (TOPSIS) is applied to choose the best algorithm. Comparison results confirm the superiority of MOPSO over the other algorithms.

Mehdi Toloo, Mona Barat (2015)On considering dual-role factor in supplier selection problem, In: Mathematical Methods of Operations Research82(1)pp. 107-122 Springer

Conventional data envelopment analysis evaluates the relative efficiency of a set of homogeneous decision making units (DMUs), where DMUs are evaluated in terms of a specified set of inputs and outputs. In some situations, however, a performance factor could serve as either an output or an input. These factors are referred to as dual-role factors. The presence of a dual-role factor among the performance factors gives rise to the issue of how to fairly designate the input/output status of such a factor. Several studies of both a methodological and an applied nature have been conducted on treating a dual-role factor. One approach taken to address this problem is to view the dual-role factor as being nondiscretionary and connect it to returns to scale concepts. It is argued that the idea of classifying a factor as an input or an output within a single model cannot consider the causality relationships between inputs and outputs. In this paper we present a mixed integer linear programming approach with the aim of dealing with the dual-role factor. A model structure is developed for finding the status of a dual-role factor via solving a single model while considering the causality relationships between inputs and outputs. It is shown that the new model can designate the status of a dual-role factor with half the calculations of the previous model. Both individual and aggregate points of view are suggested for deriving the most appropriate designation of the dual-role factor. A data set involving 18 supplier selections is adapted from the literature to illustrate the efficacy of the proposed models and compare the new approach with the previous ones.

Mehdi Toloo, Ameneh Zandi, Ali Emrouznejad (2015)Evaluation efficiency of large-scale data set with negative data: an artificial neural network approach, In: Journal of Supercomputing71(7)pp. 2397-2411 Springer

Data envelopment analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of decision-making units (DMUs). The need for huge computer resources in terms of memory and CPU time in DEA is inevitable for a large-scale data set, especially with negative measures. In recent years, a wide range of studies has been conducted on combined artificial neural network and DEA methods. In this study, a supervised feed-forward neural network is proposed to evaluate the efficiency and productivity of large-scale data sets with negative values, in contrast to the corresponding DEA method. Results indicate that the proposed network has some computational advantages over the corresponding DEA models; therefore, it can be considered a useful tool for measuring the efficiency of DMUs with (large-scale) negative data.
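A toy version of the general idea, training a feed-forward network to approximate efficiency scores from raw (possibly negative) measures, is sketched below with scikit-learn. The targets are a synthetic stand-in for precomputed DEA scores and the architecture is arbitrary; it is not the network proposed in the paper.

    # Toy surrogate: a feed-forward network mapping input/output measures to an
    # efficiency-like score. Targets are a synthetic stand-in for DEA results.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_dmus = 2000
    Z = rng.normal(size=(n_dmus, 4))                 # 2 inputs + 2 outputs, may be negative
    # stand-in "efficiency" target: a bounded nonlinear function of the measures
    target = 1.0 / (1.0 + np.exp(-(Z[:, 2] + Z[:, 3] - Z[:, 0] - Z[:, 1])))

    scaler = StandardScaler()
    Z_train, Z_test = scaler.fit_transform(Z[:1500]), scaler.transform(Z[1500:])

    net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
    net.fit(Z_train, target[:1500])
    pred = net.predict(Z_test)
    print("mean absolute error on held-out DMUs:",
          round(float(np.abs(pred - target[1500:]).mean()), 4))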

Esmail Keshavarz, Mehdi Toloo (2015)Efficiency status of a feasible solution in the Multi-Objective Integer Linear Programming problems: A DEA methodology, In: Applied Mathematical Modelling39(12)pp. 3236-3247 Elsevier

Efficient solutions of Multi-Objective Integer Linear Programming (MOILP) problems are categorized into two distinct types, supported and non-supported. Many researchers have derived conditions for determining whether a feasible solution is efficient; however, no attempt has been made to identify the efficiency status of a given efficient solution, i.e. whether it is supported or non-supported. In this paper, we first establish the relationships between Data Envelopment Analysis (DEA) and MOILP and then design two practical procedures: the first specifies whether or not an arbitrary feasible solution is efficient, while the second, the main aim of this study, determines the efficiency status of an efficient solution. Finally, as a contribution of the suggested approach, we illustrate a drawback of the methodology of Chen and Lu (2007), which was developed for solving an extended assignment problem.
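
For reference, the standard distinction, stated here in generic MOILP notation rather than the paper's own, is the following. An efficient solution x* of max { Cx : x in X, x integer } is supported if there exists a weight vector \lambda > 0 with

\lambda^{\top} C x^{*} \;\ge\; \lambda^{\top} C x \quad \text{for every feasible } x;

otherwise it is non-supported: its objective vector is dominated by a convex combination of other feasible objective vectors, so no weighted-sum scalarization can reach it.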

You et al. (2013) indicated two errors in Amin and Toloo (2007): first, that the model of Amin and Toloo (2007) is infeasible and, second, that their approach lacks a suitable value for the non-Archimedean epsilon. This paper addresses the raised issues and proves that the model of Amin and Toloo (2007) is always feasible. In addition, we formulate a new model for finding a suitable value for the epsilon.

Mehdi Toloo, Tomas Tichy (2015)Two alternative approaches for selecting performance measures in data envelopment analysis, In: Measurement65pp. 29-40 Elsevier

Data envelopment analysis seeks a frontier that envelops all the data, with the data themselves playing a critical role in the process, and in this way measures the relative efficiency of each decision making unit in comparison with the other units. There is a statistical and empirical rule that if the number of performance measures is large relative to the number of units, then a large percentage of the units will be classified as efficient, which is clearly a questionable result. This implies that the selection of performance measures is crucial for successful applications. In this paper, we extend both the multiplier and the envelopment forms of data envelopment analysis models and propose two alternative approaches for selecting performance measures under variable returns to scale. The multiplier form of the selection model leads to the maximum efficiency scores, while the maximum discrimination between efficient units is achieved by applying the envelopment form. Individual-unit and aggregate models are also formulated separately to develop the idea of selective measures. Finally, a case study using data from the banking industry in the Czech Republic illustrates the potential of the proposed approaches.

Mehdi Toloo, Reza Farzipoor Saen, Majid Azadi (2015)Obviating some of the theoretical barriers of data envelopment analysis-discriminant analysis: an application in predicting cluster membership of customers, In: The Journal of the Operational Research Society66(4)pp. 674-683 Taylor & Francis

Data envelopment analysis-discriminant analysis (DEA-DA) has been used for predicting cluster membership of decision-making units (DMUs). One possible application of DEA-DA is in marketing research. This paper uses cluster analysis to cluster customers into two clusters, Gold and Lead, and DEA-DA is then applied to predict the cluster membership of new customers. DEA-DA incorporates an arbitrary parameter η that imposes a small gap between the two clusters. It is shown that different values of η lead to different prediction accuracy levels, since an unsuitable value of η leads to incorrect classification of DMUs. We show that even a data set with no overlap between the two clusters can be misclassified. The aim of this paper is to illustrate some computational difficulties in previous DEA-DA approaches and then to propose a new DEA-DA model that overcomes them. A case study demonstrates the efficacy of the proposed model.

Mehdi Toloo, Mona Barat, Atefeh Masoumzadeh (2015)Selective measures in data envelopment analysis, In: Annals of Operations Research226(1)pp. 623-642 Springer

Data envelopment analysis (DEA) is a data-based mathematical approach that handles large numbers of variables, constraints, and data; hence, data play an important and critical role in DEA. Given a set of decision making units (DMUs) and identified inputs and outputs (performance measures), DEA evaluates each DMU in comparison with all DMUs. According to some statistical and empirical rules, there should be a balance between the number of DMUs and the number of performance measures. However, in some situations the number of performance measures is relatively large in comparison with the number of DMUs. These cases lead us to choose some inputs and outputs in a way that produces acceptable results; we refer to the selected inputs and outputs as selective measures. This paper presents an approach for handling a large number of inputs and outputs. Individual-DMU and aggregate models are recommended and developed separately to elaborate the idea of selective measures. The practical aspect of the new approach is illustrated with two real data set applications.

Mehdi Toloo (2015)Alternative minimax model for finding the most efficient unit in data envelopment analysis, In: Computers and Industrial Engineering81pp. 186-194 Elsevier

Data envelopment analysis (DEA) evaluates the efficiency scores of peer decision making units (DMUs) and divides them into two mutually exclusive sets: efficient and inefficient. Various ranking methods exist to obtain more information about the efficient units. Nevertheless, finding the most efficient unit is a scientific challenge and hence has been the subject of numerous studies. Here, the main contribution is the development of an integrated model able to determine the most efficient unit under a common condition. The current research formulates a new minimax mixed integer linear programming (MILP) model for finding the most efficient DMU. Three case studies from different contexts are taken as numerical examples to compare the proposed model with other methods; these examples also illustrate the various potential applications of the suggested model.

Mehdi Toloo, Atefeh Masoumzadeh, Mona Barat (2015)Finding an Initial Basic Feasible Solution for DEA Models with an Application on Bank Industry, In: Computational Economics45(2)pp. 323-336 Springer Nature

Nowadays, algorithms and computer programs that run faster and occupy less memory are of special importance, and researchers have always sought suitable strategies and algorithms requiring the fewest computations. Since linear programming (LP) was introduced, interest in it has spread rapidly among scientists. The simplex method was developed to solve LPs, and many researchers have since contributed to the extension and progression of LP and, naturally, of the simplex method; a vast literature has grown out of this original method, spanning mathematical theory, new algorithms, and applications. Solving an LP via the simplex method requires an initial basic feasible solution (IBFS), but in many situations such a solution is not readily available, so artificial variables must be introduced and then driven to zero, if possible. There are two main methods for eliminating artificial variables: the two-phase method and the Big-M method. Data envelopment analysis (DEA) solves an individual LP for each decision making unit, so an IBFS must be available for each of these LPs. The main contribution of this paper is a closed form of the IBFS for conventional DEA models, which avoids dealing with artificial variables directly. We apply the proposed form to a real data set to illustrate the applicability of the new approach. The results of this study indicate that using the closed-form IBFS can reduce the overall computations by at least 50%.
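
To see why such a ready-made starting point can exist, consider the input-oriented CCR envelopment LP for the unit under evaluation, DMU_o (generic textbook notation; the paper states its closed form for its own model family):

\min\ \theta
\text{s.t.}\quad \sum_{j=1}^{n} \lambda_j x_{ij} + s_i^{-} = \theta\, x_{io}, \qquad i = 1,\dots,m,
\qquad \sum_{j=1}^{n} \lambda_j y_{rj} - s_r^{+} = y_{ro}, \qquad r = 1,\dots,s,
\qquad \lambda_j,\ s_i^{-},\ s_r^{+} \ge 0.

The point \theta = 1, \lambda_o = 1, \lambda_j = 0 (j \ne o), s^{-} = s^{+} = 0 satisfies every constraint, so a feasible starting point is available without artificial variables; exploiting starting points of this kind in closed basic form is what removes the Big-M or two-phase step.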

Sasan Barak, Mehdi Toloo (2015)Prioritizing strategies and ranking executive methods by QFD and fuzzy QSPM-Gap analysis, In: R Nemec, F Zapletal (eds.), Proceedings of the 11th International Conference on Strategic Management and its Support by Information Systemspp. 167-175 Technical University of Ostrava, Faculty of Economics

The business environment is changing rapidly, and the globalization of organizations has made them more complex. Organizations should therefore codify their strategic plans and executive methods more accurately. However, some executive methods do not properly fulfil the organization's strategic priorities. This paper proposes a comprehensive framework for evaluating and prioritizing strategies and ranking executive methods. First, the strategic plans are developed with a SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis; the plans are then weighted and narrowed down using an FQSPM-Gap (Fuzzy Quantitative Strategic Planning Matrix) model. Finally, the executive methods of the company are prioritized with a QFD (Quality Function Deployment) matrix so that it can accomplish its strategic plans. The model is implemented in a textile and clothing company.

Esmaiel Keshavarz, Mehdi Toloo (2014)Finding efficient assignments: An innovative DEA approach, In: Measurement58pp. 448-458 Elsevier

Finding and classifying all efficient assignments for a Multi-Criteria Assignment Problem (MCAP) is one of the challenging issues in Multi-Criteria Decision Making (MCDM). The main aim of this study is to utilize Data Envelopment Analysis (DEA) methodology to tackle this issue. Toward this end, we first state and prove some theorems that clarify the relationships between DEA and MCAP, and we then design a new two-phase approach to find and classify a set of efficient assignments. In Phase I, we formulate a new Mixed Integer Linear Programming (MILP) model, based on the additive Free Disposal Hull (FDH) model, to obtain an efficient assignment and then extend it to determine a Minimal Complete Set (MCS) of efficient assignments. In Phase II, we use the BCC model to classify all efficient solutions obtained from Phase I as supported or non-supported. A 4 x 4 assignment problem, with two cost-type objective functions and a single profit-type objective function, is solved using the presented approach.

Esmaiel Keshavarz, Mehdi Toloo (2015)Solving the Bi-Objective Integer Programming: A DEA methodology, In: 2014 International Conference on Control, Decision and Information Technologies (CoDIT 2014)pp. 060-064 Institute of Electrical and Electronics Engineers (IEEE)

Finding and classifying all efficient solutions of a Bi-Objective Integer Linear Programming (BOILP) problem is one of the challenging issues in Multi-Criteria Decision Making. The main aim of this study is to utilize the well-known Data Envelopment Analysis (DEA) methodology to tackle this issue. Toward this end, we first state some propositions that clarify the relationships between the efficient solutions of a BOILP and efficient Decision Making Units (DMUs) in DEA, and we then design a new two-stage approach to find and classify a set of efficient solutions. Stage I formulates a two-phase Mixed Integer Linear Programming (MILP) model, based on the Free Disposal Hull (FDH) model in DEA, to obtain a Minimal Complete Set of efficient solutions. Stage II uses a variable returns to scale DEA model to classify the efficient solutions obtained in Stage I as supported or non-supported. A BOILP model containing 6 integer variables and 4 constraints is solved as an example to illustrate the applicability of the proposed approach.

Mehdi Toloo (2014)The role of non-Archimedean epsilon in finding the most efficient unit: With an application of professional tennis players, In: Applied Mathematical Modelling38(21-22)pp. 5334-5346 Elsevier

In some situations, decision makers wish to single out one efficient decision making unit (DMU) as the most efficient unit. Several integrated mixed integer linear programming (MILP) and mixed integer nonlinear programming (MINLP) data envelopment analysis (DEA) models have been proposed to find a single efficient unit through an optimal common set of weights. In conventional DEA models, the non-Archimedean infinitesimal epsilon, which prevents the weights from becoming zero, is unnecessary if one uses the well-known two-phase method. Nevertheless, this approach is inapplicable to integrated DEA models. Unfortunately, in some proposed integrated DEA models the epsilon is neither considered nor determined; more importantly, further approaches have been developed on the basis of these models and thus inherit this drawback. In this paper, these drawbacks are first discussed. Indeed, it is shown that if the non-Archimedean epsilon is ignored, these models can neither find the most efficient unit nor rank the extreme efficient units. Next, we formulate new models that overcome these drawbacks and hence attain assurance regions. Finally, a real data set of 53 professional tennis players is used to illustrate the applicability of the suggested models.
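
For context, the standard place where the non-Archimedean epsilon enters is as a lower bound on the multipliers of the input-oriented CCR multiplier model; this is the textbook form, shown only to locate epsilon, not the integrated models of the paper:

\max\ \sum_{r=1}^{s} u_r y_{ro}
\text{s.t.}\quad \sum_{i=1}^{m} v_i x_{io} = 1,
\qquad \sum_{r=1}^{s} u_r y_{rj} - \sum_{i=1}^{m} v_i x_{ij} \le 0, \qquad j = 1,\dots,n,
\qquad u_r \ge \varepsilon, \quad v_i \ge \varepsilon.

In a stand-alone model the two-phase method makes an explicit value of \varepsilon unnecessary, but once binary selection variables tie all units to one common set of weights, a concrete (assurance) value of \varepsilon must be supplied, which is the issue examined above.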

Mehdi Toloo, Ales Kresta (2014)Finding the best asset financing alternative: A DEA-WEO approach, In: Measurement55pp. 288-294 Elsevier

Measurement of performance is an important activity for identifying weaknesses in managerial efficiency and devising goals for improvement. Data envelopment analysis (DEA) is a quantitative mathematical approach for measuring the performance of a set of similar units. Toloo (2013) extended a DEA approach for finding the most efficient unit for data sets without explicit inputs. The aim of this paper is to develop DEA models without explicit outputs, henceforth called DEA-WEO, to find the most efficient unit when outputs are not directly considered. The suggested models utilize the data directly, without the need to add a virtual output whose value is equal to one for all units. A real data set involving 139 alternatives for long-term asset financing provided by Czech banks and leasing companies illustrates the potential application of the proposed approach.

Mehdi Toloo (2014)An epsilon-free approach for finding the most efficient unit in DEA, In: Applied Mathematical Modelling38(13)pp. 3182-3192 Elsevier

Data envelopment analysis (DEA), considering the best condition for each decision making unit (DMU), assesses relative efficiency and partitions the DMUs into two sets: efficient and inefficient. In practice, traditional DEA models recognize more than one efficient DMU and cannot rank the efficient units. Some studies aim at ranking efficient DMUs, although in some cases only identifying the most efficient unit is desired, and several investigations have addressed finding the most CCR-efficient DMU. The basic idea of most of them is to introduce an integrated model that achieves an optimal common set of weights (CSW); these weights make it possible to identify the most efficient unit under identical conditions. Recently, Toloo (2012) [13] proposed a new mixed integer programming (MIP) model to find the most BCC-efficient unit. Building on this study, we propose a new basic integrated linear programming (LP) model that identifies candidate DMUs for being the most efficient unit; a new integrated MIP DEA model is then introduced for determining the most efficient DMU. Moreover, these models exclude the non-Archimedean epsilon, and consequently their optimal solutions can be obtained straightforwardly. We claim that the most efficient unit obtained from any other integrated model must be one of the candidates produced by the basic integrated LP model. Two numerical examples illustrate the use of these models in different important cases.

Mehdi Toloo (2014)Notes on classifying inputs and outputs in data envelopment analysis: A comment, In: European Journal of Operational Research235(3)pp. 810-812 Elsevier

Cook and Zhu (2007) introduced an innovative method to deal with flexible measures. Toloo (2009) found a computational problem in their approach and addressed it. Amirteimoori and Emrouznejad (2012) claimed that both the Cook and Zhu (2007) and Toloo (2009) models overestimate efficiency. In this response, we prove that their claim is incorrect and that there is no overestimation in these approaches.

Mehdi Toloo, Tijen Ertay (2014)The most cost efficient automotive vendor with price uncertainty: A new DEA approach, In: Measurement52(1)pp. 135-144 Elsevier

Evaluating vendors' performance is an important subject with strategic implications for managing an efficient company. There are many important criteria for a prosperous company, and these criteria may conflict with one another: improving one criterion may worsen another. Like the manufacturing manager in a global market, the purchasing manager, whose decisions have significant practical implications, must deal with this issue. The vendor selection problem (VSP) is clearly affected by complexity and uncertainty, owing to the lack of information on the business environments of countries in a global market. In the automotive industry, which plays an important role in the worldwide market, these decisions are further complicated by increasing outsourcing and opportunities. A variety of techniques, from simple weighted scoring methods to complex mathematical programming, exist for handling the VSP. In this study, we propose a new cost efficiency data envelopment analysis (CE-DEA) approach with price uncertainty for finding the most cost efficient unit. Potential uses are then illustrated with an application to the automotive industry involving 73 vendors in Turkey.

Mehdi Toloo (2014)Selecting and full ranking suppliers with imprecise data: A new DEA method, In: International Journal of Advanced Manufacturing Technology74(5-8)pp. 1141-1148 Springer

Supplier selection, a multi-criteria decision making (MCDM) problem, is one of the most important strategic issues in supply chain management (SCM). A good solution to this problem contributes significantly to overall supply chain performance. This paper proposes a new integrated mixed integer programming–data envelopment analysis (MIP–DEA) model for finding the most efficient suppliers in the presence of imprecise data. Using this model, a new method for fully ranking the units is introduced. This method tackles some drawbacks of previous methods and is computationally more efficient. The applicability of the proposed model is illustrated, and the results and performance are compared with previous studies.

Mehdi Toloo (2013)The most efficient unit without explicit inputs: An extended MILP-DEA model, In: Measurement : journal of the International Measurement Confederation46(9)pp. 3628-3634 Elsevier

Data envelopment analysis (DEA) is a very popular method for measuring and benchmarking the relative efficiency of decision making units (DMUs) with multiple inputs and multiple outputs. DEA and discriminant analysis (DA) are similar in that both classify units as exhibiting either good or poor performance. On the other hand, selecting the most efficient unit from among several efficient ones is one of the main issues in multi-criteria decision making (MCDM). Several authors have suggested approaches and claimed that their methodologies have the discriminating power to determine the most efficient DMU without explicit inputs. This paper focuses on a weakness of a recent methodology of this kind and, to avoid this drawback, presents a mixed integer programming (MIP) approach. To illustrate the drawback and compare the discriminating power of the recent methodology with our new approach, a real data set of 40 professional tennis players is utilized.

Mehdi Toloo (2012)On finding the most BCC-efficient DMU: A new integrated MIP–DEA model, In: Applied Mathematical Modelling36(11)pp. 5515-5520 Elsevier

This paper proposes a new integrated mixed integer programming – data envelopment analysis (MIP–DEA) model that improves the integrated DEA model introduced by Toloo & Nalchigar [M. Toloo, S. Nalchigar, A new integrated DEA model for finding most BCC–efficient DMU, Appl. Math. Model. 33 (2009) 597–604]. In this study, some problems in applying Toloo & Nalchigar's model are addressed. A new integrated MIP–DEA model is then introduced to determine the most BCC-efficient decision making unit (DMU). Moreover, it is proved mathematically that the new model identifies a single BCC-efficient DMU by a common set of optimal weights. To show the applicability of the proposed models, a numerical example containing a real data set of nineteen facility layout designs (FLDs) is used.

Mehdi Toloo (2012)Alternative solutions for classifying inputs and outputs in data envelopment analysis, In: Computers and Mathematics with Applications63(6)pp. 1104-1110 Elsevier

In conventional data envelopment analysis (DEA) models, the status of a performance measure, whether it is an input or an output, usually has to be known. Nevertheless, in some cases the type of a performance measure is not clear, and several models have been introduced to accommodate such flexible measures. In this paper, it is shown that the alternative optimal solutions of these models have to be considered when dealing with flexible measures, otherwise incorrect results may occur. In practice, the efficiency score of a DMU can be the same whether the flexible measure is considered an input or an output; such cases are specifically referred to as share cases in this study. It is shown that share cases must not be taken into account when classifying inputs and outputs. A new mixed integer linear programming (MILP) model is proposed to overcome the problem of ignoring the alternative optimal solutions of the classifier models. Finally, the applicability of the proposed model is illustrated with a real data set.

Mehdi Toloo, Soroosh Nalchigar (2011)A new DEA method for supplier selection in presence of both cardinal and ordinal data, In: Expert Systems with Applications38(12)pp. 14726-14731 Elsevier

The success of a supply chain is highly dependent on the selection of the best suppliers, and these decisions are an important component of production and logistics management for many firms. Little attention has been given in the literature to the simultaneous consideration of cardinal and ordinal data in the supplier selection process. This paper proposes a new integrated data envelopment analysis (DEA) model that is able to identify the most efficient supplier in the presence of both cardinal and ordinal data. Utilizing this model, an innovative method for prioritizing suppliers by considering multiple criteria is then proposed. As an advantage, our method identifies the best supplier by solving only one mixed integer linear programming (MILP) problem. The applicability of the proposed method is demonstrated using a data set with the specifications of 18 suppliers.

Gholam R. Amin, Mehdi Toloo, M. Sheikhan (2010)Input and output scaling in advanced manufacturing technology: theory and application, In: International Journal of Advanced Manufacturing Technology50(50)pp. 1235-1241 Springer Nature

This paper suggests new data envelopment analysis (DEA) models for input and output scaling in advanced manufacturing technology (AMT). With traditional DEA models, it is not possible to evaluate a given group of AMT observations when a specified input (or output) is required to be scaled for all units. The paper provides theoretical results relating the original AMT observations to the corresponding scaled data. Numerical illustrations are also used to show the usefulness of the suggested contribution.

Mehdi Toloo, Babak Sohrabi, Soroosh Nalchigar (2009)A new method for ranking discovered rules from data mining by DEA, In: Expert systems with applications36(4)pp. 8503-8508 Elsevier

Data mining techniques, which extract patterns from large databases, have become widespread in business. Using these techniques, various rules may be obtained, yet only a small number of them may be selected for implementation due, at least in part, to limitations of budget and resources. Evaluating and ranking the interestingness or usefulness of association rules is therefore important in data mining. This paper proposes a new integrated data envelopment analysis (DEA) model that is able to find the most efficient association rule by solving only one mixed integer linear programming (MILP) problem. Utilizing this model, a new method for prioritizing association rules by considering multiple criteria is then proposed. As an advantage, the proposed method is computationally more efficient than previous works. Using a market basket analysis example, the applicability of our DEA-based method for measuring the efficiency of association rules with multiple criteria is illustrated.
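
The rule-interestingness criteria that a DEA ranking of this kind typically consumes (support, confidence, lift and the like) can be computed as below; the transactions and the particular choice of measures are illustrative assumptions, not the paper's exact criteria.

# Hypothetical market-basket transactions and a candidate rule {bread} -> {butter}.
transactions = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"bread", "jam"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

def rule_measures(antecedent, consequent, transactions):
    # Return (support, confidence, lift) of the rule antecedent -> consequent.
    n = len(transactions)
    n_ant = sum(antecedent <= t for t in transactions)
    n_con = sum(consequent <= t for t in transactions)
    n_both = sum((antecedent | consequent) <= t for t in transactions)
    support = n_both / n
    confidence = n_both / n_ant
    lift = confidence / (n_con / n)
    return support, confidence, lift

print(rule_measures({"bread"}, {"butter"}, transactions))
# Measures such as these enter the DEA model as the outputs of each rule (the DMU).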

Mehdi Toloo (2009)On classifying inputs and outputs in DEA: A revised model, In: European Journal of Operational Research198(1)pp. 358-360 Elsevier

Cook and Zhu [Cook, W.D., Zhu, J., 2007. Classifying inputs and outputs in data envelopment analysis. European Journal of Operational Research 180, 692–699] introduced a new method to determine whether a measure is an input or an output. In practice, however, their method may produce incorrect efficiency scores because of a computational problem caused by introducing a large positive number into the model. This note introduces a revised model that does not need such a large positive number.

Mehdi Toloo, Samaneh Joshaghani (2009)Centralized additive model, In: CIE: 2009 INTERNATIONAL CONFERENCE ON COMPUTERS AND INDUSTRIAL ENGINEERING, VOLS 1-3pp. 420-425 Institute of Electrical and Electronics Engineers (IEEE)

While conventional data envelopment analysis (DEA) models set targets separately for each decision making unit (DMU), Lozano and Villa (2004) introduced the concept of "centralized" DEA models, which aim at optimizing the combined resource consumption of all units in an organization rather than considering the consumption of each unit separately. In these models, a centralized decision maker (DM) supervises all DMUs, and the main aim is to optimize total input consumption and total output production. In this paper, we first present a centralized output production model. We then introduce a parametric centralized additive model which, in a single phase, simultaneously minimizes total input consumption and maximizes total output production in the direction of an optimization vector. Some numerical examples of the proposed models and their results are presented.

Mehdi Toloo, Maryam Shadab, Mahta Yekkalam (2009)Finding the most cost efficient DMU with certain and uncertain input prices, In: CIE: 2009 INTERNATIONAL CONFERENCE ON COMPUTERS AND INDUSTRIAL ENGINEERING, VOLS 1-3pp. 396-401 Institute of Electrical and Electronics Engineers (IEEE)

Cost efficiency (CE) assesses the ability to produce current outputs at minimal cost. Several models have been introduced to measure cost efficiency with certain and uncertain input prices. Normally, using data envelopment analysis (DEA) models, more than one cost efficient decision making unit (DMU) is recognized. The main contribution of this paper is the extension of the model proposed by Amin and Toloo (2007) to several models for finding the most cost efficient DMU under various input price situations. These models find the most cost efficient DMU by solving only one mixed integer linear programming (MILP) problem in each case.
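
For reference, the classical cost efficiency measure underlying such models, stated here for known input prices p_o (the paper's contribution concerns singling out the most cost efficient DMU, including under price uncertainty), is:

\min_{x,\ \lambda}\ \sum_{i=1}^{m} p_{io}\, x_i
\text{s.t.}\quad \sum_{j=1}^{n} \lambda_j x_{ij} \le x_i, \qquad i = 1,\dots,m,
\qquad \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro}, \qquad r = 1,\dots,s,
\qquad \lambda_j \ge 0,

with CE_o = \sum_i p_{io} x_i^{*} \big/ \sum_i p_{io} x_{io}; a DMU attaining CE_o = 1 is cost efficient.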

Mehdi Toloo, Soroosh Nalchigar (2009)A new integrated DEA model for finding most BCC-efficient DMU, In: Applied Mathematical Modelling33(1)pp. 597-604 Elsevier

In many applications of the widely recognized DEA technique, finding the most efficient DMU is desirable for the decision maker, yet basic DEA models do not enable the decision maker to identify it. Amin and Toloo [Gholam R. Amin, M. Toloo, Finding the most efficient DMUs in DEA: an improved integrated model. Comput. Ind. Eng. 52 (2007) 71–77] introduced an integrated DEA model for finding the most CCR-efficient DMU. In this paper, we propose a new integrated model for determining the most BCC-efficient DMU by solving only one linear programming (LP) problem. This model is useful for situations in which returns to scale are variable and therefore has a wider range of application than models that find the most CCR-efficient DMU. The applicability of the proposed integrated model is illustrated using a real data set from a case study consisting of 19 facility layout alternatives.
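
For orientation, the underlying BCC (variable returns to scale) multiplier model is the standard form below; the integrated formulation additionally couples all units through one common set of weights so that a single unit is singled out, which is the part specific to the paper.

\max\ \sum_{r=1}^{s} u_r y_{ro} - u_0
\text{s.t.}\quad \sum_{i=1}^{m} v_i x_{io} = 1,
\qquad \sum_{r=1}^{s} u_r y_{rj} - u_0 - \sum_{i=1}^{m} v_i x_{ij} \le 0, \qquad j = 1,\dots,n,
\qquad u_r,\ v_i \ge \varepsilon, \qquad u_0 \ \text{free},

where the free variable u_0 is what distinguishes variable from constant returns to scale (setting u_0 = 0 recovers the CCR model).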

Mehdi Toloo, Nazila Aghayi, Mohsen Rostamy-malkhalifeh (2008)Measuring overall profit efficiency with interval data, In: Applied Mathematics and Computation201(1-2)pp. 640-649 Elsevier

This paper presents a framework in which data envelopment analysis (DEA) is used to measure overall profit efficiency with interval data. Specifically, it is shown that when the inputs, outputs, and price vectors each vary within intervals, the DMUs cannot be easily evaluated. Thus, by presenting a new method for computing the efficiency of DMUs with interval data, an interval is defined for the efficiency score of each unit. Furthermore, all the DMUs are divided into three groups defined according to the interval obtained for the efficiency value of each DMU.

Gholam R. Amin, M. Toloo, B. Sohrabi (2006)An improved MCDM DEA model for technology selection, In: International journal of production research44(13)pp. 2681-2686 Taylor & Francis Group

This paper presents an improved MCDM data envelopment analysis (DEA) model for evaluating the best efficient DMUs in Advanced Manufacturing Technology (AMT). The model is capable of ranking the next most efficient DMUs after removing the previous best one.

Gholam R. Amin, Mehdi Toloo (2004)A polynomial-time algorithm for finding ε in DEA models, In: Computers & Operations Research31(5)pp. 803-805 Elsevier

This paper presents a new algorithm for computing the non-Archimedean ε in DEA models. It is shown that this algorithm runs in polynomial time, O(n), where n is the number of decision making units (DMUs). It is also proved that, using only the inputs and outputs of the DMUs, the non-Archimedean ε can be found such that the optimal values of all CCR models corresponding to the DMUs are bounded and an assurance value is obtained.

Siamak Talatahari, Mahdi Azizi, Mehdi Toloo, Milad Baghalzadeh Shishehgarkhaneh (2022)Optimization of Large-Scale Frame Structures Using Fuzzy Adaptive Quantum Inspired Charged System Search, In: International journal of steel structures22pp. 686-707 Korean Society of Steel Construction

In this paper, a metaheuristic-based design approach is developed for the structural design optimization of large-scale steel frame structures. Although academics have introduced form-dominant methods, the use of artificial intelligence in structural design has remained one of the most critical challenges of recent years. The Charged System Search (CSS) is utilized as the primary optimization approach and is improved by using the main principles of quantum mechanics and fuzzy logic systems. In the proposed Fuzzy Adaptive Quantum Inspired CSS algorithm, the position-updating procedure of the standard algorithm is developed by implementing the center of potential energy from quantum mechanics in the general formulation of CSS, to enhance the convergence capability of the algorithm. Simultaneously, a fuzzy logic-based parameter tuning process is conducted to enhance the exploitation and exploration rates of the standard optimization algorithm. Two steel frame structures of 10 and 60 stories, with 1026 and 8272 structural members respectively, are used as design examples to assess the performance of the developed algorithm on complex optimization problems. The overall capability of the presented approach is compared with the standard Charged System Search and other metaheuristic optimization algorithms; the proposed enhanced algorithm obtains better results than the other metaheuristics.

Additional publications