Recent Submissions

  • A classification and new benchmark instances for the multi-skilled resource-constrained project scheduling problem

    Snauwaert, Jakob; Vanhoucke, Mario (European Journal of Operational Research, 2022)
    This paper studies and analyses the multi-skilled resource-constrained project scheduling problem (MSRCPSP). We present a new classification scheme based on an existing classification scheme for project scheduling problems, which allows researchers to classify all multi-skilled project scheduling problems and their extensions. Furthermore, we propose a new data generation procedure for the MSRCPSP and introduce multiple artificial datasets for varying research purposes. The new datasets are generated based on new multi-skilled resource parameters and are compared to existing benchmark datasets in the literature. A set of 7 empirical multi-skilled project instances from software and railway construction companies is collected in order to validate the quality of the artificial datasets. Solutions are obtained through a genetic algorithm and by solving a mixed-integer linear programming formulation with CPLEX 12.6. The hardness of the multi-skilled project instances is investigated in the computational experiments. An experimental analysis studies the impact of skill availability, workforce size and multi-skilling on the makespan of the project.
  • Stuck between me: A psychodynamic view into career inaction

    Rogiers, Philip; Verbruggen, Marijke; D'Huyvetter, Paulien; Abraham, Elisabeth (Journal of Vocational Behavior, 2022)
    We all know people who want to make a change in their careers but do not act on this desire. Yet this phenomenon, recently labeled “career inaction” (Verbruggen & De Vos, 2020), has received almost no research attention to date. To address this gap and enrich our understanding of career inaction, this paper explores the lived experiences of 43 individuals characterized by inaction. Employing a qualitative research design and informed by the broader literature on psychodynamics, we find that people's experience of inaction is emotionally tense and situated in the interaction of three psychodynamic “me”-identifications: the “striving me,” the “comfortable me,” and the “uncertain me.” Our study further identifies various tension-easing strategies that help people ease the psychological strain of career inaction, even though their inaction often continued. Altogether, our study enriches and extends extant theorizing on career inaction and calls for a renewed focus on bounded rationality and emotionality in contemporary careers.
  • An efficient genetic programming approach to design priority rules for resource-constrained project scheduling problem

    Luo, Jingyu; Vanhoucke, Mario; Coelho, José; Guo, Weikang (Expert Systems with Applications, 2022)
    In recent years, machine learning techniques, especially genetic programming (GP), have become a powerful approach for the automated design of priority-rule heuristics for the resource-constrained project scheduling problem (RCPSP). However, GP requires intensive computing effort, carefully selected training data and appropriate assessment criteria. This research proposes a GP hyper-heuristic method with a duplicate removal technique to create new priority rules that outperform the traditional rules. The experiments verify the efficiency of the proposed algorithm compared to the standard GP approach. Furthermore, the impact of training data selection and fitness evaluation has also been investigated. The results show that a compact training set can provide good output and that existing evaluation methods are all usable for evolving efficient priority rules. The priority rules designed by the proposed approach are tested on extensive existing datasets and on newly generated large projects with more than 1,000 activities. To achieve better performance on small-sized projects, we also develop a method to combine rules into efficient ensembles. Computational comparisons between GP-designed rules and traditional priority rules indicate the superiority and generalization capability of the proposed GP algorithm in solving the RCPSP.
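As background to the abstract above: a priority rule in the RCPSP ranks activities, and a schedule generation scheme turns that ranking into a schedule. The sketch below is a minimal serial schedule generation scheme driven by a hand-written "shortest processing time" rule; the instance data, the capacity value, and the rule itself are illustrative, not the paper's GP-evolved rules.

```python
# Minimal serial schedule generation scheme (SGS) for a toy RCPSP instance
# with one renewable resource. A priority rule decides which eligible
# activity to schedule next; here we use shortest processing time (SPT).

def serial_sgs(durations, demands, preds, capacity, priority):
    """Schedule activities one by one in priority order, each at the earliest
    start that respects precedence and the renewable-resource capacity."""
    acts = sorted(durations, key=priority)
    start, usage = {}, {}              # usage[t] = resource units busy in period t
    while len(start) < len(durations):
        # highest-priority activity whose predecessors are all scheduled
        a = next(x for x in acts if x not in start
                 and all(p in start for p in preds.get(x, [])))
        t = max((start[p] + durations[p] for p in preds.get(a, [])), default=0)
        # shift right until capacity suffices over [t, t + duration)
        while any(usage.get(u, 0) + demands[a] > capacity
                  for u in range(t, t + durations[a])):
            t += 1
        start[a] = t
        for u in range(t, t + durations[a]):
            usage[u] = usage.get(u, 0) + demands[a]
    return start

durations = {"A": 2, "B": 3, "C": 2, "D": 1}
demands   = {"A": 2, "B": 2, "C": 1, "D": 2}
preds     = {"C": ["A"], "D": ["B"]}
start = serial_sgs(durations, demands, preds, capacity=3,
                   priority=lambda a: durations[a])   # SPT rule
makespan = max(start[a] + durations[a] for a in start)
```

A GP hyper-heuristic, loosely speaking, searches over expressions to replace the `priority` lambda, evaluating each candidate by the makespans it produces on training instances.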
  • Mathematical formulations for project scheduling problems with categorical and hierarchical skills

    Snauwaert, Jakob; Vanhoucke, Mario (Computers & Industrial Engineering, 2022)
    In this paper, we present six extensions to the multi-skilled resource-constrained project scheduling problem (MSRCPSP) by introducing hierarchical levels of skills. These hierarchical skills can impact the MSRCPSP in multiple different ways. This paper studies efficiency differences, cost differences, quality differences and more. For each of these problems we propose and analyse seven continuous and time-indexed (mixed-)integer linear programming formulations. A modular artificial dataset is generated that assembles instances of the presented problems as well as combinations of these problems. In the computational experiments, we solve these instances using the proposed mathematical formulations with the CPLEX solver. Finally, we compare the results of the different formulations for the resource-constrained project scheduling problems with hierarchical levels of skills in order to explain their inherent similarities and differences.
  • Smart metering interoperability issues and solutions: Taking inspiration from other ecosystems and sectors

    Reif, Valerie; Meeus, Leonardo (Utilities Policy, 2022)
    Interoperability in the context of smart electricity metering is high on the European policy agenda, but its essence has been challenging to capture. This paper looks at experiences in other ecosystems (electromobility and buildings), in other sectors (healthcare and public administration), and at the national level in the Netherlands and the UK. We show that the definition of interoperability depends on the context, that there are common solutions to different issues across sectors and that cross-sectoral factors must be increasingly considered. We recommend adopting a broader view in smart metering beyond the interoperability of devices, considering solutions that have worked in other sectors and exploiting synergies across sectors. Our analysis of experiences provides a comparison that can help move the debate at the EU level forward.
  • A reduction tree approach for the discrete time/cost trade-off problem

    Van Eynde, Rob; Vanhoucke, Mario (Computers & Operations Research, 2022)
    The discrete time/cost trade-off problem is a well-studied problem in the project scheduling literature. Each activity has multiple execution modes, and a solution is obtained by selecting a mode for each activity. In this manuscript we propose an exact algorithm to obtain the complete curve of non-dominated time/cost alternatives for the project. Our algorithm is based on the network reduction approach, in which the project is reduced to a single activity. We develop the reduction tree, a new data structure that tracks the modular decomposition structure of an instance at each iteration of the reduction sequence, and show how it is related to the complexity graph of the instance. Several exact and heuristic algorithms to construct a good reduction tree are proposed. Our computational experiments show that the use of the reduction tree provides significant speedups compared to the existing reduction plan approach. Although the new approach does not outperform the best-performing branch-and-bound procedure from the literature, the experiments show that incorporating modular decomposition can provide significant performance improvements for solution algorithms, showing potential for developing improved hybridized procedures to solve this challenging problem type.
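For readers unfamiliar with the network-reduction idea the abstract builds on, this toy sketch shows its core step: merging the non-dominated (time, cost) mode sets of two activities that are in series or in parallel. The mode data are invented, and the paper's reduction tree handles far more general structures than these two compositions.

```python
# Series/parallel reduction of (duration, cost) mode sets for the discrete
# time/cost trade-off problem, keeping only Pareto-optimal combinations.

def pareto(modes):
    """Keep only non-dominated (time, cost) pairs."""
    modes = sorted(set(modes))                 # by time, then cost
    front, best_cost = [], float("inf")
    for t, c in modes:
        if c < best_cost:                      # strictly cheaper than all faster modes
            front.append((t, c))
            best_cost = c
    return front

def series(a, b):    # activities in sequence: durations add, costs add
    return pareto([(ta + tb, ca + cb) for ta, ca in a for tb, cb in b])

def parallel(a, b):  # activities side by side: duration is the max, costs add
    return pareto([(max(ta, tb), ca + cb) for ta, ca in a for tb, cb in b])

A = [(2, 10), (4, 6)]          # activity modes: (duration, cost)
B = [(3, 8), (5, 5)]
curve = series(A, B)           # non-dominated curve of the merged activity
```

Repeating such merges until one activity remains yields the project's complete time/cost curve for series-parallel networks; the reduction tree extends this idea via modular decomposition.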
  • Lending when relationships are scarce: The role of information spread via bank networks

    Alperovych, Yan; Divakaruni, Anantha; Manigart, Sophie (Journal of Corporate Finance, 2022)
    We investigate how information flows within bank networks facilitate syndicate formation and lending in the leveraged buyout (LBO) market, where relationships between banks and borrowers are scarce and borrower opacity is high. Using novel measures that characterize a bank’s ability to source and disseminate information within its loan syndication network, we show that the extent of this capability influences which banks join the syndicate, the share the lead bank holds, and LBO borrowing terms. Banks’ ability to source and disseminate network-based information is particularly useful when ties to prospective borrowers are lacking, with the information flows extending beyond knowledge of private equity (PE) firms and LBO targets.
  • A joint replenishment production-inventory model as an MMAP[K]/PH[K]/1 queue

    Noblesse, Ann M.; Sonenberg, Nikki; Boute, Robert; Lambrecht, Marc R.; Van Houdt, Benny (Stochastic Models, 2022)
    In this paper we analyse a continuous review finite capacity production-inventory system with two products in inventory. With stochastic order quantities and times between orders, the model reflects a supply chain that operates in an environment with high levels of volatility. The inventory is replenished using an independent order-up-to (s, S) policy or a can-order (s, c, S) joint replenishment policy in which the endogenously determined lead times drive the parameters of the replenishment policy. The production facility is modelled as a multi-type MMAP[K]/PH[K]/1 queue in which there are K possible inventory positions when an order is placed and the age process of the busy queue has a matrix-exponential distribution. We characterize the system and determine the steady-state distribution using matrix analytic methods. Using numerical methods, we obtain the inventory parameters that minimize the total ordering and inventory related costs. We present numerical comparisons of independent and joint replenishment policies with varying lead times, order quantities, and cost reductions. We further demonstrate the interplay between the two products in terms of lead times, order quantities and costs.
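The can-order (s, c, S) logic mentioned in the abstract can be stated compactly: when any item's inventory position drops to its must-order level s, every item at or below its can-order level c joins the replenishment and is raised to its order-up-to level S. The parameter values below are illustrative only; the paper's contribution is the queueing analysis around this rule, not the rule itself.

```python
# Ordering decision of a two-item can-order (s, c, S) joint replenishment
# policy. Items piggyback on each other's orders to share fixed order costs.

def can_order_decision(position, s, c, S):
    """Return the order quantity per item under an (s, c, S) policy."""
    trigger = any(position[i] <= s[i] for i in position)   # some item must order
    if not trigger:
        return {i: 0 for i in position}
    return {i: S[i] - position[i] if position[i] <= c[i] else 0
            for i in position}

pos = {"A": 2, "B": 6}
s   = {"A": 3, "B": 2}     # must-order levels
c   = {"A": 5, "B": 7}     # can-order levels (c >= s)
S   = {"A": 10, "B": 12}   # order-up-to levels
order = can_order_decision(pos, s, c, S)   # A triggers; B joins since 6 <= 7
```

Setting each c[i] equal to s[i] recovers independent (s, S) control, which is the comparison the abstract's numerical experiments make.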
  • A resampling method to improve the prognostic model of end-stage kidney disease: A better strategy for imbalanced data

    Shi, Xi; Qu, Tingyu; Van Pottelbergh, Gijs; van den Akker, Marjan; De Moor, Bart (Frontiers in Medicine, 2022)
    Background: Prognostic models can help to identify patients at risk of end-stage kidney disease (ESKD) at an earlier stage so that preventive medical interventions can be provided. Previous studies mostly applied the Cox proportional hazards model. The aim of this study is to present a resampling method that can deal with the imbalanced data structure underlying the prognostic model and help to improve predictive performance. Methods: The electronic health records of patients with chronic kidney disease (CKD) older than 50 years, collected from primary care in Belgium during 2005–2015, were used (n = 11,645). Both the Cox proportional hazards model and logistic regression analysis were applied as reference models. The resampling method, the Synthetic Minority Over-Sampling Technique-Edited Nearest Neighbor (SMOTE-ENN), was then applied as a preprocessing procedure followed by logistic regression analysis. Performance was evaluated by accuracy, the area under the curve (AUC), the confusion matrix, and the F3 score. Results: The C statistic for the Cox proportional hazards model was 0.807, while the AUC for the logistic regression analysis was 0.700, both comparable to previous studies. With the model trained on the resampled set, 86.3% of patients with ESKD were correctly identified, although at the cost of a high misclassification rate for negative cases. The F3 score was 0.245, much higher than 0.043 for the logistic regression analysis and 0.022 for the Cox proportional hazards model. Conclusion: This study pointed out the imbalanced data structure and its effects on prediction accuracy, which were not thoroughly discussed in previous studies. Using the resampling method, we were able to identify patients at high risk of ESKD better from a clinical perspective, although with the limitation of a high misclassification rate for negative cases. The technique can be widely applied to other clinical topics in which an imbalanced data structure must be considered.
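For readers unfamiliar with the resampling step the abstract describes, this is a hedged, minimal illustration of the SMOTE interpolation idea: a synthetic minority sample is drawn on the segment between a minority point and one of its minority-class neighbours. A real pipeline would use imbalanced-learn's SMOTEENN (which adds the ENN cleaning step); the 2-D data here are invented.

```python
# Toy SMOTE-style oversampling: interpolate between a minority point and one
# of its k nearest minority-class neighbours. Illustration only.
import random

def smote_like(minority, n_new, k=2, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbours of x (excluding x itself)
        neigh = sorted((p for p in minority if p != x),
                       key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)))[:k]
        z = rng.choice(neigh)
        lam = rng.random()                     # interpolation weight in [0, 1)
        synthetic.append(tuple(a + lam * (b - a) for a, b in zip(x, z)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new_points = smote_like(minority, n_new=4)
```

The ENN step of SMOTE-ENN then removes samples whose class disagrees with the majority of their neighbours, cleaning the boundary that the oversampling just densified.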
  • Can deep reinforcement learning improve inventory management? Performance on lost sales, dual-sourcing, and multi-echelon problems

    Gijsbrechts, Joren; Boute, Robert; Van Mieghem, Jan A.; Zhang, Dennis J. (Manufacturing & Service Operations Management, 2022)
    Problem definition: Is deep reinforcement learning (DRL) effective at solving inventory problems? Academic/practical relevance: Given that DRL has successfully been applied in computer games and robotics, supply chain researchers and companies are interested in its potential in inventory management. We provide a rigorous performance evaluation of DRL in three classic and intractable inventory problems: lost sales, dual sourcing, and multi-echelon inventory management. Methodology: We model each inventory problem as a Markov decision process and apply and tune the Asynchronous Advantage Actor-Critic (A3C) DRL algorithm for a variety of parameter settings. Results: We demonstrate that the A3C algorithm can match the performance of the state-of-the-art heuristics and other approximate dynamic programming methods. Although the initial tuning was computationally demanding and time demanding, only small changes to the tuning parameters were needed for the other studied problems. Managerial implications: Our study provides evidence that DRL can effectively solve stationary inventory problems. This is especially promising when problem-dependent heuristics are lacking. Yet, generating structural policy insight or designing specialized policies that are (ideally provably) near optimal remains desirable.
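As a point of reference for the benchmarking described above: the heuristics a DRL agent is compared against in lost-sales settings include simple base-stock policies. The sketch below simulates one such baseline (order up to S each period; unmet demand is lost). The demand stream, costs, and S are illustrative, not the paper's.

```python
# Lost-sales inventory simulation under a base-stock policy with a one-period
# replenishment lead time. Holding cost h per unit-period, penalty p per lost sale.

def simulate_base_stock(demands, S, lead_time=1, h=1.0, p=5.0):
    """Return total holding + lost-sales cost over the demand stream."""
    on_hand, pipeline, cost = S, [0] * lead_time, 0.0
    for d in demands:
        on_hand += pipeline.pop(0)                     # outstanding order arrives
        order = max(0, S - (on_hand + sum(pipeline)))  # order up to S
        pipeline.append(order)
        sales = min(on_hand, d)
        lost = d - sales                               # excess demand is lost
        on_hand -= sales
        cost += h * on_hand + p * lost
    return cost

cost = simulate_base_stock(demands=[3, 5, 2, 7, 4], S=6)
```

A DRL approach such as A3C would instead learn a mapping from the state (on-hand stock plus pipeline) to an order quantity, which matters because the optimal lost-sales policy is not base-stock in general.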
  • Volume flexibility at responsive suppliers in reshoring decisions: Analysis of a dual sourcing inventory model

    Gijsbrechts, Joren; Boute, Robert; Disney, Stephen M.; Van Mieghem, Jan A. (Production and Operations Management, 2022)
    We investigate how volume flexibility, defined by a sourcing cost premium beyond a base capacity, at a local responsive supplier impacts the decision to reshore supply. The buyer also has access to a remote supplier that is cheaper with no restrictions on volume flexibility. We show that with a unit lead time difference between both suppliers, the optimal dual sourcing policy is a modified dual base-stock policy with three base-stock levels Sf2, Sf1, and Ss. The replenishment orders are generated by first placing a base order from the fast supplier of at most k units to raise the inventory position to Sf1, if that is possible. After this base order, if the adjusted inventory position is still below Sf2, additional units are ordered from the fast supplier at an overtime premium to reach Sf2. Finally, if the adjusted inventory position is below Ss, an order from the slow supplier is placed to bring the final inventory position to Ss. Surprisingly, in contrast to single sourcing with limited volume flexibility, a more complex dual sourcing model often results in a "simpler" policy that replaces demand in each period. The latter allows analytical insights into the sourcing split between the responsive and the remote supplier. Our analysis shows how increased volume flexibility at the responsive supplier promotes the decision to reshore operations and effectively serves as a cost benefit. It also shows how investing in base capacity and investing in additional volume flexibility act as strategic substitutes.
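The three-level ordering logic described in the abstract can be sketched directly: a capped base order from the fast supplier up to Sf1, an overtime top-up to Sf2, then a slow-supplier order up to Ss. The numeric values below are illustrative, and the sketch follows only the abstract's wording, not the paper's full model.

```python
# Modified dual base-stock ordering decision with base capacity k at the
# fast supplier, an overtime premium beyond it, and a cheaper slow supplier.

def dual_source_order(ip, Sf1, Sf2, Ss, k):
    """Return (fast base, fast overtime, slow) order quantities
    for inventory position ip, with Sf2 <= Sf1 <= Ss."""
    base = min(max(0, Sf1 - ip), k)   # regular fast order, capped at k units
    ip += base
    overtime = max(0, Sf2 - ip)       # premium units, only if the cap bound
    ip += overtime
    slow = max(0, Ss - ip)            # remote order tops up to Ss
    return base, overtime, slow

# ip = 1 with k = 3: the base order raises the position to 4 (short of
# Sf1 = 6), overtime tops it up to Sf2 = 5, and the slow supplier brings
# the final position to Ss = 10.
orders = dual_source_order(ip=1, Sf1=6, Sf2=5, Ss=10, k=3)
```

Note that overtime is used only when the capacity cap k prevents the base order from reaching Sf2, which is why Sf2 sits at or below Sf1.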
  • Helping organizations and individuals develop conflict wisdom

    Jordaan, Barney (Conflict Resolution Quarterly, 2022)
    Burgess et al. (BBK) propose to address hyper-polarized, society-wide conflicts through what they call a “massively parallel” approach “seeking to cultivate large numbers of independent but mutually reinforcing projects each addressing particular aspects of hyper-polarization in specific contexts.” The authors propose that these goals be pursued in two mutually reinforcing activity streams. The first involves traditional multiparty conflict resolution processes (e.g., community dialogues and multiparty negotiations) conducted under the guidance of third-party interveners. The second addresses the causes of hyperpolarized conflicts, for example, by instituting changes to electoral systems to try to minimize opportunities for hyperpolarization to occur. This commentary focuses on particular aspects of the second stream, that is, addressing what BBK call “the real energy behind hyperpolarized politics.” Chief among these are the emotional triggers that typically fuel conflicts, such as anger, fear, and desperation. Such initiatives include showing people how a better understanding of conflict dynamics can help them defend their legitimate interests, while also pointing out the dangers of allowing conflict to escalate and cause polarization.
  • New summary measures and datasets for the multi-project scheduling problem

    Van Eynde, Rob; Vanhoucke, Mario (European Journal of Operational Research, 2022)
    In recent years, more researchers have devoted their attention to the resource-constrained multi-project scheduling problem, resulting in a growing body of knowledge on solution procedures. A key factor in the comparison of these procedures is the availability of benchmark datasets that cover a large part of the feature space; otherwise, one risks that the conclusions from experiments on these sets do not hold when they are repeated on a different set. In this paper we propose new multi-project datasets that contain instances with a wide variety of characteristics. We first develop several new summary measures that describe three types of portfolio characteristics, two of which are not present in any of the existing datasets. Second, an algorithm is developed that can generate instances with the desired parameter values in a controlled manner. With this procedure, we create three datasets that each focus on one of the characteristics and a fourth dataset that contains all combinations. The computational results show (a) that these sets cover a significantly larger part of the feature space than existing benchmark libraries and (b) that they are more challenging for advanced algorithms.
  • The digital future of internal staffing: A vision for transformational electronic human resource management

    Rogiers, Philip; Viaene, Stijn; Leysen, Jan (Intelligent Systems in Accounting, Finance & Management, 2020)
    Through an international Delphi study, this article explores the new electronic human resource management regimes that are expected to transform internal staffing. Our focus is on three types of information systems: human resource management systems, job portals, and talent marketplaces. We explore the future potential of these new systems and identify the key challenges for their implementation in governments, such as inadequate regulations and funding priorities, a lack of leadership and strategic vision, together with rigid work policies and practices and a change-resistant culture. Tied to this vision, we identify several areas of future inquiry that bridge the divide between theory and practice.
  • Identifying digital transformation paradoxes: A design perspective

    Danneels, Lieselot; Viaene, Stijn (Business & Information Systems Engineering, 2022)
    In turbulent contexts, organizations face contradictory challenges which give rise to management tensions and paradoxes. Digital transformation is one such context where the disruptive potential of digital technologies demands radical responses from existing organizations. While prior research has recognized the importance of coping with organizational paradoxes, little is known about how to identify them. Although it may be apparent in some settings which paradoxes are at play, other more ambivalent contexts require explicit identification. This study takes a design perspective to identify the relevant paradoxes in a digital transformation context. It presents the results of a 2-year action design research study in collaboration with an organization that chose to explicitly focus on paradoxical tensions for managing its digital transformation. The study's main contribution is twofold: (1) it presents design knowledge to identify organizational paradoxes; (2) it provides a better understanding of the organizational paradoxes involved in digital transformation. The design knowledge will help others to identify paradoxes when working with an organization and highlights dynamic and collaborative aspects of the identification process. The study also enhances the descriptive understanding of digital transformation paradoxes by showing the importance of learning and belonging tensions and by expressing a different view on what knowledge about paradoxes is, and how it is created and used.
  • The joint replenishment problem: Optimal policy and exact evaluation method

    Creemers, Stefan; Boute, Robert (European Journal of Operational Research, 2022)
    We propose a new method to evaluate any stationary joint replenishment policy under compound Poisson demand. The method makes use of an embedded Markov chain that only considers the state of the system after an order is placed. The resulting state space reduction allows exact analysis of instances that until now could only be evaluated using approximation procedures. In addition, the size of the state space is not affected if we include nonzero lead times, backlog, and lost sales. We characterize the optimal joint replenishment policy, and use these characteristics to develop a greedy-optimal algorithm that generalizes the can-order policy, a well-known family in the class of joint replenishment policies. We numerically show that this generalized can-order policy only marginally improves the best conventional can-order policy. For sizeable systems with multiple items, the latter can now be found using our exact embedded Markov-chain method. Finally, we use our method to improve and extend the well-known decomposition approach.
  • Variability drivers of treatment costs in hospitals: A systematic review

    Jacobs, Karel; Roman, Erin; Lambert, Jo; Moke, Lieven; Scheys, Lennart; Kesteloot, Katrien; Roodhooft, Filip; Cardoen, Brecht (Health Policy, 2022)
    Objectives: Studies on variability drivers of treatment costs in hospitals can provide the necessary information for policymakers and healthcare providers seeking to redesign reimbursement schemes and improve the outcomes-over-cost ratio, respectively. This systematic literature review, focusing on the hospital perspective, provides an overview of studies on variability in treatment costs, an outline of their study characteristics and cost drivers, and suggestions on future research methodology. Methods: We adhered to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses and the Cochrane Handbook for Systematic Reviews of Interventions. We searched PubMed/MEDLINE, Web of Science, EMBASE, Scopus, CINAHL, ScienceDirect, OvidSP and the Cochrane Library. Two investigators extracted and appraised data for citations until October 2020. Results: 90 eligible articles were included. Patient, treatment and disease characteristics and, to a lesser extent, outcome and institutional characteristics were identified as significant variables explaining cost variability. In one-third of the studies, the costing method was classified as unclear due to the limited explanation provided by the authors. Conclusion: Various patient, treatment and disease characteristics were identified that explain hospital cost variability. The limited transparency on how hospital costs are defined is a remarkable observation for studies in which cost variability is the main focus. Recommendations on the variables, costs, and statistical methods to consider when designing and conducting cost variability studies are provided.
  • A dynamic “predict, then optimize” preventive maintenance approach using operational intervention data

    van Staden, Heletjé E.; Deprez, Laurens; Boute, Robert (European Journal of Operational Research, 2022)
    We investigate whether historical machine failures and maintenance records may be used to derive future machine failure estimates and, in turn, prescribe advancements of scheduled preventive maintenance interventions. We model the problem using a sequential predict, then optimize approach. In our prescriptive optimization model, we use a finite horizon Markov decision process with a variable order Markov chain, in which the chain length varies depending on the time since the last preventive maintenance action was performed. The model therefore captures the dependency of a machine’s failures on both recent failures as well as preventive maintenance actions, via our prediction model. We validate our model using an original equipment manufacturer data set and obtain policies that prescribe when to deviate from the planned periodic maintenance schedule. To improve our predictions for machine failure behavior with limited to no past data, we pool our data set over different machine classes by means of a Poisson generalized linear model. We find that our policies can supplement and improve on those currently applied by 5%, on average.
  • Willingness to disclose personal information in the context of addressable TV advertising. What is the role of personal and situational factors?

    De Schaepdrijver, Leen; Baecke, Philippe; Tackx, Koen (Journal of Advertising Research, 2022)
    The new technology of addressable advertising on TV opens the door to better targeting and measurement of TV advertising campaigns. However, gaining access to consumer data is paramount for this new technology. This article aims to understand consumers’ willingness to disclose personal information in the context of addressable advertising by applying privacy calculus theory. The authors administered a survey to 1,858 participants, examining the influence of both personal and situational factors on consumers’ willingness to disclose information. Personalization value is the strongest antecedent of willingness to disclose data, followed by privacy concerns and institutional trust. Moreover, the authors suggest how situational factors such as type of data and customer benefits—controllable by companies—influence individuals’ willingness to disclose information and how they might balance each other out.
  • A time-driven activity-based costing approach for identifying variability in costs of childbirth between and within types of delivery

    Dubron, Kathia; Verschaeve, Mathilde; Roodhooft, Filip (BMC Pregnancy and Childbirth, 2021)
    Background: Recently, time-driven activity-based costing (TDABC) has been put forward as an alternative, more accurate costing method for calculating the cost of a medical treatment because it allows costs to be assigned directly to patients. The objective of this paper is to apply a time-driven activity-based method to estimate the cost of childbirth at a maternity department. Moreover, this study shows how this costing method can be used to outline how childbirth costs vary according to the considered patient and disease characteristics. Through the use of process mapping, TDABC makes it possible to identify exactly which activities and corresponding resources are impacted by these characteristics, leading to a more detailed understanding of childbirth costs. Methods: A prospective cohort study design was performed in a maternity department. Process maps were developed for two types of childbirth: vaginal delivery (VD) and caesarean section (CS). Costs were obtained from the financial department and capacity cost rates were calculated accordingly. Results: Overall, the cost of childbirth equals €1,894.12 and is mainly driven by personnel costs (89.0%). Monitoring after birth is the most expensive activity on the pathway, costing €1,149.70. Significant cost variations between types of delivery were found, with a VD costing €1,808.66 compared to €2,463.98 for a CS. A prolonged clinical visit (+33.3 min) and monitoring (+775.2 min) in CS were the main contributors to this cost difference. Within each delivery type, age, parity, number of gestation weeks and educational attainment were found to drive cost variations. In particular, for VD an age > 25 years, nulliparity, more than 40 gestation weeks and higher educational attainment were associated with higher costs. Similar results were found within CS for age, parity and number of gestation weeks.
    Conclusions: TDABC is a valuable approach to measure and understand the variability in costs of childbirth and its associated drivers over the full care cycle. Accordingly, these findings can inform health care providers, managers and regulators on process improvements and cost containment initiatives.
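The two TDABC building blocks the abstract relies on are the capacity cost rate (resource cost divided by practical capacity) and the activity cost (time estimate times that rate). The figures below are invented for illustration and are not the study's data.

```python
# Core TDABC arithmetic: a capacity cost rate per minute of resource time,
# applied to the minutes an activity consumes along a patient's pathway.

def capacity_cost_rate(total_cost, practical_capacity_min):
    """Cost per minute of supplying the resource (e.g., a midwifery team)."""
    return total_cost / practical_capacity_min

def activity_cost(minutes, rate):
    """Cost assigned to one activity for one patient."""
    return minutes * rate

rate = capacity_cost_rate(total_cost=90_000.0, practical_capacity_min=120_000)
delivery_cost = activity_cost(minutes=450, rate=rate)   # one hypothetical episode
```

Cost variability analysis then compares such per-patient sums across groups (e.g., VD vs. CS, or by parity), since process maps make each group's activity minutes explicit.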
