Conference Keynote: Scott F. Breor
Chair: Björn Johansson (Chalmers University of Technology)
Assessing Critical Infrastructure Dependencies and Interdependencies
Today's infrastructure is connected to many other infrastructure assets, systems, and networks on which it depends for normal day-to-day operations. These connections, or dependencies, may be geographically limited or span great distances (NIPP 2013). The many points of infrastructure connections, and their geographic distribution, make the infrastructure environment much more complex. The U.S. Department of Homeland Security (DHS) works to strengthen critical infrastructure security and resilience by generating greater understanding and action across a (largely) voluntary partnership landscape. This is achieved by working with private and public infrastructure stakeholders to resolve infrastructure security and resilience knowledge gaps, inform infrastructure risk management decisions, identify resilience-building opportunities and strategies, and improve information sharing among stakeholders through a collaborative partnership approach. This talk highlights the Department's efforts to present a more comprehensive picture of security and resilience through a system-of-systems approach.
Titans of Simulation
Titan Talk: Peter Frazier
Chair: Sanjay Jain (The George Washington University)
Bridging the Gap from Academic Research to Industry Impact
Academic methodological research is often done with the hope of creating mathematical methods that will be used in practice. At the same time, there is a significant gap between publishing papers and having the methods described actually be used in industry. In this talk, we offer advice for bridging this gap. We discuss challenges arising from a difference in focus between academic and industry research, and also an incomplete awareness within academia of the full context in which methods are deployed in industry. We then discuss strategies for overcoming these challenges, describing them using examples from the presenter's experiences as a data science manager at Uber working on Uber's carpooling product, UberPOOL, and as an academic developing Bayesian optimization algorithms for use at Yelp and the Bayesian optimization startup company SigOpt. This talk is aimed at academics who want their research to be used in industry, soon-to-graduate PhD students who are making a leap into an industry career, and practitioners interested in exploring ways to be more effective.
Titans of Simulation
Titan Talk: Russell Cheng
Chair: Sanjay Jain (The George Washington University)
Creating a Real Impression: Visual Statistical Analysis
Many powerful statistical methods available for studying simulation output are under-appreciated and consequently under-used, because they are considered to be hard to understand, arcanely mathematical, and hard to implement. Such methods can invariably be implemented using data-driven resampling methods, making their underlying rationale quite transparent. There is little need for much formal mathematics, and what there is can be made visually obvious, with method and results explained and presented using figures and graphs, often with dynamic animation. This approach to studying simulation output will be illustrated by a number of examples drawn from simulation studies and real applications. A bonus of the approach is that it is quite easy to create one's own bespoke method of analysis tailored to a particular problem. Such examples will be presented and analyzed in real-time in the talk itself, enabling the results to be immediately displayed.
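The data-driven resampling idea the abstract refers to can be sketched with a minimal percentile bootstrap (an illustrative example, not code from the talk; the waiting-time data are hypothetical):

```python
import random
import statistics

def bootstrap_ci(output, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic of simulation output.

    Resamples the output with replacement, recomputes the statistic on each
    resample, and reads the interval off the sorted resampled statistics.
    """
    rng = random.Random(seed)
    stats = sorted(
        stat([rng.choice(output) for _ in range(len(output))])
        for _ in range(n_resamples)
    )
    lo = stats[int(alpha / 2 * n_resamples)]
    hi = stats[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

# Hypothetical simulation output: observed waiting times from one run
waits = [4.2, 3.8, 5.1, 6.0, 2.9, 4.4, 5.7, 3.3, 4.9, 5.2]
lo, hi = bootstrap_ci(waits)
```

Because each resampled statistic is computed explicitly, the whole procedure can be shown step by step (or animated), which is the transparency the talk emphasizes.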
Track Coordinator - Best Papers: Sanjay Jain (The George Washington University)
Best Applied Paper
Simulation Based Prediction of the Near-Future Emergency Medical Services System State
An ambulance dispatcher decides which ambulances to allocate to new calls, and how to relocate ambulances in order to maintain good coverage. In doing so, it is valuable to have information about the expected future response times in different parts of the area of responsibility, as well as the expected number of available ambulances. We present a simulation model that can be used to predict these quantities, and compare the results to a naïve forecasting model. The results show that while it is difficult to accurately predict the future system state, the simulation-based prediction manages this better than the naïve model.
Best Theoretical Paper
Sampling Uncertain Constraints under Parametric Distributions
We consider optimization problems with uncertain constraints that need to be satisfied probabilistically. When data are available, a common method to obtain feasible solutions for such problems is to impose sampled constraints, following the so-called scenario generation (SG) approach. However, when the data size is small, the sampled constraints may not support a guarantee on the feasibility of the obtained solution. This paper studies how to leverage parametric information and the power of Monte Carlo simulation to obtain feasible solutions even when the data are not sufficient to support the use of SG. Our approach makes use of a distributionally robust optimization formulation that informs the Monte Carlo sample size needed to achieve our guarantee.
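The contrast between scenario generation on small data and scenario generation augmented by parametric Monte Carlo can be illustrated in one dimension (a sketch only, not the paper's formulation; the exponential coefficient distribution, the constraint, and all numbers are hypothetical):

```python
import random

def scenario_solution(sampled_a, b=1.0):
    """Maximize x subject to a_i * x <= b for every sampled coefficient a_i > 0.

    With positive coefficients, the binding sampled constraint is the one
    with the largest coefficient, so the SG solution is b / max(a_i).
    """
    return b / max(sampled_a)

rng = random.Random(0)
# Hypothetical chance constraint: P(a * x <= 1) >= 0.95 with a ~ Exponential(1).
small_data = [rng.expovariate(1.0) for _ in range(10)]    # few real observations
extra = [rng.expovariate(1.0) for _ in range(10000)]      # Monte Carlo draws from a fitted parametric model
x_small = scenario_solution(small_data)                   # small data: weak feasibility guarantee
x_large = scenario_solution(small_data + extra)           # many sampled constraints: more conservative
```

Adding Monte Carlo constraints can only shrink the feasible region, so `x_large <= x_small`: the simulated scenarios buy a stronger feasibility guarantee at the price of conservatism, which is the trade-off the paper's distributionally robust formulation manages.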
Finalists for Best Applied Paper
RH-RT: A Data Analytics Framework for Reducing Wait Time at Emergency Departments and Centres for Urgent Care
Right Hospital Right Time (RH-RT) is the conceptualization of the use of descriptive, predictive, and prescriptive analytics with real-time data from Accident & Emergency (A&E)/Emergency Departments (ED) and centers for urgent care; its objective is to derive maximum value from wait time data by applying data analytics techniques and making the results available to both patients and healthcare organizations. The paper presents an architecture for the implementation of RH-RT that is specific to the authors' current work on a digital platform (NHSquicker) that makes available live waiting times from multiple centers of urgent care (e.g., A&E/ED, Minor Injury Units) in Devon and Cornwall. The focus of the paper is on the development of a Hybrid Systems Model (HSM) comprising healthcare business intelligence, forecasting techniques, and computer simulation. The contribution of the work is the conceptual RH-RT framework and its implementation architecture, which relies on near real-time data from NHSquicker.
Stochastic Optimization for Feasibility Determination: An Application to Water Pump Operation in Water Distribution Network
Water Distribution Networks (WDNs) are a particularly critical infrastructure owing to their high energy costs and frequent failures. Variable Speed Pumps (VSPs) have been introduced to improve the regulation of water pumps, which is key to the overall infrastructure performance. This paper addresses the problem of analyzing the effect of VSP regulation on the pressure distribution of a WDN, which is highly correlated with leakages and energy costs. Because water network behavior can only be simulated, we formulate the problem as black-box feasibility determination, which we solve with a novel stochastic partitioning algorithm, the Feasibility Set Approximation Probabilistic Branch and Bound, which extends an algorithm previously proposed by two of the authors. As the black box we use EPANET, a widely adopted hydraulic simulator. Preliminary results, on theoretical functions as well as a water distribution network benchmark case, show the viability and advantages of the proposed approach.
Ford's Power Train Operations: Changing the Simulation Environment 2
At the 2001 Winter Simulation Conference, the progress of simulation in Ford Motor Company's PowerTrain Manufacturing Engineering (PTME) department was documented in a paper that focused on the contributions of the department in changing the simulation environment at Ford. This paper reviews the progress, changes, and issues experienced in the intervening years by the UK-based PTME simulation team. It summarizes the development of a toolset from a model-building capability to data analysis, experimentation, and results analysis. It outlines how capabilities have expanded while maintaining quick delivery of results; it references management's changing attitude and how academic research has advanced simulation in PTME.
Modeling Bursts in the Arrival Process to an Emergency Call Center
In emergency call centers (for police, firemen, ambulances) a single event can sometimes trigger many incoming calls in a short period of time. Several people may call to report the same fire or the same accident, for example. Such a sudden burst of incoming traffic can have a significant impact on the responsiveness of the call center for other events in the same period of time. We examine data from the SOS Alarm center in Sweden. We also build a stochastic model for the bursts. We show how to estimate the model parameters for each burst by maximum likelihood, how to model the multivariate distribution of those parameters using copulas, and how to simulate the burst process from this model. In our model, certain events trigger an arrival process of calls with a random time-varying rate over a finite period of time of random length.
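A burst with a random time-varying call rate, as described above, can be simulated by thinning (Lewis-Shedler). This is a generic illustrative sketch, not the authors' fitted model; the intensity function and its parameters are hypothetical:

```python
import math
import random

def simulate_burst(t_end, rate_fn, rate_max, seed=1):
    """Thinning: simulate arrival times of a nonhomogeneous Poisson process
    with intensity rate_fn(t) <= rate_max on [0, t_end].

    Candidates come from a homogeneous Poisson process at rate rate_max;
    each candidate at time t is kept with probability rate_fn(t) / rate_max.
    """
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_max)          # next candidate arrival
        if t > t_end:
            return arrivals
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)                  # accepted call

# Hypothetical burst intensity (calls/minute): sharp rise, then decay,
# peaking at t = 1 with value 30/e, so rate_max = 30/e bounds it.
burst_rate = lambda t: 30.0 * t * math.exp(-t)
calls = simulate_burst(t_end=10.0, rate_fn=burst_rate, rate_max=30.0 / math.e)
```

In the paper's setting, the burst-specific parameters (onset, peak height, duration) would themselves be drawn from the fitted copula model before running such a simulation.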
Finalists for Best Theoretical Paper
Gradient Based Criteria for Sequential Design
Computer simulation experiments are commonly used as an inexpensive alternative to real-world experiments to form a metamodel that approximates the input-output relationship of the real-world experiment. While a user may want to understand the entire response surface, they may also want to focus on interesting regions of the design space, such as where the gradient is large. In this paper we present an algorithm that adaptively runs a simulation experiment that focuses on finding areas of the response surface with a large gradient while also gathering an understanding of the entire surface. We consider the scenario where small batches of points can be run simultaneously, such as with multi-core processors.
Exact Posterior Simulation From The Linear LASSO Regression
The current popular method for approximate simulation from the posterior distribution of the linear Bayesian LASSO is a Gibbs sampler. It is well known that the output analysis of an MCMC sampler is difficult due to the complex dependence amongst the states of the underlying Markov chain. Practitioners can usually only assess the convergence of MCMC samplers using heuristics. In this paper we construct a method that yields independent and identically distributed (iid) draws from the LASSO posterior. The advantage of such exact sampling over MCMC sampling is that there are no difficulties with the output analysis of the exact sampler, because all the simulated states are independent. The proposed sampler works well when the dimension of the parameter space is not too large; when the dimension is too large to permit exact sampling, the proposed method can still be used to construct an approximate MCMC sampler.
An Improved Simulation of Hybrid Biological Models with Many Stochastic Events and Quasi-Disjoint Subnets
Hybrid simulation, combining exact and approximate algorithms, provides an alternative to a completely stochastic simulation. However, one challenge for the efficient implementation of hybrid simulations is the additional overhead due to frequent switches between the two regimes. The amount of additional overhead increases considerably with the number of discrete events in the stochastic regime. However, reactions that take place rather frequently cannot be completely avoided, due to accuracy requirements. In this paper, we present an improved hybrid simulation method which takes advantage of the Hybrid Rejection-based Stochastic Simulation Algorithm (HRSSA), a variant of the hybrid simulation approach. To reduce the overhead on account of the switches from the stochastic to the deterministic regime, we analyse and record the dependencies of reactions as well as species between the stochastic and deterministic subnetworks. Comparing our technique with existing ones shows a clear improvement in terms of runtime, while preserving accuracy.
Data-Driven Ranking and Selection: High-Dimensional Covariates and General Dependence
This paper considers the problem of ranking and selection with covariates and aims to identify a decision rule that stipulates the best alternative as a function of the observable covariates. We propose a general data-driven framework to accommodate (i) high-dimensional covariates and (ii) general (nonlinear) dependence between the mean performance of an alternative and the covariates. For both scenarios, we design new selection procedures and provide statistical guarantees by leveraging the data-intensive environment and various statistical learning tools. The performance of our procedures is demonstrated through simulation experiments.
Track Coordinator - Simulation for a Noble Cause: George Miller (MOSIMTEC), David Poza (University of Valladolid)
Simulation for a Noble Cause
Humanitarianism - I
Chair: David Poza (University of Valladolid)
A Systems Modeling Approach to Analyzing Human Trafficking
This paper offers a process for understanding and analyzing the most effective interventions for eliminating human trafficking. The process consists of several methods for gathering, parsing, and testing information about the Overseas Filipino Workers (OFW) economic system as it relates to trafficked persons. The methodology comprises a series of methods that include interviews, causal loop analysis, data collection, a system dynamics simulation model, and scenario simulation runs. The last method, scenario simulation runs, tests the cause-and-effect relationships between government policies, overseas workers' options, hiring companies, and the economy in which they all operate and interact. The importance of this process is that it is robust, repeatable, and efficient in a real-world setting.
Where Are They Headed Next? Modeling Emergent Displaced Camps in the DRC Using Agent-based Models
The paper describes a prototype agent-based model used to predict the spontaneous settlements that can arise among internally displaced forced migrants in the Democratic Republic of the Congo. The internally displaced persons, both in the real world and in the model, are constrained by geographic and social forces that dictate their ability to locate and reach organized camps run by humanitarian organizations, or instead group with others to establish small temporary settlements. This research is of interest to humanitarian response stakeholders who try to locate self-settlements in order to administer humanitarian assistance and prevent further loss of human life.
How Does Humanitarianism Spread? Modeling the Origins of Citizen Initiatives through Norm Diffusion
This paper describes a prototype agent-based model used to explain why and how a norm of humanitarianism diffuses through a population. The model is constructed on norm diffusion theories as a foundation for explaining the emergence of Citizen Initiatives in a humanitarian and development context. We assume that in the model, some agents are already norm adopters (advocates), some have a humanitarian potential that can be activated with persuasion, while others will never adopt the norm of humanitarianism under any condition. In this model, we try to determine whether parameters such as agents' values, thresholds for accepting alternative values, value degradation, and peer pressure affect agents' decisions to become humanitarian activists.
Simulation for a Noble Cause
Humanitarianism - II
Chair: George Miller (MOSIMTEC)
A Management Tool Based on Discrete Event Simulation for Humanitarian Support
Humanitarian aid is material or logistical assistance provided for humanitarian purposes, typically in response to humanitarian crises including natural and man-made disasters. Humanitarian assistance requiring short response time windows, in almost the whole world, may be subject to long queues due to management problems, e.g., lack of control and/or inefficient infrastructure. This work tackles this challenge by proposing a low-cost planning and management model and method based on a discrete-event simulation mirror connected through Web tools to a near or distant management level. The usual configuration of parallel servers (for instance, supported by local RFID monitoring) is implemented by a discrete-event simulation model that is validated by Jackson networks (and vice versa). The results show a flexible model that can identify bottlenecks in advance in order to accommodate traffic flow variations.
Simulation Model and Simulation-based Serious Gaming in Humanitarian Logistics
Humanitarian logistics has recently gained increasing attention from both academics and practitioners. Although various research groups have addressed theoretical and technical developments in humanitarian logistics, only a limited number of those can actually be generalized, extended, accessed, and understood by non-technical practitioners. To tackle these challenges, we develop a simulation model for humanitarian logistics preparedness and a simulation-based serious game to raise awareness and provide accessibility on humanitarian logistics research to a wider audience. The simulation model aims to optimize the network configuration for prepositioning stocks of life-saving goods in Indonesia, while the game aims to provide a risk-free environment where players can craft various strategies to plan and deploy effective humanitarian operations.
Developing an Agent-based Simulation Model of the Use of Different Communication Technologies in Inter-Organizational Disaster Response Coordination
Our research focuses on communications among a variety of organizations that coordinate their rapid responses to catastrophic disasters. Within the context of FEMA's National Response Coordination Center, we constructed an agent-based simulation model of the inter-organizational communications happening via their Web-based Emergency Operations Center, email, phone calls, and face-to-face conversations as support requests were addressed and fulfilled. We developed our model based on FEMA documentation, observations, interviews, and exercise data. In this paper we outline our model development process and provide details about our simulation model to highlight and address some of the particular challenges one faces when developing simulation models of disaster response activities. We describe which specific aspects of communication media and situational factors our model was developed to test, and also present the design and selected results of our first research experiment using this model.
Simulation for a Noble Cause
Chair: George Miller (MOSIMTEC)
Optimal Enterprise Results in the Clinical Research Environment
Even the best scientific minds cannot repeatedly produce desired results when working in sub-optimal systems. Complex enterprises are difficult to understand and manage. Cause-and-effect relationships are often separated in time and space, making real improvements challenging. To understand how complex systems work, it is essential that we employ tools that accurately map and quantify the dynamics that drive results. Computer modeling and simulation (CMAS) is a valuable design, planning, management, and overall analytical decision-support tool for achieving effective and efficient results. CMAS could become a ubiquitous tool in the lengthy and complex environment of the clinical research (CR) enterprise. Without the comprehensive understanding gained by applying CMAS, organizations may continue to be overwhelmed by problems such as unnecessary bottlenecks, high costs, low productivity, and the inability to retain critical staff. The approach explained here may complement or even replace traditional methods when organizations pursue greater enterprise capability.