Maintenance recommendations have been a part of the maintenance decision support system (MDSS) since the winter of 2003-2004. These recommendations resulted from research started in 2002 under a Federal Highway Administration (FHWA) transportation pooled fund study (TPF-5(054)) that evolved into a standalone graphical user interface (GUI) application in 2003. For nearly a decade, the members of the Pooled Fund Study (PFS) leadership team focused on expanding the capabilities of MDSS and the GUI, and on improving the techniques for exchanging data between department of transportation (DOT) users and MDSS. In 2011, the technical panel, comprising representatives from all active DOT members in the PFS, initiated a task to evaluate the performance of the MDSS recommendations.
The initial objective of the assessment of recommendations (AoR) study was to determine whether users accepted or declined the recommendations, and why. To facilitate user input from the field, an easy-to-use interface was added to the GUI that permitted users to select a current recommendation on their route and indicate whether they accepted, conditionally accepted, or declined it. Users were also asked to answer yes or no on whether the MDSS analysis matched the conditions they observed on the route (road temperature, road condition, weather type) and whether MDSS had received the most recent maintenance treatment. Over the next seven years, users entered their assessments of individual recommendations via the GUI. In 2018, users began migrating from the GUI to the WebMDSS application; unfortunately, the AoR module was not implemented in WebMDSS until 2019. The following graph illustrates the user input over the nine-year period of the study.
In the initial years of the AoR study (2011-2015), the performance analysis merely kept statistics on the users’ yes/no answers on the initial condition matches. When users indicated that conditions matched their observations, the recommendation acceptance rate was close to 90%. However, when the users’ observations and the MDSS analysis did not match, the acceptance rate dropped to 40–45%. Starting in 2015, therefore, the AoR study was modified to add an Iteris review of every AoR report with a declined recommendation. For each declined report, an Iteris meteorologist would set the user interface to the same route and time indicated in the user’s report, review the actual weather scenario, and evaluate how the MDSS processing related to those conditions. After five years of these detailed reviews, the Iteris team determined that MDSS provided excellent recommendations when all of the input into MDSS was correct. However, when MDSS received inaccurate data, it tended to generate inappropriate recommendations. The five years of reviews indicated that the sources of inaccurate input were:
1. Primary sources
- Errant MDSS estimate of snowfall
- Errant MDSS estimate of road temperature
- No input of maintenance actions performed on route sent from the field
- Incorrect analysis of road condition (often due to one of the above)
- Inaccurate forecast
- Errant MDSS estimate of dew point temperatures (especially in frost situations)
- Improper MDSS configuration settings of route parameters
2. Secondary sources
- Mismatch between MDSS and actual traffic rates
- Mismatch between how user inputs data and how MDSS interprets the terms used by the user
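The acceptance statistics reported above are essentially a tally of responses conditioned on whether the MDSS analysis matched field observations. A minimal sketch of that tally is shown below; the record layout and the toy data are hypothetical, for illustration only, and do not reflect the actual AoR database schema.

```python
# Hypothetical AoR report records: (analysis_matched, response),
# where response is "accepted", "conditionally accepted", or "declined".
reports = [
    (True, "accepted"), (True, "accepted"), (True, "declined"),
    (False, "accepted"), (False, "declined"), (False, "declined"),
]

def acceptance_rate(reports, matched):
    """Fraction of reports with the given match flag that were accepted,
    counting conditional acceptances as acceptances."""
    subset = [resp for m, resp in reports if m == matched]
    if not subset:
        return None  # no reports in this category
    accepted = sum(resp != "declined" for resp in subset)
    return accepted / len(subset)

print(acceptance_rate(reports, matched=True))   # 2/3 on this toy data
print(acceptance_rate(reports, matched=False))  # 1/3 on this toy data
```

On the real nine-year data set, the same conditional tally would yield the roughly 90% (matched) versus 40–45% (not matched) rates cited above.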
The key finding from the review of the MDSS processing is that the MDSS simulation of road conditions is very complex, and recommendations can change markedly with minor discrepancies between what MDSS “knows” and what actually exists in the field. Improving the performance of MDSS recommendations therefore depends heavily on reducing the effects of the factors listed above.
The AoR program will continue during the 2020-2021 winter. Your participation would help improve our understanding of the performance of the MDSS recommendation process and what factors within MDSS need refinement.