With over 1,500 intersections in the City and County of Denver and multiple incidents a day at different times, how could the Operations Center determine what was “normal” traffic and what was an anomaly? Which of the 1,500 intersection monitors should be watched, and when, to help identify traffic disruptions and improve traffic flow and safety?
The MOST Team designed data-analytic modeling algorithms to compute baseline models of expected traffic conditions and flow (Fingerprints). Statistically “normal” data characteristics were defined and calculated in 10-minute increments over configurable time frames ranging from days to weeks, months, or years.
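As a rough illustration of the Fingerprint idea, the sketch below bins historical detector counts into 10-minute time-of-day slots and stores the mean and standard deviation for each slot. This is a minimal example, not the MOST Team's production code; the pandas-based approach and the column names are assumptions.

```python
import pandas as pd

def build_fingerprint(history: pd.DataFrame) -> pd.DataFrame:
    """Compute a per-intersection baseline ("Fingerprint") from historical counts.

    Assumes `history` has columns:
      timestamp    - observation time (datetime)
      intersection - intersection identifier
      volume       - vehicle count observed in that interval
    """
    df = history.copy()
    # Bin each observation into a 10-minute slot within the week
    # (day of week + 10-minute time-of-day bucket).
    df["dow"] = df["timestamp"].dt.dayofweek
    df["slot"] = (df["timestamp"].dt.hour * 60 + df["timestamp"].dt.minute) // 10
    # "Normal" behavior for each slot is summarized by its mean and standard deviation.
    fingerprint = (
        df.groupby(["intersection", "dow", "slot"])["volume"]
        .agg(["mean", "std"])
        .reset_index()
    )
    return fingerprint
```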
Real-time traffic data characteristics (Snapshots) are then compared against the normalized historical characteristics and flow captured in the Fingerprints; the comparison is run both over historical data and in real time.
This comparison identifies statistically “out of norm” situations, and out-of-norm thresholds drive real-time anomaly alerting.
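One way to realize the Snapshot-versus-Fingerprint comparison is a z-score test against the baseline for the matching 10-minute slot. The threshold value and function below are illustrative assumptions, not the deployed alerting logic.

```python
import pandas as pd

def flag_anomalies(snapshot: pd.DataFrame,
                   fingerprint: pd.DataFrame,
                   threshold: float = 3.0) -> pd.DataFrame:
    """Compare real-time Snapshot rows to the Fingerprint and flag out-of-norm slots.

    `snapshot` carries the same columns as the history (timestamp, intersection,
    volume); `threshold` is the number of standard deviations treated as
    "out of norm" (the value here is assumed for illustration).
    """
    snap = snapshot.copy()
    snap["dow"] = snap["timestamp"].dt.dayofweek
    snap["slot"] = (snap["timestamp"].dt.hour * 60 + snap["timestamp"].dt.minute) // 10
    merged = snap.merge(fingerprint, on=["intersection", "dow", "slot"], how="left")
    # z-score of the observed volume relative to the historical baseline.
    merged["z"] = (merged["volume"] - merged["mean"]) / merged["std"]
    merged["anomaly"] = merged["z"].abs() > threshold
    return merged[merged["anomaly"]]
```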
Traffic behavior data is modeled across defined intersections and corridors.
MOST Programming automates data collection, aggregates historical data, and provides anomaly alerting on real-time traffic signal data.
Visual dashboards are designed to reflect traffic behavior.
Anomalies are identified in real time and routed to traffic engineers and the Denver Operations Center for action.
Traffic flow can be analyzed over time and recurring anomalies identified, so that action can be taken to improve flow along corridors and at highly congested intersections.
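For longer-term analysis, anomaly records can be rolled up by corridor and time of day to surface locations with recurring out-of-norm behavior. The rollup below is one possible approach, assuming an anomaly table produced as in the earlier sketch and a hypothetical mapping of intersections to corridors.

```python
import pandas as pd

def recurring_hotspots(anomalies: pd.DataFrame,
                       corridors: pd.DataFrame,
                       min_count: int = 10) -> pd.DataFrame:
    """Count anomalies per corridor and 10-minute slot to find recurring problems.

    `corridors` maps intersection to corridor (assumed columns: intersection,
    corridor). `min_count` is an assumed cutoff for how many anomalies make a
    slot worth an engineer's attention.
    """
    joined = anomalies.merge(corridors, on="intersection", how="left")
    counts = (
        joined.groupby(["corridor", "dow", "slot"])
        .size()
        .reset_index(name="anomaly_count")
    )
    return counts[counts["anomaly_count"] >= min_count].sort_values(
        "anomaly_count", ascending=False
    )
```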
Traffic Signal Priority (TSP) requests are analyzed to predict the timing of priority signals that least disrupts traffic at intersections along bus routes and snow plow routes.
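The same Fingerprints can feed TSP analysis. The sketch below ranks 10-minute slots along a route by expected cross traffic, under the simplifying assumption that lower baseline volume means less disruption when priority is granted; the function and route representation are illustrative, not the actual TSP model.

```python
import pandas as pd

def rank_tsp_slots(fingerprint: pd.DataFrame,
                   route_intersections: list[str]) -> pd.DataFrame:
    """Rank 10-minute slots along a bus or snow plow route by expected disruption.

    Uses the Fingerprint means as a proxy for traffic that would be held up:
    lower total baseline volume across the route's intersections suggests a slot
    where granting signal priority disrupts the fewest vehicles.
    """
    route = fingerprint[fingerprint["intersection"].isin(route_intersections)]
    disruption = (
        route.groupby(["dow", "slot"])["mean"]
        .sum()
        .reset_index(name="expected_volume")
    )
    # Slots with the smallest expected volume are the least disruptive candidates.
    return disruption.sort_values("expected_volume")
```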
Anomaly visualization example using Power BI