

XAI techniques applied to a deep learning model used to check real-time multivariate time series from connected black boxes installed on cars.
The prediction of human mobility is enabled by the large and diversified availability of data. Insurance companies leverage such spatio-temporal information to develop effective deep learning-based approaches and provide top-quality services to their customers. In Hidden Insurance Company Name (HICN), an automatic decision-making model checks real-time multivariate time series and raises an alert when a car crash happens, so that an HICN operator can call the customer and provide first assistance. However, the model in use is highly sensitive and not interpretable, which may lead the operator to call customers even when no crash has occurred and the alert was triggered only by a harsh deviation or a bumpy road. Our goal is to tackle the problem of interpretability for car crash prediction and to propose an eXplainable Artificial Intelligence (XAI) workflow that provides insights into the logic behind the deep learning predictive model adopted by HICN. We reach this goal by building an interpretable alternative to the current opaque model that also increases classification precision and reduces training data usage and prediction time.
The name of the industry partner (an insurance company) is temporarily hidden due to double-blind submission constraints.