Papers
arxiv:2605.08914

Transformer autoencoder with local attention for sparse and irregular time series with application on risk estimation

Published on May 9
Abstract

A Transformer Autoencoder with a local attention mechanism is developed for risk estimation in sparse and irregular time series data, demonstrating superior performance in detecting non-technical losses in electrical power systems compared to existing methods.

AI-generated summary

This paper introduces a framework specifically designed for risk estimation in sparse and irregular time series. It is based on a Transformer Autoencoder with local attention, which leverages the powerful pattern-identification capabilities of transformers, complemented by traditional data cleaning and normalization methods. The discriminative ability of the local attention mechanism lets the model efficiently capture relevant patterns within irregular sequences affected by sparse data collection. The proposed framework is applied to a real-world case study: risk estimation of non-technical losses in electrical power systems across a wide area of Greece. Non-technical losses, primarily stemming from electricity theft, pose significant economic and operational challenges. Detecting these anomalies is particularly difficult because of the inherently sparse and irregular nature of real-world data collection practices. Traditional risk estimation methods struggle to capture long-range dependencies and to handle such data characteristics robustly. We demonstrate that our approach yields highly discriminative latent features, resulting in more consistent risk estimation than existing state-of-the-art and widely used methods. It achieves high recall and precision, meeting the critical objectives of the problem. As such, our solution offers a robust and effective tool for risk detection in irregular time series datasets.
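The summary describes an encoder whose self-attention is restricted to a local window around each time step, so that a reading attends only to its temporal neighbours. A minimal sketch of such windowed attention in NumPy (the window size, array shapes, and weight names here are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def local_attention(x, w_q, w_k, w_v, window=2):
    """Single-head self-attention masked to a local temporal window.

    Each position attends only to positions within `window` steps of
    itself, which limits the influence of distant readings in a sparse,
    irregular series. `x` has shape (T, d).
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v                 # (T, d) projections
    scores = q @ k.T / np.sqrt(k.shape[-1])             # (T, T) similarities
    idx = np.arange(x.shape[0])
    outside = np.abs(idx[:, None] - idx[None, :]) > window
    scores = np.where(outside, -1e9, scores)            # block out-of-window pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # row-wise softmax
    return weights @ v                                  # (T, d) local context

rng = np.random.default_rng(0)
T, d = 8, 4
x = rng.normal(size=(T, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out = local_attention(x, w_q, w_k, w_v, window=2)
print(out.shape)  # (8, 4)
```

In a full autoencoder, stacking such layers in the encoder would produce the latent features the paper scores for risk, with the decoder trained to reconstruct the input sequence.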
