*Rant for the beginning of the article ahead.*
Why in the name of god did they try to bring LLMs into the picture? Saying AI/ML is good enough for predictive maintenance tasks, but noooo, it has to be an LLM. If they want to be specific, then don't be misleading: I think what they actually mean is the attention layer/operation commonly used in LLMs, applied to capture patterns in time series data. I understand that recurrent-style neural networks and LSTMs have their limitations. And I agree that exploring attention for time series data is an interesting research direction, but LLMs? Just no.
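To make the distinction concrete, here's a minimal sketch of what "attention for time series" actually means: a single self-attention layer over sensor readings, with no language model anywhere in sight. The dimensions and the PyTorch setup are my own illustrative assumptions, not anyone's published architecture.

```python
# Minimal sketch: self-attention over a (hypothetical) sensor time series.
# Dimensions are made up for illustration.
import torch
import torch.nn as nn

batch, seq_len, d_model = 8, 128, 64  # 128 timesteps of 64-dim sensor features

attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=4, batch_first=True)

x = torch.randn(batch, seq_len, d_model)  # stand-in for embedded sensor readings

# Self-attention: query, key, and value are all the same sequence, so every
# timestep can attend to every other timestep -- no recurrence involved.
out, weights = attn(x, x, x)

print(out.shape)      # torch.Size([8, 128, 64])
print(weights.shape)  # torch.Size([8, 128, 128]) -- timestep-to-timestep scores
```

That's the whole idea: the attention operation lets the model relate any two timesteps directly, which is exactly the limitation of recurrent processing it addresses. Nothing about it requires a language model.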
Research and development is tricky because you never know how much more progress you'll need before reaching a satisfying result.