Time Series Analysis with Transformers: Fixed vs Learned Time Representation

✨ What makes time series analysis unique compared to other machine learning approaches is the central role of time representation in shaping experiment design.

In our latest work, we explore two variations of the Transformer architecture:
🔹 One using a fixed time representation proposed in the literature
🔹 One where the time representation is learned directly from data

👉 Read the full article here: https://lnkd.in/dVhnUREE

You can read all our publications on the MANOLO website: https://lnkd.in/dWybAK7w

#MachineLearning #TimeSeries #Transformers #AIResearch #RenewableEnergy #HumanInTheLoop #MANOLOProject
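
To make the distinction concrete, here is a minimal sketch (not the implementation from the article) contrasting a fixed, sinusoidal time representation with one whose parameters are learned from data, in a Time2Vec-like style. It assumes PyTorch, and the module names, dimensions, and usage are illustrative assumptions only.

```python
# Minimal sketch, assuming PyTorch. Contrasts a fixed sinusoidal time
# representation with a learned (Time2Vec-style) one; names are illustrative,
# not the article's actual architecture.
import math
import torch
import torch.nn as nn


class FixedTimeEncoding(nn.Module):
    """Sinusoidal encoding of timestamps, as commonly proposed in the literature."""

    def __init__(self, d_model: int):
        super().__init__()
        # Frequencies follow the classic Transformer positional-encoding schedule.
        inv_freq = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        self.register_buffer("inv_freq", inv_freq)

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (batch, seq_len) timestamps -> (batch, seq_len, d_model)
        angles = t.unsqueeze(-1) * self.inv_freq
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)


class LearnedTimeEncoding(nn.Module):
    """Time representation learned directly from data (Time2Vec-style projection)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.linear = nn.Linear(1, 1)              # learned linear trend component
        self.periodic = nn.Linear(1, d_model - 1)  # learned periodic components

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        t = t.unsqueeze(-1).float()                # (batch, seq_len, 1)
        return torch.cat([self.linear(t), torch.sin(self.periodic(t))], dim=-1)


# Either encoding would typically be added to the value embeddings before the
# Transformer encoder; this just checks that both produce the same shape.
if __name__ == "__main__":
    t = torch.arange(48).float().unsqueeze(0)      # one series of 48 time steps
    fixed = FixedTimeEncoding(d_model=64)(t)
    learned = LearnedTimeEncoding(d_model=64)(t)
    print(fixed.shape, learned.shape)              # torch.Size([1, 48, 64]) twice
```

In the fixed variant the frequencies are set once and never updated, while in the learned variant they are ordinary parameters optimized with the rest of the model, which is the core trade-off the article explores.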
