Time series data are inherently functions of time; however, current transformers often learn time series by modeling them as mere concatenations of time periods, overlooking their functional properties. In this work, we propose a novel objective for transformers that learn time series by reinterpreting them as temporal functions. We build an alternative sequence of time series by applying degradation operators of different strengths in function space, creating augmented variants of the original sample that are abstracted or simplified to varying degrees. Based on this set of generated sequences, we train an autoregressive transformer that progressively recovers the original sample from the most simplified variant. Analogous to the next-word prediction task in language, which learns narratives by connecting different words, our autoregressive transformer aims to learn the Narratives of Time Series (NoTS) by connecting different features in time. Theoretically, we justify the construction of the alternative sequence through its advantages in function approximation. When learning time series with transformers, constructing sequences of temporal functions allows a broader class of functions (e.g., differentiation) to be approximated compared to sequences of time periods, leading to a 26% performance improvement in synthetic feature regression experiments. Experimentally, we validate NoTS on 3 different tasks across 22 real-world datasets, where we show that NoTS significantly outperforms other pre-training methods by up to 6%. Furthermore, combining NoTS with existing transformer architectures consistently improves performance. Our results demonstrate the potential of NoTS as a general-purpose dynamics learner, offering a viable alternative for developing foundation models for time series analysis.
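To make the sequence construction concrete, below is a minimal sketch of the degradation-based "narrative" described above. It assumes a simple moving-average smoother as the degradation operator; the operator choice, the strength schedule, and the helper names (`degrade`, `build_narrative`) are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal sketch of a NoTS-style alternative sequence (assumptions noted above).
import numpy as np

def degrade(x: np.ndarray, strength: int) -> np.ndarray:
    """Hypothetical degradation operator: moving-average smoothing whose
    window grows with `strength`, yielding increasingly simplified variants."""
    if strength == 0:
        return x.copy()
    window = 2 * strength + 1
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def build_narrative(x: np.ndarray, strengths=(8, 4, 2, 0)) -> list:
    """Order the augmented variants from most simplified to the original,
    forming the alternative sequence the autoregressive model consumes."""
    return [degrade(x, s) for s in strengths]

# Autoregressive training pairs: predict the next (less degraded) variant
# given all variants seen so far, ending with the original sample.
x = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * np.random.randn(512)
narrative = build_narrative(x)
pairs = [(narrative[: i + 1], narrative[i + 1]) for i in range(len(narrative) - 1)]
```

Each pair plays the role of a "context, next token" example, with progressively less simplified variants standing in for the next word in a sentence.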