1. Paper Information
Title: A Time Series Is Worth 64 Words: Long-Term Forecasting With Transformers
Authors: Yuqi Nie (Princeton University), Nam H. Nguyen, Phanwadee Sinthong, Jayant Kalagnanam (IBM Research)
Affiliations: Princeton University, IBM Research
Published: ICLR 2023 (International Conference on Learning Representations)
Link: https://arxiv.org/pdf/2211.14730

2. Background and Motivation (Introduction)
Background: Time series forecasting is ..
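The "64 words" in the title come from splitting the look-back window into patch tokens before feeding them to the Transformer. A minimal NumPy sketch of that patching step, under assumed settings (the `make_patches` helper is hypothetical; the look-back L=512, patch length P=16, stride S=8 configuration follows the paper's commonly cited PatchTST/64 setup):

```python
import numpy as np

def make_patches(series, patch_len=16, stride=8):
    """Split a 1-D series into overlapping patches (PatchTST-style sketch).

    Hypothetical helper, not the authors' code. The end of the series is
    padded by repeating the last value `stride` times so the final values
    are still covered by a patch.
    """
    padded = np.concatenate([series, np.repeat(series[-1], stride)])
    starts = range(0, len(padded) - patch_len + 1, stride)
    return np.stack([padded[s:s + patch_len] for s in starts])

x = np.arange(512, dtype=float)  # look-back window of length L = 512
patches = make_patches(x)
print(patches.shape)             # (64, 16): 64 patch "words", each of length 16
```

Each patch then plays the role of one input token, shrinking the sequence the Transformer attends over from 512 points to 64 tokens.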
1. Paper Information
Title: Attention Is All You Need
Authors: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin
Affiliations: Google Brain, Google Research, University of Toronto
Published: Proceedings of the 31st Conference on Neural Information Processing Systems (NeurIPS 2017)
Link: https://arxiv.org/pdf/1706.03762

2. Background and Motivation (Introduction)
Existing sequence transduction models (Seque..
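The paper's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V. A minimal NumPy sketch (the function name and the toy shapes are illustrative, not the paper's code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V  -- the paper's scaled dot-product attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                   # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))  # 4 query vectors, d_k = 8
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8): one output vector per query
```

The 1/√d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.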
1. Paper Information
Title: An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
Authors: Shaojie Bai, J. Zico Kolter, Vladlen Koltun
Affiliations: Carnegie Mellon University, Intel Labs
Link: https://arxiv.org/pdf/1803.01271

2. Motivation (Introduction)
Background: RNN (Recurrent Neural Network)-based models (LSTM, GRU, etc.) have traditionally been the standard approach to sequence data, for the following reason: an RNN summarizes past information into a hidden state (hidd..
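The RNN rationale above can be sketched in a few lines: the hidden state h_t = tanh(W_x·x_t + W_h·h_{t-1} + b) is a running summary of everything seen up to step t (toy vanilla-RNN code for illustration, not the paper's implementation):

```python
import numpy as np

def rnn_forward(xs, W_x, W_h, b):
    """Vanilla RNN: h_t = tanh(W_x x_t + W_h h_{t-1} + b).

    Toy sketch. The single hidden vector h is updated at every step,
    so the final h summarizes the entire sequence.
    """
    h = np.zeros(W_h.shape[0])
    for x_t in xs:                       # process the sequence one step at a time
        h = np.tanh(W_x @ x_t + W_h @ h + b)
    return h                             # summary of the whole sequence

rng = np.random.default_rng(0)
xs = rng.normal(size=(10, 3))            # sequence of 10 inputs, each 3-dim
h = rnn_forward(xs,
                rng.normal(size=(5, 3)),  # input-to-hidden weights
                rng.normal(size=(5, 5)),  # hidden-to-hidden weights
                np.zeros(5))
print(h.shape)  # (5,): fixed-size summary regardless of sequence length
```

This strictly sequential update is exactly what the paper contrasts with convolutional (TCN) models, which process all time steps in parallel.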