GPT-2-Augmented Sequence Modeling for Short-Term Load Forecasting
ID: 77 · Access: attendees only · Updated: 2025-10-11 22:47:12


Abstract
Load forecasting serves as the foundation of power system operation and planning. Accurate load forecasting ensures secure and reliable power system operation, reduces generation costs, and improves economic efficiency. Recent studies show that large language models (LLMs) exhibit powerful pattern-recognition and reasoning capabilities over complex token sequences. The key challenge lies in aligning the temporal patterns of time-series data with the linguistic structures of natural language so that these capabilities can be leveraged. This paper proposes an LLM-based time-series forecasting method for electrical load prediction. The approach builds on a pre-trained GPT-2 (Generative Pre-trained Transformer 2) model, freezing the parameters of its self-attention and feed-forward layers and fine-tuning only the input embedding layer and the output projection layer. Experimental results demonstrate that the proposed method performs comparably to or better than existing approaches across multiple electrical load forecasting tasks.
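The selective fine-tuning scheme described above (freeze the Transformer blocks, tune only the input embedding and output projection) can be sketched as follows. This is a minimal illustration, not the authors' code: parameter names follow the Hugging Face GPT-2 convention (`wte`, `wpe`, `h.<i>.attn`, `h.<i>.mlp`, `ln_f`), and `out_proj` is a hypothetical name for a task-specific output projection head, since stock GPT-2 ties its output head to the token embedding.

```python
# Sketch (assumed naming): decide which GPT-2 parameter groups stay
# trainable under the paper's scheme -- self-attention and feed-forward
# blocks frozen, only the input embeddings and an output projection tuned.
# "out_proj" is a hypothetical task head added for load forecasting.

TRAINABLE_TOP_LEVEL = ("wte", "wpe", "out_proj")  # embeddings + output projection

def is_trainable(param_name: str) -> bool:
    """Return True if this parameter is fine-tuned, False if frozen."""
    top = param_name.split(".")[0]
    # Everything else (h.* transformer blocks, ln_f, ...) stays frozen.
    return top in TRAINABLE_TOP_LEVEL

if __name__ == "__main__":
    example_params = [
        "wte.weight",               # token embedding    -> tuned
        "wpe.weight",               # position embedding -> tuned
        "h.0.attn.c_attn.weight",   # self-attention     -> frozen
        "h.0.mlp.c_fc.weight",      # feed-forward       -> frozen
        "ln_f.weight",              # final layer norm   -> frozen
        "out_proj.weight",          # output projection  -> tuned
    ]
    for name in example_params:
        print(f"{name}: {'tuned' if is_trainable(name) else 'frozen'}")
```

In a PyTorch training loop, the same predicate would set `param.requires_grad = is_trainable(name)` over `model.named_parameters()`, so the optimizer only updates the embedding and projection weights.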
 
Keywords
Load forecasting, time-series data, LLM, GPT-2, Fine-tuning
Presenter
Xu Kun
Graduate student, Southeast University

Authors
Xu Kun, Southeast University
Wang Ying, Southeast University
Important Dates
  • Conference dates: November 7–9, 2025
  • October 12, 2025: first-draft submission deadline
  • October 20, 2025: registration deadline

Organizer
IEEE IAS Student Branch, Southwest Jiaotong University
Co-organizers
School of Electrical Engineering, Southwest Jiaotong University
SPACI Vehicle-Grid Interaction Laboratory
Power System Stability and HVDC Transmission Research Team, Sichuan University