Excerpt from course description

Time Series Analysis and Sequential Data Networks

Introduction

Please note that this is a preliminary course description. The final version will be published in June 2026.

Sequence data, in the form of time series or text, is essential in most business domains. For example, Gross Domestic Product is measured across time and is the most widely used indicator of a nation’s economic activity. Similarly, written text is naturally ordered, and investors use companies’ financial statements to gauge profitability, growth prospects, and risk.

In this course you will be given a thorough introduction to classical time series analysis, trend and cycle decompositions, the use of state-space and factor models, and the celebrated Kalman Filter. These tools are used heavily in applied economics and finance, as well as in many other domains where time series data arise.

You will also learn to use text as data. Standard Natural Language Processing (NLP) concepts and techniques are covered alongside recurrent neural networks and more modern Transformer-based models. These architectures underlie recent advances in NLP and Large Language Models (LLMs). Moreover, since both text and time series are sequence data, these methodologies can be used to model both modalities.

The course is organized as a blend of theoretical concepts and hands-on practical exercises. Prior exposure to programming, statistics, and machine learning is required.

Course content

  • Introduction to sequence data
    • Time series
    • Text
  • Time series processes and trend and cycle decompositions
    • Univariate and multivariate time series processes
    • Seasonality
    • Trend and cycle decompositions
  • Signal extraction and dimension reduction
    • State-space models
    • Kalman Filtering
    • Dimension reduction and factor modeling
  • Prediction
    • Evaluation
    • Forecast combination
  • Natural Language Processing
    • Text processing, tokenization, and language modelling
    • Simple algorithms
  • Sequential data networks
    • Recurrent Neural Networks
    • Transformer-based architectures
    • Large Language Model fundamentals
    • Multimodal learning


Disclaimer

This is an excerpt from the complete course description. If you are an active student at BI, you can find the complete course description, with information on e.g. learning goals, learning process, curriculum, and exam, at portal.bi.no. We reserve the right to make changes to this description.