Chapter 154: Fourier Neural Operator (FNO) in Algorithmic Trading

Overview

The Fourier Neural Operator (FNO) is a groundbreaking neural network architecture originally designed to solve parametric Partial Differential Equations (PDEs) far faster than traditional solvers. In quantitative finance, PDEs arise naturally in derivative pricing (e.g., the Black-Scholes equation) and in models of order book dynamics.

While Transformers and RNNs operate directly in the time domain, FNO operates in the frequency domain: it learns mappings between infinite-dimensional function spaces. For a financial time series, this means taking a sequence of prices, decomposing it into its constituent frequencies (sine and cosine waves), applying learnable complex weights to the most important low frequencies, and converting the result back into the time domain.

By filtering out high-frequency noise and focusing on the underlying structural waves of the market, FNO performs well at predicting macro trend reversals, solving option pricing PDEs, and interpreting Limit Order Book (LOB) imbalances, often matching the accuracy of traditional PDE solvers while being reported as up to $1000\times$ faster.

Table of Contents

  1. Mathematical Foundations of FNO
  2. Why FNO for Trading?
  3. Implementation Details (Python)
  4. Implementation Details (Rust)
  5. Backtesting Methodology
  6. References

Mathematical Foundations of FNO

Standard neural networks map finite-dimensional vectors to vectors ($\mathbb{R}^n \rightarrow \mathbb{R}^m$). Operators, by contrast, map functions to functions. FNO relies on the convolution theorem: convolution in the time domain is equivalent to pointwise multiplication in the frequency domain, so a global convolution can be learned as a set of per-frequency weights.

  1. Transform to Frequency Domain: $\mathcal{F}(x)$ converts the input sequence $x$ into its complex frequency components.
  2. Learnable Linear Transformation: In the frequency domain, the signal is multiplied by a complex weight tensor $R$, keeping only the lowest $K$ modes (frequencies); this acts as an implicit low-pass filter over market noise: $\hat{y} = R \cdot (\mathcal{F}x)_{k \le K}$
  3. Inverse Fourier Transform: $\mathcal{F}^{-1}$ converts the modified frequency spectrum back into the time domain.
  4. Residual Connection & Activation: $y = \sigma\!\left(\mathcal{F}^{-1}(\hat{y}) + W x\right)$, where $W$ is a pointwise linear map applied in the time domain and $\sigma$ is a nonlinearity.

The parameters in $R$ are resolution-invariant, meaning a model trained on 1-hour bars can theoretically be evaluated on 5-minute bars without retraining.
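The four steps above can be sketched as a PyTorch module. This is a minimal, hypothetical rendering of the spectral layer (the name `SpectralConv1d` follows the chapter's file listing, but the repository's exact implementation may differ):

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """Sketch of the FNO spectral layer: FFT, truncated complex multiply, inverse FFT."""

    def __init__(self, in_channels: int, out_channels: int, modes: int):
        super().__init__()
        self.modes = modes  # K: number of low-frequency modes kept
        scale = 1.0 / (in_channels * out_channels)
        # Learnable complex weights R: one (in x out) matrix per retained mode
        self.weights = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, sequence_length)
        batch, _, n = x.shape
        x_ft = torch.fft.rfft(x, dim=-1)               # step 1: to frequency domain
        out_ft = torch.zeros(batch, self.weights.shape[1], n // 2 + 1,
                             dtype=torch.cfloat, device=x.device)
        # step 2: multiply only the lowest K modes by R (implicit low-pass filter)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bik,iok->bok", x_ft[:, :, :self.modes], self.weights
        )
        return torch.fft.irfft(out_ft, n=n, dim=-1)    # step 3: back to time domain
```

A full FNO block would wrap this layer in step 4's residual connection, e.g. `y = torch.relu(spectral(x) + pointwise(x))` with `pointwise = nn.Conv1d(in_ch, out_ch, 1)`. Because the weights live on frequency modes rather than grid points, the same layer can be evaluated on sequences of different lengths, which is the resolution-invariance property noted above.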

Why FNO for Trading?

  1. Noise Filtering: Financial data is notoriously noisy. By truncating high-frequency modes in the Fourier space, the model inherently acts as an adaptive low-pass filter, focusing on true momentum and structural regime shifts.
  2. Option Pricing: Once trained on a family of PDE parameters, a single forward pass approximates solutions of the Black-Scholes PDE and fractional jump-diffusion models near-instantly, bypassing heavy Monte Carlo or finite-difference simulations.
  3. Global Receptive Field: Standard convolutions (CNNs) have a local window. FNO has a global convolution property. A single FNO layer processes the entire historical sequence simultaneously across all time steps.
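The noise-filtering point can be demonstrated directly with NumPy: zeroing all but the lowest Fourier modes of a noisy series recovers the underlying low-frequency structure. The signal frequencies and noise level below are illustrative assumptions, not market-calibrated values:

```python
import numpy as np

# Low-pass filtering by Fourier-mode truncation: the mechanism an FNO layer
# learns to exploit when its modes are restricted to the lowest K frequencies.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512, endpoint=False)
clean = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)  # "macro cycles"
noisy = clean + 0.8 * rng.standard_normal(t.size)                    # market noise

spectrum = np.fft.rfft(noisy)
K = 16                      # keep only the lowest K modes
spectrum[K:] = 0.0
denoised = np.fft.irfft(spectrum, n=t.size)

# The truncated reconstruction sits far closer to the clean signal than the input
err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
print(err_denoised < err_noisy)  # expect True
```

Since both signal components (frequencies 3 and 7) fall below the cutoff $K = 16$, truncation discards almost only noise energy, which is why the reconstruction error drops sharply.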

Implementation Details (Python)

We implement the FNO architecture natively in PyTorch utilizing torch.fft.rfft.

  • model.py: Contains the SpectralConv1d and FNO1d modules.
  • train.py: Generates synthetic market data with underlying macro-frequency patterns (simulating business cycles) heavily overlaid with Gaussian noise, training the FNO to extract the clean signal.
  • backtest.py: Runs a rigorous validation simulating how a frequency-based model performs over sequential out-of-sample trading windows.
  • notebooks/example.ipynb: Interactive visualization of the Fourier transforms.
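A hedged sketch of the kind of generator train.py describes: a few low-frequency "business cycle" sinusoids overlaid with Gaussian noise, yielding (noisy input, clean target) pairs for supervised denoising. The frequencies, amplitudes, and noise level are illustrative, not the repository's exact parameters:

```python
import numpy as np

def make_pair(n=512, n_cycles=3, noise_std=0.7, rng=None):
    """Return one (noisy, clean) training pair of synthetic market data."""
    rng = rng or np.random.default_rng()
    t = np.linspace(0.0, 1.0, n, endpoint=False)
    clean = np.zeros(n)
    for _ in range(n_cycles):
        freq = rng.integers(1, 8)               # low frequencies: macro cycles
        amp = rng.uniform(0.5, 1.5)
        phase = rng.uniform(0.0, 2.0 * np.pi)
        clean += amp * np.sin(2 * np.pi * freq * t + phase)
    noisy = clean + noise_std * rng.standard_normal(n)
    return noisy, clean                          # model input, regression target

noisy, clean = make_pair(rng=np.random.default_rng(42))
print(noisy.shape, clean.shape)  # (512,) (512,)
```

Training the FNO on such pairs with an MSE loss forces the spectral weights to pass the macro-frequency structure and suppress the broadband noise.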

Run Python Stack:

python python/model.py
python python/train.py

Implementation Details (Rust)

For high-frequency or latency-critical environments, we port the spectral convolution logic to Rust, using rustfft to perform Fast Fourier Transforms over limit order book streams. A custom implementation of complex multiplication ensures the output matches the mathematical specification exactly.

  • rust/src/lib.rs: Defines the low-level FNO pass.
  • rust/src/main.rs: Execution binary validating discrete Fourier math predictions.

Run Rust Stack:

cd rust
cargo run

Backtesting Methodology

When applying FNOs to time-series data, it is crucial to compute the transform only over strictly historical windows $[t-H, t]$. Unlike images, market data is causal: if the Fourier window includes any bar beyond $t$, lookahead bias leaks into the prediction. The backtest.py script enforces this by using strictly rolling slices of past data.
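The causality constraint can be sketched as a walk-forward loop in which every Fourier window ends at the decision time. Here `predict` is a hypothetical stand-in for a trained FNO forward pass; its mode cutoff and the window length are illustrative:

```python
import numpy as np

def predict(window: np.ndarray) -> float:
    """Toy predictor: sign of the low-pass-filtered trend over the window."""
    spectrum = np.fft.rfft(window - window.mean())
    spectrum[8:] = 0.0                        # low-pass, as in the FNO layer
    smoothed = np.fft.irfft(spectrum, n=window.size)
    return float(np.sign(smoothed[-1] - smoothed[-2]))

def walk_forward(prices: np.ndarray, history: int = 128) -> np.ndarray:
    """One signal per out-of-sample step; each window is strictly historical."""
    signals = []
    for t in range(history, prices.size):
        window = prices[t - history:t]        # past bars only: no bar >= t enters
        signals.append(predict(window))
    return np.array(signals)

# Random-walk prices stand in for real market data
prices = np.cumsum(np.random.default_rng(1).standard_normal(300)) + 100.0
signals = walk_forward(prices)
print(signals.shape)  # (172,): one signal per step after the initial history
```

Because the slice `prices[t - history:t]` never touches index `t` or beyond, the FFT can never mix future information into the spectrum, which is exactly the leakage mode the paragraph above warns against.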


References

  1. Li, Z., Kovachki, N., Azizzadenesheli, K., Liu, B., Bhattacharya, K., Stuart, A., & Anandkumar, A. (2020). Fourier Neural Operator for Parametric Partial Differential Equations. arXiv:2010.08895.
  2. De Castro, et al. Physics-Informed Neural Networks and FNOs in Modern Finance.