RE: LeoThread 2025-02-09 12:40

in LeoFinance · 4 months ago

Part 4/8:

  • Multi-tasking: using a single unified model across tasks, where applicable, improves performance on each of them.

The VQ-VAE Tokenization Process

Central to the Totem methodology is VQ-VAE (vector-quantized variational autoencoder) tokenization. This process first encodes the time series into a continuous representation, then quantizes that representation into discrete tokens.

  1. Input Time Series: The original time series is split into chunks according to a fixed compression factor and passed to the encoder.

  2. Learning Representations: The encoder produces a D-dimensional latent representation for each chunk, which is tokenized by matching it against entries in a discrete codebook.

  3. Reconstruction: The discretized representations are passed through a decoder to regenerate the time series output.
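The three steps above can be sketched in a few lines of NumPy. This is a toy illustration, not the Totem implementation: the shapes (T, F, D, K) and the random linear projections standing in for the learned encoder and decoder are all assumptions made here for clarity. Only the structure (chunk → encode → nearest-codebook lookup → decode) mirrors the described process.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, chosen for illustration only.
T = 96   # time series length
F = 4    # compression factor: each latent covers F time steps
D = 8    # dimensionality of each latent representation
K = 32   # number of discrete codes in the codebook

# 1. Input time series, split into T/F chunks and encoded.
#    A fixed random projection stands in for the learned encoder.
x = np.sin(np.linspace(0, 6 * np.pi, T))      # toy input series
W_enc = rng.normal(size=(F, D))               # stand-in encoder weights
z = x.reshape(T // F, F) @ W_enc              # (T/F, D) continuous latents

# 2. Tokenization: snap each D-dimensional latent to its nearest
#    codebook vector (vector quantization).
codebook = rng.normal(size=(K, D))            # learned in practice
dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
tokens = dists.argmin(axis=1)                 # (T/F,) discrete token ids
z_q = codebook[tokens]                        # quantized latents

# 3. Reconstruction: a stand-in decoder maps each quantized latent
#    back to F time steps, regenerating a series of the original length.
W_dec = rng.normal(size=(D, F))               # stand-in decoder weights
x_hat = (z_q @ W_dec).reshape(T)

print(tokens.shape, x_hat.shape)              # (24,) (96,)
```

In the real method the encoder, decoder, and codebook are trained jointly so that the reconstruction matches the input; here the weights are random, so only the data flow, not the output quality, is meaningful.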