
ShuffleSoftSort: Scalable Permutation Learning with Only $N$ Parameters


This is the official repository for the EUSIPCO 2025 paper: "Permutation Learning with Only N Parameters: From SoftSort to Self-Organizing Gaussians".

Kai Uwe Barthel¹, Florian Barthel², Peter Eisert²
(¹ HTW Berlin, Germany; ² Fraunhofer HHI / HU Berlin, Germany)


💡 The Problem: Scalability in Permutation Learning

  • While Gumbel-Sinkhorn is powerful, its $O(N^2)$ parameter complexity makes it impractical for large datasets.
  • Conversely, SoftSort is memory-efficient but its one-dimensional nature makes it unsuitable for complex, multidimensional data.
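For reference, the SoftSort relaxation the paper builds on can be sketched in a few lines. This is a framework-agnostic NumPy version of the operator (the repository itself uses PyTorch); the temperature `tau` controls how close the output is to a hard permutation matrix:

```python
import numpy as np

def softsort(s, tau=1.0):
    """SoftSort relaxation: returns a row-stochastic (n, n) matrix that
    approaches the permutation sorting s in descending order as tau -> 0."""
    s = np.asarray(s, dtype=float)
    s_sorted = np.sort(s)[::-1]                            # descending order
    logits = -np.abs(s_sorted[:, None] - s[None, :]) / tau
    logits -= logits.max(axis=-1, keepdims=True)           # numerically stable softmax
    e = np.exp(logits)
    return e / e.sum(axis=-1, keepdims=True)

P = softsort([0.1, 0.9, 0.5], tau=0.01)
print(P.argmax(axis=-1))   # rows pick indices 1, 2, 0: the descending order of s
```

Because the operator only needs the $N$ score values $s$, its parameter count is linear; the quadratic cost of Gumbel-Sinkhorn comes from learning a full $N \times N$ matrix instead.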

🚀 Our Solution: ShuffleSoftSort

We propose ShuffleSoftSort, a novel permutation learning method that achieves both significantly improved efficiency and high sorting quality:

  • Efficiency: ShuffleSoftSort requires only $N$ parameters, a linear footprint in place of the $N^2$ parameters of prior methods.
  • Quality: It improves permutation learning by iteratively applying SoftSort to randomly shuffled elements, extending SoftSort to multidimensional data.

This makes the method ideal for large-scale tasks requiring efficient permutation learning, such as the Self-Organizing Gaussians application.
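To put the linear footprint in perspective, a quick back-of-the-envelope comparison (the value of $N$ here is illustrative, not from the paper):

```python
N = 1_000_000  # e.g., one million elements to arrange

gumbel_sinkhorn_params = N ** 2   # quadratic: a full N x N matrix
shuffle_softsort_params = N       # linear: one score per element

print(f"Gumbel-Sinkhorn: {gumbel_sinkhorn_params:,} parameters")
print(f"ShuffleSoftSort: {shuffle_softsort_params:,} parameters")
```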


🔬 Principle and Properties

Permutation Learning Overview

The core task of permutation learning is to find a permutation matrix $\mathbf{P}$ that maps an input $\mathbf{X}$ to a desired ordered output $\mathbf{X_{sort}}$.

Diagram showing the general permutation learning process where an input vector X is permuted by P to result in Y.
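As a minimal concrete example (NumPy, with a hand-written $3 \times 3$ matrix), applying a permutation matrix $\mathbf{P}$ reorders the elements of $\mathbf{X}$:

```python
import numpy as np

X = np.array([3.0, 1.0, 2.0])            # unsorted input
P = np.array([[0, 1, 0],                 # output row 0 takes input element 1
              [0, 0, 1],                 # output row 1 takes input element 2
              [1, 0, 0]], dtype=float)   # output row 2 takes input element 0

X_sort = P @ X
print(X_sort)   # [1. 2. 3.]
```

Permutation learning relaxes the hard 0/1 entries of $\mathbf{P}$ into differentiable values so that $\mathbf{P}$ can be optimized by gradient descent.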

The ShuffleSoftSort Mechanism

ShuffleSoftSort leverages the efficiency of SoftSort while extending its capabilities to multidimensional data. The image below illustrates the core idea: colors (data points) are sorted in 1D space. By calculating the loss on the reverse-shuffled output, the network is forced to learn a better global permutation that effectively refines the sorting and overcomes the local constraints of SoftSort.

A toy example demonstrating the reverse-shuffling mechanism for refining the permutation.
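The reverse-shuffle is simply the inverse of the random permutation used for shuffling; a small NumPy sketch of this bookkeeping:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 6
perm = rng.permutation(n)     # random shuffle indices
inv = np.argsort(perm)        # inverse permutation: the "reverse-shuffle"

x = np.arange(n, dtype=float)
assert np.array_equal(x[perm][inv], x)   # shuffle then reverse-shuffle is the identity
```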

Algorithm

The full iterative algorithm for ShuffleSoftSort is detailed below:

Flowchart of the ShuffleSoftSort algorithm, detailing the steps of permutation, soft sorting, and loss calculation.
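The flowchart can be read as one training step combining the pieces above. The following PyTorch sketch is our own illustrative reading of the algorithm; the function name `shuffle_softsort_step`, the descending-sort convention, and the MSE loss are assumptions, not the authors' exact implementation:

```python
import torch

def shuffle_softsort_step(s, X, X_target, tau=1.0):
    """One illustrative ShuffleSoftSort step (a sketch, not the official code):
    shuffle the N learnable scores, apply the SoftSort relaxation,
    reverse-shuffle the output, and compute the loss against the target."""
    n = s.shape[0]
    perm = torch.randperm(n)        # random shuffle
    inv = torch.argsort(perm)       # reverse-shuffle indices

    s_shuf = s[perm]
    s_sorted = s_shuf.sort(descending=True).values
    # SoftSort relaxation on the shuffled scores: (n, n) row-stochastic matrix
    P = torch.softmax(-(s_sorted.unsqueeze(1) - s_shuf.unsqueeze(0)).abs() / tau, dim=-1)

    Y = P @ X[perm]                 # soft-sorted, shuffled input
    Y = Y[inv]                      # reverse-shuffle before computing the loss
    return torch.nn.functional.mse_loss(Y, X_target)
```

Only the $N$ entries of `s` receive gradients, so optimizing `s` with any standard optimizer keeps the parameter memory linear in $N$.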

📊 Comparison of Properties

| Property | Gumbel-Sinkhorn | Low-Rank Permutation | SoftSort (1D) | ShuffleSoftSort |
|---|---|---|---|---|
| Number of parameters | $N^2$ | $2 \cdot N \cdot M$ | $N$ | $N$ |
| Non-iterative normalization | no | yes | yes | yes |
| Achievable HD sorting quality | ++ | o | $-$ | ++ |
| Permutation validity | + | o | ++ | ++ |

📚 Paper and Citation

For complete details on the method, derivation, and experimental results, please refer to our paper:

[2025/09/11] Kai Uwe Barthel, Florian Barthel, Peter Eisert: "Permutation Learning with Only N Parameters: From SoftSort to Self-Organizing Gaussians", EUSIPCO 2025.

🛠️ Installation and Setup

Example PyTorch notebooks can be found in this repository.

This project is implemented in PyTorch. You can install the necessary dependencies using pip:

```shell
# Clone the repository
git clone https://github.com/Visual-Computing/ShuffleSoftSort.git
cd ShuffleSoftSort

# Install required packages (assuming a standard requirements.txt for PyTorch)
pip install -r requirements.txt
```
