The main purpose of the script is to decompose supersymmetric tensors into their rank-1 components. It does so by defining a loss function that measures the residual between the original tensor and its reconstruction from the estimated rank-1 components, then minimizing that loss with an optimization algorithm (here, projected gradient descent).
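The loss just described can be sketched as follows (a minimal sketch with hypothetical names; the script's actual functions may differ):

```python
from functools import reduce
import numpy as np

def reconstruct(lambdas, vectors, m):
    """Rebuild the tensor as a weighted sum of m-fold outer powers of the vectors."""
    return sum(lam * reduce(np.multiply.outer, [v] * m)
               for lam, v in zip(lambdas, vectors))

def loss(T, lambdas, vectors):
    """Frobenius norm of the residual between T and its rank-1 reconstruction."""
    return np.linalg.norm(T - reconstruct(lambdas, vectors, T.ndim))
```

Minimizing this quantity over the scalars and unit vectors is what drives the decomposition.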
- Generate supersymmetric tensors randomly.
- Perform rank-1 supersymmetric tensor decomposition.
- Implement the projected gradient method to find the maximum eigenvalue and corresponding eigenvector of a tensor.
- Visualize residual norms over iteration steps.
- Accept tensor dimension, order, and optimization parameters from the command line.
The script requires the following libraries:
torch (optionally a CUDA-enabled build)
numpy
matplotlib
logging (Python standard library; no installation needed)
Ensure you have a Python environment with these packages installed. You can install them using pip:
pip install torch numpy matplotlib
Run the script directly from the command line:
python algorithm.py
Follow the on-screen prompts to input parameters such as tensor dimension, order, and optimization settings. Results will be displayed directly in the console, and plots will illustrate the change in Frobenius norms of the residuals.
Generates a random supersymmetric tensor of specified dimension and order.
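One standard construction (a sketch; the script's generator may differ) draws a random Gaussian tensor and symmetrizes it by averaging over all axis permutations:

```python
import itertools
import math
import numpy as np

def random_supersymmetric_tensor(n, m, seed=None):
    """Random order-m tensor on R^n, made supersymmetric by averaging
    over all m! permutations of its axes."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n,) * m)
    S = sum(np.transpose(A, p) for p in itertools.permutations(range(m)))
    return S / math.factorial(m)
```

The result is invariant under any permutation of its indices, which is exactly the supersymmetry property the decomposition assumes.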
Computes a rank-1 supersymmetric tensor based on a given vector.
Sums a list of rank-1 tensors each scaled by a corresponding scalar.
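The two helpers described above can be sketched as follows (hypothetical names; the script's signatures may differ):

```python
import numpy as np

def rank1_tensor(x, m):
    """Order-m rank-1 supersymmetric tensor: the m-fold outer product of x."""
    t = x
    for _ in range(m - 1):
        t = np.tensordot(t, x, axes=0)  # axes=0 gives the outer product
    return t

def weighted_sum(lambdas, tensors):
    """Sum of rank-1 tensors, each scaled by its scalar coefficient."""
    return sum(lam * t for lam, t in zip(lambdas, tensors))
```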
Performs a successive rank-1 decomposition of a supersymmetric tensor.
Applies the projected gradient method for optimization on tensor spaces (good starting ranges: alpha in [0.01, 0.1], beta in [0.4, 0.8]).
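A sketch of such a projected gradient ascent for the largest tensor eigenpair (hypothetical signatures; here `alpha` is taken as an Armijo slope parameter and `beta` as the backtracking factor, matching the suggested ranges):

```python
import numpy as np

def tensor_apply(T, x, k):
    """Contract T with x along its last k modes (k = m gives the scalar T x^m)."""
    out = T
    for _ in range(k):
        out = np.tensordot(out, x, axes=1)
    return out

def projected_gradient(T, x0, alpha=0.05, beta=0.6, max_iter=500, tol=1e-10):
    """Maximize T x^m over the unit sphere by gradient ascent with
    backtracking line search, projecting each iterate back onto the sphere."""
    m = T.ndim
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        f = tensor_apply(T, x, m)           # current objective value
        g = m * tensor_apply(T, x, m - 1)   # gradient of T x^m
        t = 1.0
        while True:
            y = x + t * g
            y /= np.linalg.norm(y)          # project back onto the unit sphere
            if tensor_apply(T, y, m) >= f + alpha * t * (g @ (y - x)) or t < 1e-12:
                break
            t *= beta                       # backtrack: shrink the step
        if np.linalg.norm(y - x) < tol:
            return tensor_apply(T, y, m), y  # (eigenvalue, eigenvector)
        x = y
    return tensor_apply(T, x, m), x
```

For an order-2 tensor (a symmetric matrix) this reduces to a power-iteration-like scheme converging to the dominant eigenpair.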
Generates random points on the unit sphere for initial guesses in optimization algorithms.
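Sampling uniformly on the unit sphere is commonly done by normalizing an isotropic Gaussian draw (a sketch):

```python
import numpy as np

def random_unit_vector(n, rng=None):
    """Uniform random point on the unit sphere S^{n-1}:
    an isotropic Gaussian sample, normalized to unit length."""
    if rng is None:
        rng = np.random.default_rng()
    x = rng.standard_normal(n)
    return x / np.linalg.norm(x)
```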
Plots the Frobenius norm of residual tensors as a function of iteration steps in the decomposition process.
Based on the paper: https://www.polyu.edu.hk/ama/staff/new/qilq/WQ.pdf
License: MIT