
Partial Deformable Shape Correspondence

[Figure: Visualization of coupled functional maps]

Code submission for the "Deformable Shapes" project in the course Case Studies: Nonlinear Optimization at TUM [report] [poster]

Authors: Bodo Lipp, Niklas Lüdtke, Pietro Massetani, Binlan Wu (equal contribution)

Data

We used various data sources. You can find the publicly available datasets in the following table.

Note: Only dataloaders for TOSCA and SHREC'16 are implemented!

Data       Website
FAUST      http://faust.is.tue.mpg.de
TOSCA      https://vision.in.tum.de/data/datasets/partial
SHREC'16   https://www.dais.unive.it/~shrec2016/dataset.php
CP2P       https://github.com/pvnieo/cp2p-pfarm-benchmark/tree/master/cp2p
PFARM      https://github.com/pvnieo/cp2p-pfarm-benchmark/tree/master/pfarm

The default location for the data folders in all files & notebooks is two levels above the root code folder!
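
For orientation, a layout like the following matches that default; the dataset folder names here are illustrative, and the exact names expected by the dataloaders follow the config files:

datasets/                         # passed to the scripts as -d ../..
├── TOSCA/                        # illustrative folder name
├── shrec16/                      # illustrative folder name
└── code/
    └── shape_correspondence/     # this repository (the root code folder)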

Installation

(1) Create your Virtual Environment

conda create -n shape_correspondence
conda activate shape_correspondence

(2) Install requirements

pip install -r requirements.txt
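
To check that the environment is set up, a quick import test can be run (torch and numpy are assumed here to be among the pinned requirements):

python3 -c "import torch, numpy; print(torch.__version__, numpy.__version__)"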

pyFM

[repo]

The basic functionalities of the pyFM package are displayed in the pyfm_example_notebook file.
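
As a rough illustration of the underlying computation (not the pyFM API itself, which the notebook demonstrates), a functional map C between two shapes can be estimated by least squares from descriptor functions projected onto truncated Laplace-Beltrami eigenbases. The array names below are hypothetical:

import numpy as np

# Hypothetical inputs (not pyFM objects):
#   evects1: (n1, k1) Laplace-Beltrami eigenvectors of the source shape
#   evects2: (n2, k2) eigenvectors of the target shape
#   A1, A2:  (n, n)   vertex-area (mass) matrices
#   descr1:  (n1, p)  descriptor functions on the source shape
#   descr2:  (n2, p)  corresponding descriptors on the target shape
def fit_functional_map(evects1, evects2, A1, A2, descr1, descr2):
    a = evects1.T @ (A1 @ descr1)          # (k1, p) spectral coefficients, source
    b = evects2.T @ (A2 @ descr2)          # (k2, p) spectral coefficients, target
    # Least-squares functional map: C = argmin_C ||C a - b||_F^2
    C_t, *_ = np.linalg.lstsq(a.T, b.T, rcond=None)
    return C_t.T                           # (k2, k1) functional map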

DPFM

[repo] [paper]

Implementation of the DPFM network to train descriptors. Standard DPFM models can be trained and evaluated with two simple commands, shown in the following sections.

Unsupervised variant of DPFM

We also implemented an unsupervised version of DPFM, which only uses the inherent structure of the descriptors and functional maps in the loss and does not need labelled training data.
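
The exact loss is described in the report; purely as an illustration of the kind of structural penalties such an unsupervised setup can use, the sketch below penalizes deviations of the predicted functional maps from orthogonality and bijectivity (PyTorch, hypothetical tensor names):

import torch

def structural_fmap_penalties(C12, C21, w_orth=1.0, w_bij=1.0):
    # C12, C21: (k, k) functional maps predicted in both directions
    k = C12.shape[-1]
    I = torch.eye(k, device=C12.device, dtype=C12.dtype)
    # Orthogonality: area-preserving maps have orthonormal matrix representations
    loss_orth = ((C12 @ C12.transpose(-1, -2) - I) ** 2).sum()
    # Bijectivity: composing both directions should approximate the identity
    loss_bij = ((C12 @ C21 - I) ** 2).sum() + ((C21 @ C12 - I) ** 2).sum()
    return w_orth * loss_orth + w_bij * loss_bij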

DPCFM

There are two versions available. Their structures are shown in the following diagrams (more details can be found in the report). [Figure: Structure of DPCFM V1] [Figure: Structure of DPCFM V2]

Our proposed DPCFM structure is implemented on top of the DPFM code. We provide training and evaluation scripts, just like for the other models.

Training

All of the models mentioned above can be trained using the run_training script. To choose the model, use the -v flag with one of the following:

  • dpfm
  • unsupervised
  • dpcfm1
  • dpcfm2

Other flags exist to set the paths to the data, config file, etc. Use -h to see all available options.

Example:

python3 run_training.py -d ../.. -c project/config/tosca_cuts.yaml

The state dicts from the PyTorch models are saved in

project/data/{cfg.dataset.model_name}

They are saved every 5 epochs; this behaviour can be changed in the config file.
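
A saved state dict can be restored with standard PyTorch calls; the checkpoint file name below is only a placeholder:

import torch

# The save folder is project/data/{cfg.dataset.model_name}; the file name here is hypothetical
state_dict = torch.load("project/data/model_name/checkpoint.pth", map_location="cpu")
# An instantiated network of the matching architecture can then restore the weights:
# model.load_state_dict(state_dict)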

Evaluation

Trained models can be evaluated with the run_eval script. Again, choose the correct model with -v and pass the respective paths via the other flags. Use -h to see all available options.

Example:

python3 run_eval.py  -d ../.. -c project/config/tosca_cuts.yaml -mp project/models/dpcfm.pth -m CFM

This script creates evaluation files in which the evaluation results are stored. They can later be loaded via torch.load(file_path), so the evaluation script does not have to be run again. The files are stored in the folder

data/eval
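
Reloading a stored evaluation file only needs the torch.load call mentioned above; the file name here is a placeholder:

import torch

results = torch.load("data/eval/example_eval_file.pt")   # hypothetical file name
print(results)   # inspect the stored results without re-running run_eval.py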

We provide pretrained models that can be used for evaluation and for recreating the experiments shown in the report. The state dicts can be found in the folder

project/models

Note: For evaluating DPCFM models with coupled functional maps, use the dpfm model, i.e. -v dpfm

Notebooks

We created several notebooks to showcase the models and to create figures seen in the report.

Acknowledgements

This work is based significantly on DPFM (Copyright (c) 2021 Souhaib Attaiki, Gautam Pai and Maks Ovsjanikov) and pyFM (Robin Magnet).

A special thanks to Lennart Bastian, Fabian Schaipp and Mahdi Saleh for their helpful discussions and comments.

This work is distributed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.
