- 2026-02-26 发布于山东
- 举报
Optimization on Graphs Part II: Technical Models
Classical Solution to the Graph Matching Problem

The graph matching (GM) problem is NP-hard: it seeks a node correspondence between two graphs that maximizes node-node and edge-edge similarity.

Classical GM pipeline: SIFT feature extractor → compute node, edge similarity → classical algorithm → approximate solution → matching result.

Limitations of the classical pipeline: SIFT has limited representation capability; the similarity is computed by a fixed method, e.g. a Gaussian kernel function, which limits capacity; and the classical algorithm itself has limited performance.
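The fixed-similarity step above can be sketched in plain numpy. This is an illustrative example, assuming SIFT descriptors have already been extracted and using a Gaussian (RBF) kernel with an arbitrary bandwidth sigma:

```python
import numpy as np

def gaussian_kernel_similarity(F1, F2, sigma=1.0):
    """Node-node similarity between two graphs from fixed descriptors.

    F1: (n1, d) descriptors of graph 1 (e.g. SIFT features)
    F2: (n2, d) descriptors of graph 2
    Returns an (n1, n2) matrix K with
    K[i, j] = exp(-||f1_i - f2_j||^2 / (2 sigma^2)).
    """
    # Squared Euclidean distances between all descriptor pairs, via broadcasting.
    d2 = ((F1[:, None, :] - F2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
F1, F2 = rng.random((4, 8)), rng.random((5, 8))
K = gaussian_kernel_similarity(F1, F2)
print(K.shape)  # (4, 5)
```

The kernel and its bandwidth are fixed by hand, which is exactly the capacity limitation the slide points out: no part of this similarity is learned from data.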
Learning GM by a Graph Embedding Model (ICCV19 / TPAMI20)

Classical GM pipeline: SIFT feature extractor → compute node, edge similarity → classical algorithm → approximate solution → matching result.

Deep graph embedding GM pipeline: CNN feature extractor → intra- + cross-graph GNN embedding → Sinkhorn algorithm → matching result. The node similarity is learned, and the whole pipeline is differentiable and trained end-to-end.

[1] Combinatorial Learning of Robust Deep Graph Matching: an Embedding based Approach, TPAMI 2020
[2] Learning Combinatorial Embedding Networks for Deep Graph Matching, ICCV 2019
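The learned-similarity step can be contrasted with the fixed kernel above. A minimal sketch, assuming node embeddings X1 and X2 have been produced by the CNN + GNN stages (plain random arrays stand in for them here), with similarity as a simple inner product rather than a hand-designed kernel:

```python
import numpy as np

def learned_node_similarity(X1, X2):
    """Node-node similarity as an inner product of learned embeddings.

    X1: (n1, d) node embeddings of graph 1
    X2: (n2, d) node embeddings of graph 2
    In the deep pipeline these embeddings come from trainable networks, so
    gradients flow through this product end-to-end.
    """
    return X1 @ X2.T  # (n1, n2) similarity matrix, later fed to Sinkhorn

rng = np.random.default_rng(0)
X1, X2 = rng.random((3, 16)), rng.random((4, 16))
S = learned_node_similarity(X1, X2)
print(S.shape)  # (3, 4)
```

Because the similarity is just a differentiable function of the embeddings, training the feature extractor reshapes the similarity itself, which is what the slide means by "learn node similarity".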
Learning GM by a Graph Embedding Model (ICCV19 / TPAMI20)

Intra-graph node embedding: a Graph Convolutional Network; the update step iterates over all nodes.

Cross-graph node embedding: calculate similarity → compute cross-graph weights by Sinkhorn.

Advantages of graph embedding: it includes graph structure information, compared with node matching; and it reduces complexity (NP-hard → solvable in O(N³)), compared with graph matching.
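The intra-graph update step can be sketched as one round of message passing. This is a minimal illustrative version, assuming mean aggregation over neighbors and an arbitrary weight matrix W (real GCN variants differ in normalization and parameterization):

```python
import numpy as np

def gcn_update(A, X, W):
    """One intra-graph embedding update over all nodes.

    A: (n, n) adjacency matrix
    X: (n, d_in) node features
    W: (d_in, d_out) trainable weights
    Each node averages its neighbors' features, applies a linear map,
    then a ReLU nonlinearity.
    """
    deg = A.sum(1, keepdims=True)
    deg[deg == 0] = 1.0               # avoid division by zero for isolated nodes
    H = (A / deg) @ X                 # mean aggregation over neighbors
    return np.maximum(H @ W, 0.0)     # linear map + ReLU

A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
X = np.eye(3)                         # one-hot initial features
W = np.ones((3, 2))
print(gcn_update(A, X, W))            # each row sums its neighbors' features
```

Stacking several such updates lets each node's embedding absorb multi-hop graph structure, which is the structural information the slide credits to graph embedding.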
Sinkhorn: a differentiable (approximate) linear assignment algorithm.

How to invoke: pip install pygmtools (numpy, pytorch, paddle and jittor are already supported; tensorflow and mindspore support is planned).

import numpy as np
import torch
import pygmtools as pygm
pygm.BACKEND = 'pytorch'
np.random.seed(0)
# 2-dimensional (non-batched) input
s_2d = torch.from_numpy(np.random.rand(5, 5))
s_2d
tensor([[0.5488, 0.7152, 0.6028, 0.5449, 0.4237],
        [0.6459, 0.4376, 0.8918, 0.9637, 0.3834],
        [0.7917, 0.5289, 0.5680, 0.9256, 0.0710],
        [0.0871, 0.020…
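What the Sinkhorn step computes can be sketched in plain numpy: exponentiate the similarity matrix with a temperature, then alternately normalize rows and columns until the result is (approximately) doubly stochastic. This is an illustrative re-implementation, not the pygmtools code; tau=1.0 and the iteration count are arbitrary choices here:

```python
import numpy as np

def sinkhorn(s, tau=1.0, n_iters=100):
    """Project a similarity matrix onto (nearly) doubly-stochastic matrices.

    s: (n, n) similarity scores
    tau: temperature; smaller tau pushes the output closer to a hard
         permutation matrix, at the cost of slower convergence.
    Every operation is differentiable, so gradients can flow back to s.
    """
    x = np.exp(s / tau)
    for _ in range(n_iters):
        x = x / x.sum(1, keepdims=True)   # normalize rows
        x = x / x.sum(0, keepdims=True)   # normalize columns
    return x

np.random.seed(0)
s_2d = np.random.rand(5, 5)
x = sinkhorn(s_2d)
print(x.sum(0), x.sum(1))  # both close to all-ones
```

The iteration solves an entropy-regularized relaxation of linear assignment, which is why the result is differentiable: the hard, non-differentiable argmax of classical assignment solvers is replaced by smooth normalizations.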