Showing results 1–15 of 551

Efficient Relaxations for Dense CRFs with Sparse Higher Order Potentials [article]

Thomas Joy, Alban Desmaison, Thalaiyasingam Ajanthan, Rudy Bunel, Mathieu Salzmann, Pushmeet Kohli, Philip H.S. Torr, M. Pawan Kumar
2018 arXiv   pre-print
The presented algorithms can be applied to any labelling problem using a dense CRF with sparse higher-order potentials.  ...  By modelling long-range interactions, dense CRFs provide a labelling that captures finer detail than their sparse counterparts.  ...  The primary contributions of this paper are a quadratic programming and a linear programming relaxation for minimising a dense CRF with sparse higher-order potentials.  ... 
arXiv:1805.09028v2 fatcat:mwvr6qzclbfpzfnevrwj6k7kfa

Efficient Continuous Relaxations for Dense CRF [chapter]

Alban Desmaison, Rudy Bunel, Pushmeet Kohli, Philip H. S. Torr, M. Pawan Kumar
2016 Lecture Notes in Computer Science  
Dense conditional random fields (CRFs) with Gaussian pairwise potentials have emerged as a popular framework for several computer vision applications such as stereo correspondence and semantic segmentation.  ...  In order to operationalise dense CRFs, Krähenbühl and Koltun [5] made two key observations. First, the pairwise potentials used in computer vision typically encourage smooth labelling.  ...  Discussion: Our main contributions are four efficient algorithms for the dense CRF energy minimisation problem based on QP, DC and LP relaxations.  ... 
doi:10.1007/978-3-319-46475-6_50 fatcat:yokqdi5pyjefrb4k6g52r4geoa
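For context across these entries, the objective being relaxed is the dense CRF energy in the standard Krähenbühl–Koltun form (a sketch for orientation; the symbols here are generic and not taken verbatim from the papers listed):

```latex
E(\mathbf{x}) = \sum_{i} \psi_u(x_i)
  + \sum_{i<j} \mu(x_i, x_j) \sum_{m=1}^{M} w^{(m)} k^{(m)}(\mathbf{f}_i, \mathbf{f}_j),
\qquad
k^{(m)}(\mathbf{f}_i, \mathbf{f}_j)
  = \exp\!\Big(\!-\tfrac{1}{2}(\mathbf{f}_i - \mathbf{f}_j)^\top \Lambda^{(m)} (\mathbf{f}_i - \mathbf{f}_j)\Big)
```

Here μ is a label-compatibility function (Potts in the simplest case) and the f_i are per-pixel features such as position and colour; the QP, DC and LP relaxations in the entries above replace the discrete labelling x with continuous variables over this energy.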

Linear programming-based submodular extensions for marginal estimation

Pankaj Pansari, Chris Russell, M. Pawan Kumar
2019 Computer Vision and Image Understanding  
Importantly, unlike TRW, our approach provides the first computationally tractable algorithm to compute an upper bound on the dense CRF model with higher-order Potts potentials.  ...  extension based on an LP relaxation for a higher-order diversity model.  ...  As for stereo matching (subsection 7.3), we augment the dense CRF with Gaussian pairwise potentials with our higher-order diversity model.  ... 
doi:10.1016/j.cviu.2019.102824 fatcat:nufbvh5eajaatgu5ji3hkf3jgu

A Study of Lagrangean Decompositions and Dual Ascent Solvers for Graph Matching

Paul Swoboda, Carsten Rother, Hassan Abu Alhaija, Dagmar Kainmuller, Bogdan Savchynskyy
2017 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
Our improvement over state-of-the-art is particularly visible on a new dataset with large-scale sparse problem instances containing more than 500 graph nodes each.  ...  Two leading solvers for this problem optimize the Lagrange decomposition duals with sub-gradient and dual ascent (also known as message passing) updates.  ...  The authors would like to thank Vladimir Kolmogorov for helpful discussions.  ... 
doi:10.1109/cvpr.2017.747 dblp:conf/cvpr/SwobodaRAKS17 fatcat:mxkl6et5jnhmhdeyrchvcktfbi

Efficient Continuous Relaxations for Dense CRF [article]

Alban Desmaison, Rudy Bunel, Pushmeet Kohli, Philip H.S. Torr, M. Pawan Kumar
2016 arXiv   pre-print
Dense conditional random fields (CRFs) with Gaussian pairwise potentials have emerged as a popular framework for several computer vision applications such as stereo correspondence and semantic segmentation.  ...  By modeling long-range interactions, dense CRFs provide a more detailed labelling compared to their sparse counterparts.  ...  Discussion: Our main contributions are four efficient algorithms for the dense CRF energy minimisation problem based on QP, DC and LP relaxations.  ... 
arXiv:1608.06192v1 fatcat:pfvspr5pgfbfvmhehxwxb3vbvu

On Regularized Losses for Weakly-supervised CNN Segmentation [chapter]

Meng Tang, Federico Perazzi, Abdelaziz Djelouah, Ismail Ben Ayed, Christopher Schroers, Yuri Boykov
2018 Lecture Notes in Computer Science  
To obtain such full masks the typical methods explicitly use standard regularization techniques for "shallow" segmentation, e.g. graph cuts or dense CRFs.  ...  This approach simplifies weakly-supervised training by avoiding extra MRF/CRF inference steps or layers explicitly generating full masks, while improving both the quality and efficiency of training.  ...  Alternating schemes (proposal generation) give higher dense CRF loss.  ... 
doi:10.1007/978-3-030-01270-0_31 fatcat:gatucvlifbafzkm6npsdijkq7q

Efficient Linear Programming for Dense CRFs

Thalaiyasingam Ajanthan, Alban Desmaison, Rudy Bunel, Mathieu Salzmann, Philip H. S. Torr, M. Pawan Kumar
2017 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
To alleviate this deficiency, we introduce an efficient LP minimization algorithm for dense CRFs.  ...  The fully connected conditional random field (CRF) with Gaussian pairwise potentials has proven popular and effective for multi-class semantic segmentation.  ...  Altogether, our framework constitutes the first efficient and effective minimization algorithm for dense CRFs with Gaussian pairwise potentials.  ... 
doi:10.1109/cvpr.2017.313 dblp:conf/cvpr/AjanthanDBSTK17 fatcat:fj72lzaomfe3lkdqojhmw2et64

On Regularized Losses for Weakly-supervised CNN Segmentation [article]

Meng Tang, Federico Perazzi, Abdelaziz Djelouah, Ismail Ben Ayed, Christopher Schroers, Yuri Boykov
2018 arXiv   pre-print
To obtain such full masks the typical methods explicitly use standard regularization techniques for "shallow" segmentation, e.g. graph cuts or dense CRFs.  ...  This approach simplifies weakly-supervised training by avoiding extra MRF/CRF inference steps or layers explicitly generating full masks, while improving both the quality and efficiency of training.  ...  With a dense Gaussian kernel W_pq, (4) is a relaxation of DenseCRF [24].  ... 
arXiv:1803.09569v2 fatcat:fizy3tv32jdb5cbe2opwaelyxm
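The relaxation mentioned in the last excerpt can be sketched as follows (an illustrative reconstruction, not the authors' code; the name `dense_crf_loss` and the matrices `S`, `W` are ours): for a soft segmentation S, the relaxed Potts/dense-CRF regulariser is the sum over labels l of S_l^T W (1 − S_l).

```python
import numpy as np

def dense_crf_loss(S, W):
    """Relaxed dense-CRF (Potts) regulariser on a soft segmentation.

    S: (N, L) soft label assignments, each row summing to 1.
    W: (N, N) pairwise affinity matrix (e.g. a dense Gaussian kernel).
    Returns sum_l S_l^T W (1 - S_l): the affinity-weighted (soft)
    count of label disagreements over all ordered pixel pairs.
    """
    return float(sum(S[:, l] @ W @ (1.0 - S[:, l]) for l in range(S.shape[1])))
```

On a hard (one-hot) S this reduces to the discrete Potts energy: with W the all-ones matrix minus its diagonal, the loss counts each disagreeing ordered pixel pair once.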

Beyond Gradient Descent for Regularized Segmentation Losses [article]

Dmitrii Marin and Meng Tang and Ismail Ben Ayed and Yuri Boykov
2019 arXiv   pre-print
Our loss is motivated by well-understood MRF/CRF regularization models in "shallow" segmentation and their known global solvers.  ...  The simplicity of gradient descent (GD) made it the default method for training ever-deeper and complex neural networks.  ...  Interesting future work is to investigate losses with non-Gaussian pairwise CRF potentials and higher-order segmentation regularizers, e.g.  ... 
arXiv:1809.02322v2 fatcat:c5st63ydkzhlhmvzgzhk6v7rrq

Learning Sparse High Dimensional Filters: Image Filtering, Dense CRFs and Bilateral Neural Networks

Varun Jampani, Martin Kiefel, Peter V. Gehler
2016 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
This derivation allows us to learn high-dimensional linear filters that operate in sparsely populated feature spaces. We build on the permutohedral lattice construction for efficient filtering.  ...  Further, we show how this algorithm can be used to learn the pairwise potentials in densely connected conditional random fields and apply these to different image segmentation tasks.  ...  Dense CRF: The key observation of [31] is that mean-field inference update steps in densely connected CRFs with Gaussian edge potentials require Gaussian bilateral filtering operations.  ... 
doi:10.1109/cvpr.2016.482 dblp:conf/cvpr/JampaniKG16 fatcat:gfjx6y54yzen7nunorw4b42rde
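The observation in the last excerpt — that each mean-field update amounts to Gaussian (bilateral) filtering of the current marginals — can be sketched with a brute-force O(N²) update (an illustrative sketch only; the function and variable names are ours, and the explicit N×N kernel below is exactly what permutohedral-lattice filtering avoids materialising):

```python
import numpy as np

def softmax(z):
    # Row-wise softmax, numerically stabilised
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def mean_field_step(unary, feats, w=1.0, sigma=1.0):
    """One mean-field update for a dense CRF with a single Gaussian
    pairwise kernel and Potts compatibility.

    unary: (N, L) unary potentials (costs); feats: (N, D) features.
    Brute force O(N^2): the permutohedral lattice makes this
    filtering step approximately linear in N.
    """
    d2 = ((feats[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(K, 0.0)          # pairwise sum excludes j == i
    Q = softmax(-unary)               # initialise marginals from unaries
    msg = K @ Q                       # Gaussian filtering of the marginals
    return softmax(-unary - w * msg)  # Potts compatibility + normalise
```

In practice the update is iterated to convergence, and a full model sums several such kernels (e.g. an appearance and a smoothness kernel) with learned weights.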

Learning to cluster using high order graphical models with latent variables

Nikos Komodakis
2011 2011 International Conference on Computer Vision  
To this end, it formulates clustering as a high-order energy minimization problem with latent variables, and applies a dual decomposition approach for training this model.  ...  As an additional contribution, we show how our method can be generalized to handle the training of a very broad class of important models in computer vision: arbitrary high-order latent CRFs.  ...  unary potentials u = {u_pq(·; d)} and higher-order potentials φ = {φ_pq(·), φ_p(·)} that are defined as u_pq(x_pq; d) = d_{p,q} x_pq (7), φ_pq(x_pq, x_qq) = δ(x_pq ≤ x_qq) (8), φ_p(x_p) = δ q  ... 
doi:10.1109/iccv.2011.6126227 dblp:conf/iccv/Komodakis11 fatcat:zhk4a6up4vhtfjm5oj2awdq76m

Worst-case Optimal Submodular Extensions for Marginal Estimation [article]

Pankaj Pansari, Chris Russell, M.Pawan Kumar
2018 arXiv   pre-print
class of metric labeling; and (iii) efficiently compute the marginals for the widely used dense CRF model with the help of a recently proposed Gaussian filtering method.  ...  Importantly, unlike TRW, our approach provides the first practical algorithm to compute an upper bound on the dense CRF model.  ...  For comparison, we restrict ourselves to sparse CRFs, as the code available for TRW does not scale well to dense CRFs.  ... 
arXiv:1801.06490v1 fatcat:5y2b2gx345fr5crvkijqrqxuce

Forming A Random Field via Stochastic Cliques: From Random Graphs to Fully Connected Random Fields [article]

Mohammad Javad Shafiee, Alexander Wong, Paul Fieguth
2015 arXiv   pre-print
The proposed framework allows for efficient structured inference using fully-connected random fields without any restrictions on the potential functions that can be utilized.  ...  Random fields have remained a topic of great interest over past decades for the purpose of structured inference, especially for problems such as image segmentation.  ...  potential under a Gaussian kernel is obtained in order to take advantage of permutohedral lattices for efficient inference.  ... 
arXiv:1506.09110v1 fatcat:bhajmu3r7zbhbmgaa6sksehq6u

Learning Sparse High Dimensional Filters: Image Filtering, Dense CRFs and Bilateral Neural Networks [article]

Varun Jampani and Martin Kiefel and Peter V. Gehler
2015 arXiv   pre-print
This derivation allows us to learn high-dimensional linear filters that operate in sparsely populated feature spaces. We build on the permutohedral lattice construction for efficient filtering.  ...  Further, we show how this algorithm can be used to learn the pairwise potentials in densely connected conditional random fields and apply these to different image segmentation tasks.  ...  Acknowledgements We thank Jonas Wulff, Laura Sevilla, Abhilash Srikantha, Christoph Lassner, Andreas Lehrmann, Thomas Nestmeyer, Andreas Geiger, Fatma Güney and Gerard Pons-Moll for their valuable feedback.  ... 
arXiv:1503.04949v3 fatcat:wcfekolyyvcmtar6jorisvahli

A Study of Lagrangean Decompositions and Dual Ascent Solvers for Graph Matching [article]

Paul Swoboda, Carsten Rother, Hassan Abu Alhaija, Dagmar Kainmueller, Bogdan Savchynskyy
2017 arXiv   pre-print
Our improvement over state-of-the-art is particularly visible on a new dataset with large-scale sparse problem instances containing more than 500 graph nodes each.  ...  Two leading solvers for this problem optimize the Lagrange decomposition duals with sub-gradient and dual ascent (also known as message passing) updates.  ...  For sparse graphs, the inverse representation becomes too expensive, as the inverse edge set Ē may be dense even though E is sparse in (3).  ... 
arXiv:1612.05476v2 fatcat:q7svgp37kvdtnechnefe7asil4