Adaptive Label Smoothing To Regularize Large-Scale Graph Training
[article]
2021
arXiv
pre-print
To handle large-scale graphs, most of the existing methods partition the input graph into multiple sub-graphs (e.g., through node clustering) and apply batch training to save memory cost. ...
Specifically, ALS propagates node labels to aggregate the neighborhood label distribution in a pre-processing step, and then updates the optimal smoothed labels online to adapt to specific graph structure ...
However, it is non-trivial to apply label smoothing to regularize and adapt to large-scale graph training at two structural levels: the local node and the global graph. ...
arXiv:2108.13555v1
fatcat:ujyhpronczgmxfi2rc3dw7srym
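The ALS excerpt above outlines a two-step scheme: pre-compute each node's neighborhood label distribution by propagation, then mix it into the training targets. A minimal NumPy sketch of that idea, assuming a row-normalized adjacency matrix A_hat and one-hot training labels Y; the smoothing weight eps is fixed here for illustration, whereas ALS adapts it online during training:

```python
import numpy as np

def neighborhood_label_smoothing(A_hat, Y, eps=0.1, hops=2):
    """Mix one-hot labels with the aggregated neighborhood label distribution.

    A_hat : (N, N) row-normalized adjacency matrix
    Y     : (N, C) one-hot label matrix (zeros for unlabeled nodes)
    eps   : smoothing weight (fixed here; ALS updates it adaptively)
    """
    P = Y.astype(float).copy()
    for _ in range(hops):                 # pre-processing label propagation
        P = A_hat @ P
    row_sums = P.sum(axis=1, keepdims=True)
    P = np.divide(P, row_sums, out=np.zeros_like(P), where=row_sums > 0)
    return (1.0 - eps) * Y + eps * P      # smoothed training targets
```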
Interactive Multi-Label CNN Learning With Partial Labels
2020
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
We introduce a new loss function that regularizes the cross-entropy loss with a cost function that measures the smoothness of labels and features of images on the data manifold. ...
We address the problem of efficient end-to-end learning a multi-label Convolutional Neural Network (CNN) on training images with partial labels. ...
While [30] learns an adaptive graph for label propagation, it cannot generalize to novel images due to its transductive nature and cannot scale to large datasets. ...
doi:10.1109/cvpr42600.2020.00944
dblp:conf/cvpr/HuynhE20b
fatcat:ji733zx3uzdjjplk4vazuocjgu
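The loss described in this excerpt combines cross-entropy with a smoothness penalty over an image graph. A rough sketch of such a manifold-regularized loss, assuming predicted probabilities probs, a similarity matrix W over the images, and a hypothetical weight lam; the paper's actual cost function may differ in form:

```python
import numpy as np

def manifold_regularized_loss(probs, y_onehot, W, lam=0.1, eps=1e-12):
    """Cross-entropy on labeled images plus a graph-smoothness penalty (illustrative).

    probs    : (N, C) predicted class probabilities
    y_onehot : (N, C) one-hot labels for labeled images (zeros for unlabeled)
    W        : (N, N) symmetric similarity matrix of the image graph
    lam      : weight of the smoothness term (hypothetical)
    """
    n_labeled = max(int(y_onehot.sum()), 1)
    ce = -(y_onehot * np.log(probs + eps)).sum() / n_labeled
    diff = probs[:, None, :] - probs[None, :, :]           # pairwise prediction gaps
    smooth = 0.5 * (W * (diff ** 2).sum(axis=-1)).sum()    # sum_ij W_ij * ||p_i - p_j||^2 / 2
    return ce + lam * smooth
```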
Efficient Graph-Based Semi-Supervised Learning of Structured Tagging Models
2010
Conference on Empirical Methods in Natural Language Processing
The similarity graph is used during training to smooth the state posteriors on the target domain. Standard inference can be used at test time. ...
Our approach is able to scale to very large problems and yields significantly improved target domain accuracy. ...
Here we use the graph as a smoothness regularizer to train CRFs in a semisupervised manner. ...
dblp:conf/emnlp/SubramanyaPP10
fatcat:imoqrbvfv5bk3nstevm2wvop4e
Multi-class regularization parameter learning for graph cut image segmentation
2013
2013 IEEE 10th International Symposium on Biomedical Imaging
We demonstrate the performance of the approach within graph cut segmentation framework via qualitative results on chest x-rays. ...
However, these algorithms depend on parameters which need to be tuned for a meaningful solution. ...
The data term confines the segmentation labels to be close to the observed image. The smoothness term forces the algorithm to assign similar labels to the neighborhood pixels. ...
doi:10.1109/isbi.2013.6556813
dblp:conf/isbi/CandemirPA13
fatcat:buiwwsqp6re6pbfbs5p34uhrjq
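The data and smoothness terms mentioned in this excerpt make up the standard two-term graph cut energy. A small sketch that simply evaluates that energy with a Potts smoothness term, where lam plays the role of the regularization parameter the paper tunes (minimization itself would be done with a max-flow solver):

```python
import numpy as np

def graph_cut_energy(labels, unary, edges, lam=1.0):
    """Evaluate E(L) = sum_p D_p(L_p) + lam * sum_{(p,q)} [L_p != L_q].

    labels : (N,) integer label assigned to each pixel
    unary  : (N, K) data-term cost of each label at each pixel
    edges  : iterable of (p, q) neighboring-pixel index pairs
    lam    : smoothness (regularization) weight
    """
    data_term = unary[np.arange(len(labels)), labels].sum()
    smooth_term = sum(int(labels[p] != labels[q]) for p, q in edges)
    return data_term + lam * smooth_term
```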
Graph-Based Embedding Smoothing Network for Few-Shot Scene Classification of Remote Sensing Images
2022
Remote Sensing
The most popular way to solve scene classification is to train a deep neural network with a large-scale remote sensing dataset. ...
Specifically, GES-Net adopts an unsupervised non-parametric regularizer, called embedding smoothing, to regularize embedding features. ...
In a wide range of realistic scenarios, collecting large-scale remote sensing images and labeling them are quite time-consuming and painstaking tasks. ...
doi:10.3390/rs14051161
fatcat:hvxzgzeg2bblvnpppdblgogd7y
Webly Supervised Image Classification with Self-Contained Confidence
[article]
2020
arXiv
pre-print
A series of SCC-friendly regularization approaches are investigated, among which the proposed graph-enhanced mixup is the most effective method to provide high-quality confidence to enhance our framework ...
The proposed WSL framework has achieved the state-of-the-art results on two large-scale WSL datasets, WebVision-1000 and Food101-N. Code is available at https://github.com/bigvideoresearch/SCC. ...
Large-scale human-labeled data plays a vital role in deep learning-based applications such as image classification [3], scene recognition [41], face recognition [30], etc. ...
arXiv:2008.11894v1
fatcat:ntclraq52ba3dmsbguar2lnv4e
Graph Neural Network Based Attribute Auxiliary Structured Grouping for Person Re-Identification
2021
IEEE Access
the hard one-hot label into a "soft" label with smoothing regularization. ...
Considering that the "over-confidence" of an inaccurate label may be harmful to discriminative learning, we regularize the learning of the embedding model with smoothed pseudo labels (SPL) when training with ...
doi:10.1109/access.2021.3069915
fatcat:xlukop5sdjfjbioycioaynzp6q
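The smoothed pseudo labels mentioned in this excerpt follow the usual label smoothing recipe: soften each hard one-hot target toward the uniform distribution. A minimal sketch, with eps as the smoothing strength (the value used in the paper is not given here):

```python
import numpy as np

def smooth_labels(y_onehot, eps=0.1):
    """Standard label smoothing: (1 - eps) * one-hot + eps / num_classes."""
    num_classes = y_onehot.shape[-1]
    return (1.0 - eps) * y_onehot + eps / num_classes
```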
Robust and Scalable Graph-Based Semisupervised Learning
2012
Proceedings of the IEEE
Second, to support scalability to the gigantic scale (millions or billions of samples), recent solutions based on anchor graphs are reviewed. ...
It has been shown effective in propagating a limited amount of initial labels to a large amount of unlabeled data, matching the needs of many emerging applications such as image annotation and information ...
This web-scale experimental design corroborates that AGR can be well adapted to cope with web-scale data through training anchor graph models over million-scale data sets and then inductively applying ...
doi:10.1109/jproc.2012.2197809
fatcat:fk66s5zl75d35pn3rjwmhver7a
GraphHop: An Enhanced Label Propagation Method for Node Classification
[article]
2021
arXiv
pre-print
The LP method is not effective in modeling node attributes and labels jointly, and it converges slowly on large-scale graphs. GraphHop is proposed to address these shortcomings. ...
This iterative procedure exploits the neighborhood information and enables GraphHop to perform well in an extremely small label rate setting and scale well for very large graphs. ...
For GraphHop, we apply it to three large-scale graph datasets. ...
arXiv:2101.02326v1
fatcat:b35wqynunrbxjhs2hpuspxgoqi
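GraphHop is presented above as an enhancement of label propagation (LP). For reference, a minimal sketch of the classic LP baseline it improves on, assuming a row-normalized adjacency matrix A_hat and clamping of the known labels at every iteration:

```python
import numpy as np

def label_propagation(A_hat, Y, labeled_mask, n_iters=50):
    """Classic iterative label propagation (the baseline GraphHop enhances).

    A_hat        : (N, N) row-normalized adjacency matrix
    Y            : (N, C) one-hot labels for labeled nodes, zeros elsewhere
    labeled_mask : (N,) boolean mask marking the labeled nodes
    """
    F = Y.astype(float).copy()
    for _ in range(n_iters):
        F = A_hat @ F                      # push soft labels to neighbors
        F[labeled_mask] = Y[labeled_mask]  # clamp the known labels
    return F.argmax(axis=1)                # predicted class per node
```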
LEReg: Empower Graph Neural Networks with Local Energy Regularization
[article]
2022
arXiv
pre-print
With the proposed two regularization terms, GNNs are able to filter the most useful information adaptively, learn more robustly and gain higher expressiveness. ...
Existing GNNs treat all parts of the graph uniformly, which makes it difficult to adaptively pass the most informative message for each unique part. ...
It provides a simplified graph convolution framework for large-scale graph datasets with high speed. ...
arXiv:2203.10565v1
fatcat:uj5kgklmajdqpopvkl3dqyt72i
Domain adaptive semantic diffusion for large scale context-based video annotation
2009
2009 IEEE 12th International Conference on Computer Vision
This paper proposes a novel and efficient approach, named domain adaptive semantic diffusion (DASD), to exploit semantic context while considering the domain shift of context for large scale video concept ...
The adaptation provides a means to handle domain change between training and test data, which occurs very often in video annotation task. ...
For large scale video annotation which could involve simultaneous labeling of hundreds of concepts, the problem becomes worse when the unlabeled videos are from a domain different from that of the training ...
doi:10.1109/iccv.2009.5459295
dblp:conf/iccv/JiangWCN09
fatcat:vrhzwcuggzeqtcgfpixqpvbhpe
Graph Neural Networks With Lifting-based Adaptive Graph Wavelets
[article]
2022
arXiv
pre-print
However, existing SGNNs are limited in implementing graph filters with rigid transforms (e.g., graph Fourier or predefined graph wavelet transforms) and cannot adapt to signals residing on graphs and tasks ...
In this paper, we propose a novel class of graph neural networks that realizes graph filters with adaptive graph wavelets. ...
Thus, they cannot scale to large and varying-size graphs. ...
arXiv:2108.01660v3
fatcat:liunq2ozw5dxlps7s3wcc52vry
A Unified View on Graph Neural Networks as Graph Signal Denoising
[article]
2021
arXiv
pre-print
To demonstrate its promising potential, we instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes. ...
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data. ...
Ideally, for nodes with high local label smoothness, we expect the learned C to be larger, such that a higher level of local smoothness is enforced on this node during model training. ...
arXiv:2010.01777v2
fatcat:7pygu2lukbb55cgcmvahh3kaba
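The adaptive smoothness described in this excerpt fits the graph-signal-denoising view: node representations F are fit to the input features X while a Laplacian term, weighted per node, penalizes local non-smoothness. A gradient-descent sketch of that kind of objective, with a hypothetical per-node weight vector c standing in for the learned C:

```python
import numpy as np

def adaptive_graph_denoise(X, L, c, n_steps=100, lr=0.01):
    """Gradient descent on ||F - X||_F^2 + tr(F^T diag(c) L F) (illustrative).

    X : (N, D) input node features
    L : (N, N) graph Laplacian (assumed symmetric)
    c : (N,) per-node smoothness weights (larger -> smoother neighborhood)
    """
    F = X.astype(float).copy()
    C = np.diag(c)
    for _ in range(n_steps):
        grad = 2.0 * (F - X) + (C @ L + L @ C) @ F  # gradient of both terms
        F -= lr * grad
    return F
```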
Exploring Object Relation in Mean Teacher for Cross-Domain Detection
[article]
2019
arXiv
pre-print
The whole architecture is then optimized with three consistency regularizations: 1) region-level consistency to align the region-level predictions between teacher and student, 2) inter-graph consistency for matching the graph structures between teacher and student, and 3) intra-graph consistency to enhance the similarity between regions of the same class within the graph of the student. ...
Mean teacher aims to learn a smoother domain-invariant function than a model trained with no regularization (Figure 2(a)) or with only augmented labeled source data (Figure 2(b)). ...
arXiv:1904.11245v2
fatcat:ymy6aajapvfb7ohytndmtiue7e
Harmonic Unpaired Image-to-image Translation
[article]
2019
arXiv
pre-print
In this paper, we take a manifold view of the problem by introducing a smoothness term over the sample graph to attain harmonic functions to enforce consistent mappings during the translation. ...
The recent direction of unpaired image-to-image translation is on one hand very exciting as it alleviates the big burden in obtaining label-intensive pixel-to-pixel supervision, but it is on the other ...
We introduce smooth regularization over the graph for unpaired image-to-image translation to attain harmonic translations. 2. ...
arXiv:1902.09727v1
fatcat:aqgsrimnbzfyjc3vazfyghwd5i
Showing results 1 — 15 out of 26,473 results