A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL. The file type is application/pdf.
Avatar-Net: Multi-scale Zero-shot Style Transfer by Feature Decoration
[article]
2018
arXiv
pre-print
By embedding this module into an image reconstruction network that fuses multi-scale style abstractions, the Avatar-Net renders multi-scale stylization for any style image in one feed-forward pass. ...
In this paper, we resolve this dilemma and propose an efficient yet effective Avatar-Net that enables visually plausible multi-scale transfer for arbitrary style. ...
Multi-scale Zero-shot Style Transfer The proposed Avatar-Net employs an hourglass network with multi-scale style adaptation modules that progressively fuse the styles from the encoded features into the ...
arXiv:1805.03857v2
fatcat:gla2wylxmjerncjncvhcvcl4x4
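The snippets above describe Avatar-Net's "style decorator": content features are re-created from semantically matched style features after normalization, and the style statistics are then re-applied. A minimal NumPy sketch of that idea, assuming 1×1 patches and simple per-channel mean/std normalization (the paper itself uses larger patches and a whitening transform), might look like:

```python
import numpy as np

def style_decorator(content, style, eps=1e-5):
    """Toy 1x1-patch sketch of a patch-based style decorator.

    content, style: feature maps of shape (C, H, W). Hypothetical
    simplifications: per-channel normalization instead of whitening,
    and 1x1 patches instead of k x k patches.
    """
    def normalize(f):
        mean = f.mean(axis=(1, 2), keepdims=True)
        std = f.std(axis=(1, 2), keepdims=True) + eps
        return (f - mean) / std, mean, std

    cn, _, _ = normalize(content)
    sn, s_mean, s_std = normalize(style)

    C, H, W = content.shape
    c = cn.reshape(C, -1)                  # (C, Hc*Wc) content vectors
    s = sn.reshape(C, -1)                  # (C, Hs*Ws) style vectors
    c_unit = c / (np.linalg.norm(c, axis=0, keepdims=True) + eps)
    s_unit = s / (np.linalg.norm(s, axis=0, keepdims=True) + eps)
    # For each content location, pick the most similar style location.
    match = (s_unit.T @ c_unit).argmax(axis=0)
    swapped = s[:, match].reshape(C, H, W)

    # Re-apply the style's per-channel statistics to the swapped features.
    return swapped * s_std + s_mean
```

Fusing such a module at several scales of a decoder is what the abstract calls multi-scale stylization in one feed-forward pass.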
Attention-Aware Multi-Stroke Style Transfer
2019
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
By performing multi-scale style swap on content features and style features, we produce multiple feature maps reflecting different stroke patterns. ...
In this paper, we tackle these limitations by developing an attention-aware multi-stroke style transfer model. ...
This work was supported by the Natural Science Foundation of China (61725204). ...
doi:10.1109/cvpr.2019.00156
dblp:conf/cvpr/YaoRX0LW19
fatcat:xdtqe4gdtzgipft2sringb7s5u
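The "multi-scale style swap" mentioned in this abstract replaces each content feature patch with its best-matching style patch; running it with several patch sizes yields the multiple feature maps reflecting different stroke patterns. A minimal NumPy sketch (shapes and the naive overlap-averaging reconstruction are assumptions, not the paper's implementation):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def style_swap(content, style, k=3, eps=1e-8):
    """Replace each k x k content patch with the style patch that has the
    highest normalized cross-correlation, then rebuild the feature map by
    averaging overlapping patches. content, style: (C, H, W) arrays."""
    C, Hc, Wc = content.shape
    cp = sliding_window_view(content, (k, k), axis=(1, 2))
    sp = sliding_window_view(style, (k, k), axis=(1, 2))
    nc = (Hc - k + 1) * (Wc - k + 1)
    cpf = cp.transpose(1, 2, 0, 3, 4).reshape(nc, -1)        # (Nc, C*k*k)
    spf = sp.transpose(1, 2, 0, 3, 4).reshape(-1, C * k * k)  # (Ns, C*k*k)
    s_unit = spf / (np.linalg.norm(spf, axis=1, keepdims=True) + eps)
    best = (cpf @ s_unit.T).argmax(axis=1)  # NCC match per content patch

    out = np.zeros_like(content)
    cnt = np.zeros((1, Hc, Wc))
    patches = spf[best].reshape(nc, C, k, k)
    idx = 0
    for i in range(Hc - k + 1):
        for j in range(Wc - k + 1):
            out[:, i:i + k, j:j + k] += patches[idx]
            cnt[:, i:i + k, j:j + k] += 1
            idx += 1
    return out / cnt
```

Calling this with, say, k in {1, 3, 5} on the same feature pair produces the multiple swapped maps that a stroke-fusion module could then blend.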
Consistent Video Style Transfer via Compound Regularization
2020
Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
Combining with the new cost formula, we design a zero-shot video style transfer framework. ...
Recently, neural style transfer has drawn much attention and significant progress has been made, especially for image style transfer. ...
Experimental results demonstrate the superiority of the proposed framework. ... migrated features by patch swapping. Avatar-Net (Sheng et al. 2018) adopted a style-swap based style decorator. ...
doi:10.1609/aaai.v34i07.6905
fatcat:zqascdgmrfhtzb4cmjtbj5lvce
Attention-aware Multi-stroke Style Transfer
[article]
2019
arXiv
pre-print
By performing multi-scale style swap on content features and style features, we produce multiple feature maps reflecting different stroke patterns. ...
In this paper, we tackle these limitations by developing an attention-aware multi-stroke style transfer model. ...
Avatar-Net [19] elevates the feature transfer ability by matching the normalized counterpart features and applying a patch-based style decorator. ...
arXiv:1901.05127v1
fatcat:z2oeazoq4na6rky6hk7lnhule4
ETNet: Error Transition Network for Arbitrary Style Transfer
[article]
2019
arXiv
pre-print
Numerous valuable efforts have been devoted to achieving arbitrary style transfer since the seminal work of Gatys et al. ...
For each refinement, we transit the error features across both the spatial and scale domain and invert the processed features into a residual image, with a network we call Error Transition Network (ETNet ...
Avatar-Net [24] addresses the style distortion issue by introducing a feature decorator module. ...
arXiv:1910.12056v2
fatcat:ozu2ykbnmbgcbkvtxpxmj4x4im
Two-Stage Peer-Regularized Feature Recombination for Arbitrary Image Style Transfer
[article]
2020
arXiv
pre-print
The proposed solution produces high-quality images even in the zero-shot setting and allows for more freedom in changes to the content geometry. ...
This paper introduces a neural style transfer model to generate a stylized image conditioning on a set of examples describing the desired style. ...
More recently, Avatar-Net [33] proposed the use of a "style decorator" to re-create content features by semantically aligning input style features with those derived from the style image. ...
arXiv:1906.02913v3
fatcat:kyw3kvs7qjd7dcmmprsj2tphge
StyleRemix: An Interpretable Representation for Neural Image Style Transfer
[article]
2019
arXiv
pre-print
Multi-Style Transfer (MST) intends to capture the high-level visual vocabulary of different styles and express these vocabularies in a joint model to transfer each specific style. ...
By decomposing diverse styles into the same basis, StyleRemix represents a specific style in a continuous vector space with 1-dimensional coefficients. ...
Avatar-net: Multi-scale zero-shot style transfer by feature decoration. ...
arXiv:1902.10425v3
fatcat:4ibtkxaeenhxnhcvmqs4hbefra
Sketch-based Facial Synthesis: A New Challenge
[article]
2022
arXiv
pre-print
Second, we present the largest-scale FSS study by reviewing 139 classical methods, including 24 handcrafted feature-based facial sketch synthesis approaches, 37 general neural-style transfer methods, 43 ...
With only two straightforward components, i.e., facial-aware masking and style-vector expansion, FSGAN surpasses the performance of all previous state-of-the-art models on the proposed FS2K dataset by ...
[158] proposed an Avatar-Net that enables multi-scale transfer for any style. The key innovation is a "style decorator" that semantically aligns the content and style features. ...
arXiv:2112.15439v3
fatcat:fl7v2gqdmrc7tc3epdat4xeama
Arbitrary Font Generation by Encoder Learning of Disentangled Features
2022
Sensors
In our experiments, we showed that our method extracts consistent features of text content and font style by separating the content and style encoders, and that this works well for generating unseen font design ...
Second, we propose new consistency losses that force any combination of encoded features of the stacked inputs to have the same values. ...
In this decoding process, as in the multi-level style transfer of Avatar-net [29], the content feature map x is transformed by the i-th style feature map y_i in the i-th AdaIN layer, where the mean and standard ...
doi:10.3390/s22062374
pmid:35336547
pmcid:PMC8950682
fatcat:37c6do4pzfc5zc3aupyw6mzmia
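The AdaIN transform referenced in this abstract replaces the per-channel mean and standard deviation of the content features with those of the style features. A minimal NumPy sketch (the (C, H, W) layout is an assumption for illustration):

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization: normalize the content features
    per channel, then re-scale/shift them with the style's per-channel
    statistics. content, style: arrays of shape (C, H, W)."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True)
    return s_std * (content - c_mean) / (c_std + eps) + s_mean
```

Applying such a layer at each decoder level, with a style feature map y_i per level, gives the multi-level transfer the snippet describes.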