Font Shape-to-Impression Translation [article]

Masaya Ueda, Akisato Kimura, Seiichi Uchida
2022, arXiv preprint
Different fonts give different impressions, such as elegant, scary, and cool. This paper tackles part-based shape-impression analysis based on the Transformer architecture, whose self-attention mechanism can capture the correlations among local parts. This ability reveals how combinations of local parts realize a specific impression of a font. The versatility of the Transformer allows us to realize two very different approaches for the analysis, i.e., multi-label classification and translation. A quantitative evaluation shows that our Transformer-based approaches estimate font impressions from a set of local parts more accurately than other approaches. A qualitative evaluation then indicates the local parts that are important for a specific impression.
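The pipeline the abstract describes, i.e. self-attention over a set of local-part features followed by independent per-label impression scores, can be sketched as follows. This is a minimal illustration, not the paper's actual model: the dimensions, random weights, and mean pooling are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a set of local-part features X (n_parts, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # A[i, j] weights how much part j contributes to part i's new representation,
    # which is what lets the model relate combinations of local parts.
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return A @ V, A

d, n_parts, n_impressions = 16, 8, 5          # hypothetical sizes
X = rng.standard_normal((n_parts, d))         # stand-in local-part embeddings of one font
Wq, Wk, Wv = (0.1 * rng.standard_normal((d, d)) for _ in range(3))
Wout = 0.1 * rng.standard_normal((d, n_impressions))

H, A = self_attention(X, Wq, Wk, Wv)
pooled = H.mean(axis=0)                       # aggregate parts into one font representation
# Multi-label classification: an independent sigmoid score per impression word,
# since a font can carry several impressions at once.
scores = 1.0 / (1.0 + np.exp(-(pooled @ Wout)))
print(scores)
```

Inspecting the attention matrix `A` for a trained model of this shape is one way to ask which part combinations drive a given impression, in the spirit of the qualitative evaluation mentioned above.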
arXiv:2203.05808v2