
A Comprehensive Overhaul of Feature Distillation [article]

Byeongho Heo, Jeesoo Kim, Sangdoo Yun, Hyojin Park, Nojun Kwak, Jin Young Choi
2019 arXiv pre-print
We investigate the design aspects of feature distillation methods achieving network compression and propose a novel feature distillation method in which the distillation loss is designed to make a synergy ... Our proposed distillation loss includes a feature transform with a newly designed margin ReLU, a new distillation feature position, and a partial L2 distance function to skip redundant information giving ... Acknowledgement: We appreciate the support of Clova AI members, especially Nigel Fernandez for proofreading the manuscript, Dongyoon Han for providing help on implementation, and Jung-Woo Ha for insightful comments ...
arXiv:1904.01866v2 fatcat:r7yelxbwunezro4rqzfcedsr4e
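
The two named ingredients of this entry's loss are concrete enough to sketch. Below is a minimal PyTorch sketch (not the authors' code) of the margin ReLU transform and the partial L2 distance described in the snippet; the per-channel margin tensor is assumed to be precomputed (the paper derives it from the teacher's batch-norm statistics), and the paper additionally passes the student feature through a learned 1x1 regressor before the distance is taken.

```python
import torch

def margin_relu(teacher_feat: torch.Tensor, margin: torch.Tensor) -> torch.Tensor:
    """Margin ReLU: keep positive teacher responses as-is, but clamp
    negative ones up to a (negative) per-channel margin instead of zero."""
    # margin: shape (C,), broadcast over feature maps of shape (N, C, H, W)
    return torch.max(teacher_feat, margin.view(1, -1, 1, 1))

def partial_l2(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    """Partial L2 distance: skip positions where the teacher response is
    non-positive and the student is already below it, so redundant
    negative teacher information is not forced onto the student."""
    sq_err = (student_feat - teacher_feat) ** 2
    skip = (teacher_feat <= 0) & (student_feat <= teacher_feat)
    return torch.where(skip, torch.zeros_like(sq_err), sq_err).sum()
```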

The State of Knowledge Distillation for Classification [article]

Fabian Ruffy, Karanbir Chahal
2019 arXiv pre-print
This is especially apparent with methods using some form of feature distillation. ... We survey various knowledge distillation (KD) strategies for simple classification tasks and implement a set of techniques that claim state-of-the-art accuracy. ... We plan to explore a combination of pruning and feature distillation and understand how to identify the parametric modelling limits of a neural network in future work. ...
arXiv:1912.10850v1 fatcat:qiji2n2b6bc5bktdbbzg3jm5su
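
As context for the strategies this survey compares, the canonical soft-target KD loss of Hinton et al. (the baseline most surveyed methods extend) looks roughly like the sketch below; the temperature and mixing weight are illustrative values, not figures taken from the survey.

```python
import torch
import torch.nn.functional as F

def soft_target_kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-target KD: KL divergence between temperature-softened
    distributions, mixed with the ordinary cross-entropy on hard labels."""
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients after temperature softening
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```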

Matching Guided Distillation [article]

Kaiyu Yue, Jiangfan Deng, Feng Zhou
2020 arXiv pre-print
Unfortunately, there is a common obstacle: the gap in semantic feature structure between the intermediate features of teacher and student. ... We compare three solutions of the assignment problem to reduce channels from teacher features with partial distillation loss. ... In a nutshell, the problem of feature distillation seeks the optimum student network $f_S$ that minimizes the loss of the main task together with a feature discrepancy penalty: $\mathcal{L} = \mathcal{L}_{\text{task}} + \gamma\,\mathcal{L}_{\text{distill}}$ ...
arXiv:2008.09958v2 fatcat:qk5yxvmotnfkvek2lb7nf2a56e
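
The objective quoted in the entry above is the generic feature-distillation template. A minimal sketch of it follows, using plain mean-squared error as the discrepancy penalty for illustration; MGD itself first matches teacher channels to student channels by solving an assignment problem, which is what removes the assumption (made here) that both feature maps share a shape.

```python
import torch

def total_loss(task_loss: torch.Tensor,
               student_feat: torch.Tensor,
               teacher_feat: torch.Tensor,
               gamma: float = 1.0) -> torch.Tensor:
    """Generic feature-distillation objective: L = L_task + gamma * L_distill.
    The MSE discrepancy here is a stand-in, not MGD's matched-channel loss."""
    distill_loss = torch.mean((student_feat - teacher_feat) ** 2)
    return task_loss + gamma * distill_loss
```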

Applying Principles of Public Administration

Otto K. Engelke, Herman Finer
1954 Public Administration Review (PAR)
Perhaps the most notable feature of this book is that it represents an effort at applied public administration. ... Finer, who has previously made notable contributions in the distillation of administrative doctrines, now turns his attention to the development of administrative guidelines for use by a sizable segment ...
doi:10.2307/972970 fatcat:5cjm77c5u5dlzi2wybqgo74s5m

Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation [article]

Sumanth Chennupati, Mohammad Mahdi Kamani, Zhongwei Cheng, Lin Chen
2021 arXiv pre-print
Knowledge distillation is becoming one of the primary trends among neural network compression algorithms to improve the generalization performance of a smaller student model with guidance from a larger ... Despite this advancement in different techniques for distilling the knowledge, the aggregation of different paths for distillation has not been studied comprehensively. ... In addition to the Attention Transfer (AT) [34] and Soft Target (ST) [22] distillation methods, we use recent and advanced distillation methods such as Overhaul of Feature Distillation (OFD) [21], Feature ...
arXiv:2110.09674v2 fatcat:h7kjo57luvhphpykysp4balpwu
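
One way to read "aggregating knowledge from multiple paths" is as a weighted combination of several per-method distillation losses. The sketch below is a hypothetical illustration, not the paper's algorithm: the softmax-normalized learnable weights are an assumption made here for concreteness.

```python
import torch
import torch.nn as nn

class AdaptiveLossAggregator(nn.Module):
    """Hypothetical sketch: combine several distillation losses (e.g., from
    AT, ST, OFD paths) with learnable non-negative weights, letting the
    student adapt each path's contribution during training."""
    def __init__(self, num_paths: int):
        super().__init__()
        self.log_weights = nn.Parameter(torch.zeros(num_paths))

    def forward(self, losses):  # losses: list of scalar tensors
        weights = torch.softmax(self.log_weights, dim=0)
        return sum(w * l for w, l in zip(weights, losses))
```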

Page 111 of Technical Book Review Index Vol. 18, Issue 6 [page]

1952 Technical Book Review Index  
and a new chapter on fractional distillation has been added.” ... A valuable feature... is the large selection of illustrative problems.” William Rarita, Science, April 18, 1952, p. 425, 1 col.

Page 300 of Chemical Engineering Vol. 52, Issue 11 [page]

1945 Chemical Engineering  
Hercules Powder Co., Wilmington, Del. 6-page booklet featuring cellulose acetate and including comprehensive text on the manufacture of this product together with its chemical and physical properties. The various ... Catalog No. 35-A: 32-page booklet describing the steel sectional water tube boilers built by this company. Features of design together with illustrations of a large number of applications and installations ...

Distilling Object Detectors with Feature Richness [article]

Zhixing Du, Rui Zhang, Ming Chang, Xishan Zhang, Shaoli Liu, Tianshi Chen, Yunji Chen
2021 arXiv pre-print
However, most of the existing distillation-based detection methods mainly imitate features near bounding boxes, which suffers from two limitations. ... As a model compression and acceleration method, knowledge distillation effectively improves the performance of small models by transferring the dark knowledge from the teacher detector. ... A comprehensive overhaul of feature distillation. ...
arXiv:2111.00674v4 fatcat:7zxjl2cvhvgurave3qd5ngrjcm
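
The snippet contrasts bounding-box feature imitation with a "feature richness" criterion. Below is a hedged sketch of the general pattern of detector feature distillation, where a soft saliency map weights the per-location L2 distance instead of a hard box mask; using the per-location maximum of the teacher's class probabilities as that map is an assumption made here for illustration, not necessarily the paper's exact formulation.

```python
import torch

def masked_imitation_loss(student_feat, teacher_feat, cls_scores):
    """Feature imitation weighted by a soft 'richness' map rather than
    a hard bounding-box mask.

    cls_scores: (N, num_classes, H, W) teacher classification probabilities.
    """
    richness, _ = cls_scores.max(dim=1, keepdim=True)   # (N, 1, H, W)
    sq_err = (student_feat - teacher_feat) ** 2         # (N, C, H, W)
    return (richness * sq_err).sum() / richness.sum().clamp(min=1e-6)
```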

Channel-wise Knowledge Distillation for Dense Prediction [article]

Changyong Shu, Yifan Liu, Jianfei Gao, Zheng Yan, Chunhua Shen
2021 arXiv pre-print
To this end, we first transform the feature map of each channel into a probability map using softmax normalization, and then minimize the Kullback-Leibler (KL) divergence of the corresponding channels of ... Knowledge distillation (KD) has been proven to be a simple and effective tool for training compact models. ... A comprehensive overhaul of feature distillation. In Int. Conf. Comput. Vis., pages 1921-1930, 2019. [16] Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. Distilling the knowledge in a neural network. ...
arXiv:2011.13256v4 fatcat:dyflddtr5jas5mliowjwudqoom
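
The abstract describes the mechanism precisely enough to sketch: softmax-normalize each channel's spatial activations into a probability map, then take the KL divergence between corresponding teacher and student channels. A minimal PyTorch sketch follows; the temperature parameter is an assumption for illustration.

```python
import torch
import torch.nn.functional as F

def channel_wise_kd(student_feat, teacher_feat, T=4.0):
    """Channel-wise KD: per-channel softmax over spatial locations turns
    each channel into a probability map; the loss is the KL divergence
    between corresponding student and teacher channels."""
    n, c, h, w = student_feat.shape
    s = F.log_softmax(student_feat.view(n, c, -1) / T, dim=2)
    t = F.softmax(teacher_feat.view(n, c, -1) / T, dim=2)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)
```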

Interpretable Embedding Procedure Knowledge Transfer via Stacked Principal Component Analysis and Graph Neural Network [article]

Seunghyun Lee, Byung Cheol Song
2021 arXiv pre-print
This paper proposes a method of generating interpretable embedding procedure (IEP) knowledge based on principal component analysis, and distilling it based on a message passing neural network. ... Knowledge distillation (KD) is one of the most useful techniques for light-weight neural networks. ... (Heo et al. 2018), relational knowledge distillation (RKD), and comprehensive overhaul (CO) (Heo et al. 2019) were adopted for comparison with the proposed method. ...
arXiv:2104.13561v1 fatcat:o4kzvnwjovgo5n6xqzwjtamkbq

Management of power transformers reliability by technologies of control on resource characteristics of liquid dielectric

Svetlana Vysogorets, Alexandr Nazarychev, Ilia Gorec, Alexey Tadjibaev, N. Voropai, S. Senderov, A. Michalevich, H. Guliev
2018 E3S Web of Conferences  
The relationship between the resource of a power transformer and the resource characteristics of a liquid dielectric is shown. ... For the first time, a "method of experimental determination of the liquid dielectric resource and measures for its restoration" was developed. ... Transformer oils are a complex multicomponent system obtained by purifying petroleum distillates at a certain temperature. ...
doi:10.1051/e3sconf/20185802007 fatcat:75xtgdcq6zeojce7xlraa4mhdq

An Essential Introductory Text

Amanda Norvell
2004 Cell Biology Education  
For those familiar with the previous edition of ECB, the new version is not simply an update, but rather an overhaul of the first edition. ... FLOW AND AUDIENCE: By distilling the foundational concepts and methods of modern molecular and cellular biology, while making minimal assumptions about a student's previous background in biology, ECB serves ... MBoC is a sophisticated and comprehensive extension of ECB, and I anticipate that the familiar feel of the writing and design of MBoC will mean a smooth transition for those students who had ECB as their ...
doi:10.1187/cbe.04-07-0052 pmcid:PMC533123 fatcat:fqcvy2wjlbh7hnrkbtlbashqfm

Progress in Chemical Engineering

J. F. RICHARDSON
1957 Nature  
Two of the attractive features of the book occur at the end of each chapter, where the comprehensive lists of references will delight the serious student of the subject. ... This does not mean that the book has merely been expanded; it has had a proper overhaul and has, in fact, been shortened a little. ...
doi:10.1038/179503b0 fatcat:bhrdydbp5zhbhlhoks2n4p4i5y

Improving the Energy Efficiency of Petrochemical Plant Operations: A Measurement and Verification Case Study Using a Balanced Wave Optimizer

Man Hin Eve Chan, Kar-Kit Chu, Hin-Fung Chow, Chi-Wing Tsang, Chi Kuen Danny Ho, Shuk-Kei Ho
2019 Energies  
Other uses of water pumps in these industries include producing steam for heating, preparing reaction media or absorptive reagents, rinsing products, and distilling. ... An average energy saving of about 10.46% was recorded over a 5-week reporting period. ... The pumps stopped for a complete overhaul interval of about 42 days just after 10 weeks of BWT operation had started. ...
doi:10.3390/en12214136 fatcat:s6457japyjf6vdxmfadly5jxta

CHARACTERISTICS OF URBAN PARK TREES IN HONG KONG IN RELATION TO GREENSPACE PLANNING AND DEVELOPMENT

C.Y. Jim
2004 Acta Horticulturae  
Trees in ten major urban parks in the city core were comprehensively surveyed. ... Seventeen specific recommendations have been proposed to improve the quantity and quality of the park tree population to meet modern and changing demands of the city. ... A systematic survey of the park trees could furnish the baseline information to prepare a comprehensive overhaul plan. ...
doi:10.17660/actahortic.2004.643.14 fatcat:dm5eh44bjngvbg2myg3urm5phi