Addressing the Under-Translation Problem from the Entropy Perspective
2019
PROCEEDINGS OF THE THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE
In the fine-grained phase, we propose three methods, namely a pre-training method, a multitask method and a two-pass method, to encourage the neural model to correctly translate these high-entropy words. ...
Through analysis, we observe that a source word with a large translation entropy is more inclined to be dropped. To address this problem, we propose a coarse-to-fine framework. ...
If the translation entropy of a source word s exceeds the predefined threshold e₀, i.e., E(s) > e₀, we treat this word as a high-entropy word. ...
doi:10.1609/aaai.v33i01.3301451
fatcat:7kzp35j3n5fpdc24u74mh7natu
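The high-entropy-word rule in the snippet above can be sketched as a short computation: the translation entropy of a source word is the Shannon entropy of its translation distribution P(t|s), and the word is flagged when that entropy exceeds a threshold. The toy translation tables and the threshold value below are illustrative assumptions, not taken from the paper.

```python
import math

def translation_entropy(translation_probs):
    """Shannon entropy E(s) of a source word's translation distribution P(t|s)."""
    return -sum(p * math.log2(p) for p in translation_probs.values() if p > 0)

def is_high_entropy(translation_probs, e0=1.0):
    """Treat the source word as high-entropy when E(s) exceeds the threshold e0."""
    return translation_entropy(translation_probs) > e0

# A word spread evenly over four candidate translations is high-entropy (E = 2.0),
# while a word with one dominant translation is not (E ≈ 0.19).
ambiguous = {"bank": 0.25, "shore": 0.25, "side": 0.25, "edge": 0.25}
peaked = {"cat": 0.97, "feline": 0.03}
```

Words flagged this way would then be handled by the fine-grained methods (pre-training, multitask, two-pass) mentioned in the abstract.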
Target and Task specific Source-Free Domain Adaptive Image Segmentation
[article]
2022
arXiv
pre-print
However, due to domain shift, these pseudo-labels are usually of high entropy and denoising them still does not make them perfect labels to supervise the model. ...
In the first stage, we focus on generating target-specific pseudo labels while suppressing high entropy regions by proposing an Ensemble Entropy Minimization loss. ...
However, using just one input data to generate the entropy map might not give us all the accurate locations of high entropy. ...
arXiv:2203.15792v1
fatcat:vwlupjizyjhp7pbnf7ofz3trnm
Design Principles for True Random Number Generators for Security Applications
2019
Proceedings of the 56th Annual Design Automation Conference 2019 on - DAC '19
The generation of high quality true random numbers is essential in security applications. ...
For secure communication, we also require high quality true random number generators (TRNGs) in embedded and IoT devices. ...
Stochastic models of physical noise sources are used to guarantee entropy of the raw random bits. ...
doi:10.1145/3316781.3323482
dblp:conf/dac/GrujicRJKV19
fatcat:zmo6yqiqizchbpsokagvremqmq
Source-Relaxed Domain Adaptation for Image Segmentation
[article]
2020
arXiv
pre-print
Domain adaptation (DA) has drawn high interests for its capacity to adapt a model trained on labeled source data to perform well on unlabeled or weakly labeled target data from a different domain. ...
We show the effectiveness of our prior-aware entropy minimization in adapting spine segmentation across different MRI modalities. ...
Without adaptation, a model trained on source data only can't recover the structure of the IVD on the target data, and is very uncertain, as revealed by the high activations in the prediction entropy maps ...
arXiv:2005.03697v1
fatcat:37fikv6cgbc3ljdsfxstg27roq
ADVENT: Adversarial Entropy Minimization for Domain Adaptation in Semantic Segmentation
2019
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
To this end, we propose two novel, complementary methods using (i) an entropy loss and (ii) an adversarial loss respectively. ...
In this work, we address the task of unsupervised domain adaptation in semantic segmentation with losses based on the entropy of the pixel-wise predictions. ...
We start from a simple observation: models trained only on source domain tend to produce over-confident, i.e., low-entropy, predictions on source-like images and under-confident, i.e., high-entropy, predictions ...
doi:10.1109/cvpr.2019.00262
dblp:conf/cvpr/VuJBCP19
fatcat:ywxufgupfvfejjfuzngweqcbeq
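The entropy loss mentioned in the ADVENT snippets penalizes under-confident (high-entropy) pixel-wise predictions on target images. A minimal NumPy sketch follows; the array shapes and the normalization by log C are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def pixelwise_entropy(probs, eps=1e-12):
    """Per-pixel normalized entropy of softmax predictions of shape (H, W, C)."""
    num_classes = probs.shape[-1]
    ent = -np.sum(probs * np.log(probs + eps), axis=-1)
    return ent / np.log(num_classes)  # normalized to [0, 1]

def entropy_loss(probs):
    """Mean normalized entropy over all pixels; minimizing it pushes the
    model toward confident (low-entropy) predictions on target images."""
    return float(pixelwise_entropy(probs).mean())
```

In the paper this direct minimization is complemented by an adversarial loss that aligns entropy-based maps between source and target domains.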
ADVENT: Adversarial Entropy Minimization for Domain Adaptation in Semantic Segmentation
[article]
2019
arXiv
pre-print
To this end, we propose two novel, complementary methods using (i) entropy loss and (ii) adversarial loss respectively. ...
In this work, we address the task of unsupervised domain adaptation in semantic segmentation with losses based on the entropy of the pixel-wise predictions. ...
We start from a simple observation: models trained only on source domain tend to produce over-confident, i.e., low-entropy, predictions on source-like images and under-confident, i.e., high-entropy, predictions ...
arXiv:1811.12833v2
fatcat:e7ox63x7svbzvlvnalfbleotp4
Multifaceted Uncertainty Estimation for Label-Efficient Deep Learning
2020
Neural Information Processing Systems
We present a novel multi-source uncertainty prediction approach that enables deep learning (DL) models to be actively trained with much less labeled data. ...
Experiments conducted over both synthetic and real data and comparison with competitive AL methods demonstrate the effectiveness of the proposed ADL model. ...
Evidence-Aware Entropy Decomposition: As discussed earlier, a high entropy may be contributed by different sources of uncertainty with distinct characteristics. ...
dblp:conf/nips/ShiZ0020
fatcat:5h3yhravpjcqfi7sjqmm6ld2km
Hierarchical Decomposition Thermodynamic Approach for the Study of Solar Absorption Refrigerator Performance
2016
Entropy
Under the hypothesis of an endoreversible model, the effects of the generator, the solar concentrator and the solar converter temperatures, on the coefficient of performance (COP), are presented and discussed ...
In fact, the coefficient of performance variations, according to the ratio of the heat transfer area A_h of the high-temperature part (the thermal engine 2) to the heat transfer area of the low-temperature ...
The Carnot model is an ideal model far from the reality as it doesn't take into account the entropy production. ...
doi:10.3390/e18030082
fatcat:ssgk2vwl4vaztmzvyhoeylpccm
Physical and conceptual identifier dispersion: Measures and relation to fault proneness
2010
2010 IEEE International Conference on Software Maintenance
We show statistically that methods containing terms with high entropy and context coverage are more fault-prone than others. ...
Entropy measures the physical dispersion of terms in a program: the higher the entropy, the more scattered the terms are across the program. ...
reduce entropy and high context coverage. ...
doi:10.1109/icsm.2010.5609748
dblp:conf/icsm/ArnaoudovaEOGA10
fatcat:mgs2n2lrc5gqflqwnn56jhgiou
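The dispersion entropy described in the snippet above (how scattered an identifier term is across a program) can be sketched as follows; the file and term layout is a made-up example, not data from the study.

```python
import math
from collections import Counter

def term_entropy(term, files_to_terms):
    """Entropy of a term's dispersion across program files: higher entropy
    means the term's occurrences are scattered over more files."""
    counts = [terms.count(term) for terms in files_to_terms.values()]
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

# A term concentrated in one file has entropy 0; a term spread evenly
# over four files has entropy log2(4) = 2.
files = {
    "a.c": ["buffer", "init"],
    "b.c": ["buffer"],
    "c.c": ["buffer"],
    "d.c": ["buffer"],
}
```

Per the abstract, methods containing terms with high entropy (and high context coverage) were found to be more fault-prone.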
Predicting the Lossless Compression Ratio of Remote Sensing Images with Configurational Entropy
2021
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
For lossless compression, the upper and lower limits of compression ratio are defined by Shannon's Source Coding Theorem with Shannon entropy as the metric, which measures the statistical information of ...
This study provides a new direction for building a theoretical prediction model with configurational entropy. ...
prediction models. ...
doi:10.1109/jstars.2021.3123650
fatcat:pgl3ejgj65citd3xlaylufxoxq
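The Shannon bound referenced in the snippet can be sketched directly: first-order Shannon entropy gives the minimum average bits per symbol for lossless coding, so 8 bits per byte divided by the entropy bounds the achievable compression ratio. This simple byte-level estimator is an illustration only, not the configurational-entropy model the paper proposes.

```python
import math
from collections import Counter

def shannon_entropy_bits(data: bytes) -> float:
    """First-order Shannon entropy of a byte sequence, in bits per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compression_ratio_bound(data: bytes) -> float:
    """Upper bound on the lossless compression ratio: 8 bits/byte over entropy."""
    h = shannon_entropy_bits(data)
    return 8.0 / h if h > 0 else float("inf")
```

A sequence alternating between two byte values has entropy 1 bit/symbol, so no lossless coder modeling symbols independently can beat an 8:1 ratio on it.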
Convolution Neural Network-Based Sensitive Security Parameter Identification and Analysis
2022
Wireless Communications and Mobile Computing
We identify noise sources that are used as entropy sources with our convolution neural network model. ...
To ensure security, the noise sources used to construct the entropy source must be collected securely. ...
As a result of training the model with images of entropy sources, patterns of bit strings that are difficult to distinguish with the naked eye are identified with high accuracy. ...
doi:10.1155/2022/9584894
fatcat:r2yz3ltcnbazfbcc4vut6rqkd4
An Optimal Framework for SDN Based on Deep Neural Network
2022
Computers Materials & Continua
The initial inspection module detects the suspicious network traffic by computing the information entropy value of the data packet's source and destination Internet Protocol (IP) addresses, and then identifies ...
The false alarm rate (FAR) is much lower than that of the information entropy-based detection method. ...
This model makes up for the shortcomings of the detection algorithm based on information entropy of low DR and high FAR. ...
doi:10.32604/cmc.2022.025810
fatcat:aya5eehdvrhxpgd3xvf7fg5vuy
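The initial inspection step described above, computing the information entropy of packet source/destination IP addresses over a traffic window, can be sketched as follows; the window contents and the decision threshold are illustrative assumptions, not the paper's tuned values.

```python
import math
from collections import Counter

def ip_entropy(addresses):
    """Shannon entropy of the IP-address distribution in a traffic window."""
    counts = Counter(addresses)
    n = len(addresses)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_suspicious(src_ips, threshold=3.0):
    """Flag a window whose source-IP entropy is abnormally high, as happens
    when an attacker spoofs many random source addresses."""
    return ip_entropy(src_ips) > threshold

# A normal window is dominated by a few legitimate hosts (low entropy);
# an attack window contains many distinct spoofed sources (high entropy).
normal = ["10.0.0.1"] * 90 + ["10.0.0.2"] * 10
attack = [f"10.0.{i}.{j}" for i in range(10) for j in range(10)]
```

Per the abstract, windows flagged by this cheap entropy check are then passed to a deep neural network classifier, which compensates for the entropy method's low detection rate and high false-alarm rate on its own.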
Modelling the effects of internal heating in the core and lowermost mantle on the earth's magnetic history
2006
Physics of the Earth and Planetary Interiors
This scenario has been discussed based on parameterized thermal and magnetic models of the core [Buffett, B.A., 2002. ...
Recently, an incompatible-element enriched reservoir, bearing a high degree of radioactive heating, has been proposed to exist at the base of the mantle. ...
The D series models explore the effects of high concentrations of radioactive internal heat sources in the lowermost 200 km of the mantle, while model H1 is used to explore the effects of high internal ...
doi:10.1016/j.pepi.2006.03.009
fatcat:3n5u5nkj6fbnfofm6dcpug27ou
Detection of High Frequency Sources in Random/Uncertain Media
2004
AIP Conference Proceedings
band, high frequency source. ...
The maximum entropy method (MEM) is used to incorporate essential uncertainty into the model: it uses what is known in its model, but models what is not known with maximum uncertainty. ...
MODELING OF HIGH-FREQUENCY PROPAGATION IN TIME-VARYING RANDOM MEDIA: We assume that both the source (or target) and receiver are in motion in a propagation medium with a random boundary and an inhomogeneous ...
doi:10.1063/1.1843018
fatcat:vu3x4nqgzzc5nn65gfmbqdzj7q
Frequency-specific brain dynamics related to prediction during language comprehension
2019
NeuroImage
We show that theta-band source dynamics are increased in high relative to low entropy states, likely reflecting lexical computations. ...
Using trigram statistical language models, we estimated for every word in a story its conditional probability of occurrence. ...
model) with 7842 source locations per hemisphere. ...
doi:10.1016/j.neuroimage.2019.04.083
pmid:31100432
fatcat:e4xm63ji3vd7znoi35i4eov5nq
Showing results 1 — 15 out of 277,840 results