462 Hits in 9.9 sec

Call Redistribution for a Call Center Based on Speech Emotion Recognition

Milana Bojanić, Vlado Delić, Alexey Karpov
<span title="2020-07-06">2020</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/smrngspzhzce7dy6ofycrfxbim" style="color: black;">Applied Sciences</a> </i> &nbsp;
The proposed recognition of call urgency and consequent call ranking and redistribution is based on emotion recognition in speech, giving greater priority to calls featuring emotions such as fear, anger  ...  This research aims to improve the functionality of call centers by recognition of call urgency and redistribution of calls in a queue.  ...  In the experiments three acted databases were used: the Danish Emotional speech (DES), the Berlin database (Emo-DB) and the Serbian GEES database, and one spontaneous database (Aibo corpus).  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/app10134653">doi:10.3390/app10134653</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/y2mjhoptbrbyvktz35jfhg6riu">fatcat:y2mjhoptbrbyvktz35jfhg6riu</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20201128093540/https://res.mdpi.com/d_attachment/applsci/applsci-10-04653/article_deploy/applsci-10-04653.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/d3/24/d324bfbdf7d391f34c34aa18119ebc3f03b661f8.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/app10134653"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

User-awareness and adaptation in conversational agents

Vlado Delic, Milan Gnjatovic, Niksa Jakovljevic, Branislav Popovic, Ivan Jokic, Milana Bojanic
<span title="">2014</span> <i title="National Library of Serbia"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/i5izkhv2lvg4blwkmbvmyjratu" style="color: black;">Facta universitatis - series Electronics and Energetics</a> </i> &nbsp;
It focuses particularly on the development of speech recognition modules in cooperation with both modules for emotion recognition and speaker recognition, as well as the dialogue management module.  ...  The conversational agent is user-adaptive to the extent that it dynamically adapts its dialogue behavior according to the user and his/her emotional state.  ...  Preliminary experiments conducted on the GEES database confirmed that, e.g., the emotion of anger changes the speaker's voice (i.e., timbre) to a greater extent than the emotion of sadness.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.2298/fuee1403375d">doi:10.2298/fuee1403375d</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/lpt5vokizrauhnunbb44degx6a">fatcat:lpt5vokizrauhnunbb44degx6a</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170921232542/http://casopisi.junis.ni.ac.rs/index.php/FUElectEnerg/article/download/266/146" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/d1/81/d18142c35e21b0778a5a76313d6706c4b94f59b1.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.2298/fuee1403375d"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> Publisher / doi.org </button> </a>

Mapping Discrete Emotions in the Dimensional Space: An Acoustic Approach

Marián Trnka, Sakhia Darjaa, Marian Ritomský, Róbert Sabo, Milan Rusko, Meilin Schaper, Tim H. Stelkens-Kobsch
<span title="2021-11-27">2021</span> <i title="MDPI AG"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/ikdpfme5h5egvnwtvvtjrnntyy" style="color: black;">Electronics</a> </i> &nbsp;
Mapping of categorical emotions to the dimensional space is tested on another pool of eight categorically annotated databases.  ...  The system is trained on a pool of three publicly available databases with dimensional annotation of emotions. The quality of regression is evaluated on the test sets of the same databases.  ...  However, the inner image of emotion in a person's mind and the idea of how it is to be presented in speech depends to a large extent on his experience, education, and to a large extent on the culture in  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/electronics10232950">doi:10.3390/electronics10232950</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/plkjj6j345dvxfufoe4i25sgga">fatcat:plkjj6j345dvxfufoe4i25sgga</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20220503173227/https://mdpi-res.com/d_attachment/electronics/electronics-10-02950/article_deploy/electronics-10-02950.pdf?version=1638001770" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/06/79/06798492caf813a0414ebf217b55c2d166027557.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.3390/electronics10232950"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="unlock alternate icon" style="background-color: #fb971f;"></i> mdpi.com </button> </a>

Analysis of Real-World Driver's Frustration

Lucas Malta, Chiyomi Miyajima, Norihide Kitaoka, Kazuya Takeda
<span title="">2011</span> <i title="Institute of Electrical and Electronics Engineers (IEEE)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/in6o6x6to5e2dls4y2ff52dy6u" style="color: black;">IEEE transactions on intelligent transportation systems (Print)</a> </i> &nbsp;
While driving, drivers also interact with an automatic speech recognition (ASR) system to retrieve and play music.  ...  Using a Bayesian network, we combine knowledge on the driving environment, assessed through data annotation, speech recognition errors, driver's emotional state (frustration), and driver's responses measured  ...  The inclusion of speech recognition errors was effective in increasing the overall accuracy of results, suggesting that ASR systems have an impact on the driver's emotional state and, thus, should be considered  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tits.2010.2070839">doi:10.1109/tits.2010.2070839</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/m53dj5lcqneqpl7rnaxa3gpvv4">fatcat:m53dj5lcqneqpl7rnaxa3gpvv4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170830045345/http://ir.nul.nagoya-u.ac.jp/jspui/bitstream/2237/14601/1/1054.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/40/ee/40ee16f144b230c1d2166a0d02519f44c7ef80fa.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1109/tits.2010.2070839"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> ieee.com </button> </a>

Toward an Understanding of the Role of Speech Recognition in Nonnative Speech Assessment

Klaus Zechner, Isaac I. Bejar, Ramin Hemat
<span title="">2007</span> <i title="Wiley"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/jf5pdoipu5gyfag6ffep34ce3e" style="color: black;">ETS Research Report Series</a> </i> &nbsp;
We then adapted a speech engine to the language backgrounds and proficiency ranges of the speakers and developed a classification and regression tree (CART) for each of five prompts based on features computed  ...  In this investigation, we evaluated the feasibility of using an off-the-shelf speech-recognition system for scoring speaking prompts from the LanguEdge field test of 2002.  ...  By contrast, speaker-dependent speech recognition is oriented to recognizing with a high level of accuracy speech based on a small vocabulary, where the system has been trained to the idiosyncrasies of  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1002/j.2333-8504.2007.tb02044.x">doi:10.1002/j.2333-8504.2007.tb02044.x</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/tsnlv5zc3bhr5bxf3ghz5uitrm">fatcat:tsnlv5zc3bhr5bxf3ghz5uitrm</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20081121200020/http://www.ets.org/Media/Research/pdf/RR-07-02.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/d4/3f/d43fe0c6cf05f4142d7fd75328cf895cdd225ece.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1002/j.2333-8504.2007.tb02044.x"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> wiley.com </button> </a>

Embodied conversational agents in computer assisted language learning

Preben Wik, Anna Hjalmarsson
<span title="">2009</span> <i title="Elsevier BV"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/yx5bmukg7revlpm4yd5obfwtsa" style="color: black;">Speech Communication</a> </i> &nbsp;
The feedback a learner of a new language (L2) receives when talking to a language teacher differs dramatically from the feedback one usually gets when talking to a native speaker.  ...  The role-plays developed for DEAL should also include an aspect of drama, and the agent in DEAL should thus be able to display emotions such as surprise, anger, or joy.  ...  This research was carried out at the Centre for Speech Technology, KTH. The research is also supported by the Graduate School for Language Technology (GSLT).  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.specom.2009.05.006">doi:10.1016/j.specom.2009.05.006</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/4cscbltcjbg4bfhi7zfofxmwiu">fatcat:4cscbltcjbg4bfhi7zfofxmwiu</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20131126120755/http://www.speech.kth.se:80/prod/publications/files/3350.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/7a/20/7a20a3c31bf313f1c91784bd10b136b67231569c.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1016/j.specom.2009.05.006"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> elsevier.com </button> </a>

Investigating the Psychological and Emotional Dimensions in Instructed Language Learning: Obstacles and Possibilities

Jean-Marc Dewaele
<span title="2005-08-22">2005</span> <i title="Wiley"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/wzavi2p66benbnx7pkdgkp2uny" style="color: black;">The Modern Language Journal</a> </i> &nbsp;
of individual differences in SLA, and research on the expression of emotion in the L2.  ...  teaching materials and foreign language teachers to pay increased attention to the communication of emotion and the development of sociocultural competence in a L2.  ...  One such potential cause is the extraverts' superior capacity in shortterm memory, allowing them to maintain automatic speech production in stressful situations.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1111/j.1540-4781.2005.00311.x">doi:10.1111/j.1540-4781.2005.00311.x</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/igssoioyfnc6vhq46sa4ahzg2i">fatcat:igssoioyfnc6vhq46sa4ahzg2i</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170808190612/http://www.gwinnett.k12.ga.us/HopkinsES/Alfonso_Web/ESOL%20Modification%20Research/Dewaele_Affective_filter.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/cb/a0/cba0fa1ada427649fb9bf0cc09c3e11e4e1168a0.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1111/j.1540-4781.2005.00311.x"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Mental Stress Detection using Data from Wearable and Non-wearable Sensors: A Review [article]

Aamir Arsalan, Syed Muhammad Anwar, Muhammad Majid
<span title="2022-02-07">2022</span> <i > arXiv </i> &nbsp; <span class="release-stage" >pre-print</span>
Whereas, methods based on non-wearable sensors include strategies such as analyzing pupil dilation and speech, smartphone data, eye movement, body posture, and thermal imaging.  ...  A wide range of studies has attempted to establish a relationship between these stressful situations and the response of human beings by using different kinds of psychological, physiological, physical,  ...  The performance of automatic speech recognition could be improved if the speaker stress is accurately identified (Kadambe, 2007) .  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/2202.03033v1">arXiv:2202.03033v1</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/ejzj3wn7drcdnidtbghfiv4x4a">fatcat:ejzj3wn7drcdnidtbghfiv4x4a</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20220209091424/https://arxiv.org/pdf/2202.03033v1.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/21/8c/218ced83a4acc3fd90b4993c7816c80c3b92de67.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/2202.03033v1" title="arxiv.org access"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> arxiv.org </button> </a>

Acoustics of ancient Greek and Roman theaters in use today

Anders Christian Gade, Konstantinos Angelakis
<span title="">2006</span> <i title="Acoustical Society of America (ASA)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/hwn3tbm3t5cpnjcflcmmwnprc4" style="color: black;">Journal of the Acoustical Society of America</a> </i> &nbsp;
As a result of evaluation experiments, we could confirm that the ASR (automatic speech recognition) performance of distant-talking speech was improved on the well-known condition of noise source directions  ...  Evaluation of the performance of the speech event separation by automatic speech recognition is also presented.  ...  During the ONR Main Acoustic Clutter Experiment of 2003 on the New Jersey Continental Shelf, short duration broadband pulses with 50- to 150-Hz bandwidths were transmitted at 50-s intervals from a moored  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.4787803">doi:10.1121/1.4787803</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/3lczeegofreblpjmfixt7gdxv4">fatcat:3lczeegofreblpjmfixt7gdxv4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170922023437/http://orbit.dtu.dk/files/4453277/Gade.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/c8/bc/c8bceca422bdeb449a277d7a17c3d381c107f375.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.4787803"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Techniques and Challenges in Speech Synthesis [article]

David Ferris
<span title="2017-09-22">2017</span> <i > arXiv </i> &nbsp; <span class="release-stage" >pre-print</span>
A method of automatically identifying and extracting diphones from prompted speech was designed, allowing for the creation of a diphone database by a speaker in less than 40 minutes.  ...  This involved a study of mechanisms of human speech production, a review of techniques in speech synthesis, and analysis of tests used to evaluate the effectiveness of synthesized speech.  ...  One of the most common approaches is to perform speech recognition on a large amount of recorded speech. Distinct diphones are extracted and saved into the database with some redundancy.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1709.07552v1">arXiv:1709.07552v1</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/o75yc226ubanppunnqy37jhdua">fatcat:o75yc226ubanppunnqy37jhdua</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20191018141351/https://arxiv.org/pdf/1709.07552v1.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/c3/2b/c32b8174b72bc82e7c9e478cd3ce3031625d2449.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1709.07552v1" title="arxiv.org access"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> arxiv.org </button> </a>

Computational Sociolinguistics: A Survey [article]

Dong Nguyen, A. Seza Doğruöz, Carolyn P. Rosé, Franciska de Jong
<span title="2016-04-06">2016</span> <i > arXiv </i> &nbsp; <span class="release-stage" >pre-print</span>
We aim to provide a comprehensive overview of CL research on sociolinguistic themes, featuring topics such as the relation between language and social identity, language use in social interaction and multilingual  ...  We hope to convey the possible benefits of a closer collaboration between the two communities and conclude with a discussion of open challenges.  ...  Acknowledgments Thanks to Mariët Theune, Dirk Hovy and Marcos Zampieri for helpful comments on the draft. Thanks also to the anonymous reviewers for their valuable and detailed feedback.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1508.07544v2">arXiv:1508.07544v2</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/lrcu4ypncrbungqtnw56ggmnfu">fatcat:lrcu4ypncrbungqtnw56ggmnfu</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200902074707/https://arxiv.org/pdf/1508.07544v2.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/ab/be/abbed4f1d48f62a3db1a672d034d6189661cdd5f.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener" href="https://arxiv.org/abs/1508.07544v2" title="arxiv.org access"> <button class="ui compact blue labeled icon button serp-button"> <i class="file alternate outline icon"></i> arxiv.org </button> </a>

Quantitative assessment of spatial sound distortion by the semi-ideal recording point of a hear-through device

Pablo Hoffmann, Flemming Christensen, Dorte Hammershøi
<span title="">2013</span> <i title="Acoustical Society of America (ASA)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/hwn3tbm3t5cpnjcflcmmwnprc4" style="color: black;">Journal of the Acoustical Society of America</a> </i> &nbsp;
Recognition of emotions in spoken speech, spoken language translation system, etc.  ...  The method, based on articulatory phonology is tested on the MOCHA database for recognizing a male speaker and a female speaker.  ...  We carried out an evaluation experiment in various measuring points to verify the effectiveness of the proposed method.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.4805375">doi:10.1121/1.4805375</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/u4bvxh6karet3avpjdhpfwqpf4">fatcat:u4bvxh6karet3avpjdhpfwqpf4</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20180720055747/http://vbn.aau.dk/files/78642078/JAS003223.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/fb/b5/fbb55578215dd83a01d1be0defcdd121c6711a3d.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.4805375"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Non-traditional prosodic features for automated phrase break prediction

C. Brierley, E. Atwell
<span title="2011-05-13">2011</span> <i title="Oxford University Press (OUP)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/wppgyltofzexlb6eckv6hzfvl4" style="color: black;">Literary and Linguistic Computing</a> </i> &nbsp;
The candidate confirms that the work submitted is her own and that appropriate credit has been given where reference has been made to the work of others.  ...  Also, for doctoral degrees:- This copy has been supplied on the understanding that it is copyright material and that no quotation from the thesis may be published without proper acknowledgement.  ...  TIMIT, the Acoustic-Phonetic Continuous Speech Corpus (Garofolo et al., 1993) , is a corpus of read speech designed to provide data for Automatic Speech Recognition (ASR).  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1093/llc/fqr023">doi:10.1093/llc/fqr023</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/je7ou3vkz5eaznt4zzqhjm5hfu">fatcat:je7ou3vkz5eaznt4zzqhjm5hfu</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20170705064943/http://etheses.whiterose.ac.uk/2038/1/thesisBrierleySeptember2011.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/50/71/5071c5f38348f548ee1ea6dae16df6e745205bf6.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1093/llc/fqr023"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> oup.com </button> </a>

Analysis of Features and Classifiers in Emotion Recognition Systems: Case Study of Slavic Languages

<span title="">2020</span>
Today's human-computer interaction systems have a broad variety of applications in which automatic human emotion recognition is of great interest.  ...  This work emerged as an attempt to clarify which speech features are the most informative, which classification structure is the most convenient for this type of tasks, and the degree to which the results  ...  Acknowledgement The authors would like to thank Professor Slobodan Jovičić from the School of Electrical Engineering, University of Belgrade, for providing access to the GEES database, and Dr.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.24425/aoa.2020.132489">doi:10.24425/aoa.2020.132489</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/deibtmkwhffo3hgdtsd2tg3wim">fatcat:deibtmkwhffo3hgdtsd2tg3wim</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20200709050414/https://acoustics.ippt.pan.pl/index.php/aa/article/download/2479/pdf_469" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] <div class="menu fulltext-thumbnail"> <img src="https://blobs.fatcat.wiki/thumbnail/pdf/c0/d0/c0d0a732f65bec4362a47309f7eb04f91e01c57e.180px.jpg" alt="fulltext thumbnail" loading="lazy"> </div> </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.24425/aoa.2020.132489"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>

Objective evaluation of vowel pronunciation

Mark J. Bakkum, Reinier Plomp, Louis C. W. Pols
<span title="">1991</span> <i title="Acoustical Society of America (ASA)"> <a target="_blank" rel="noopener" href="https://fatcat.wiki/container/hwn3tbm3t5cpnjcflcmmwnprc4" style="color: black;">Journal of the Acoustical Society of America</a> </i> &nbsp;
Collection of corresponding sediment samples was carefully integrated with these seismic experiments, to better determine the geologic nature and mechanical properties of the gas-charged materials.  ...  Effects of sediment gas on chirp sonar reflection profiles.  ...  The Chairs of the respective U.S. Technical Advisory Groups for ISO/TC 43 (H. E. von Gierke), and IEC/TC 29 (V. Nedzelnitsky), will report on current activities of these Technical Committees.  ... 
<span class="external-identifiers"> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.2029328">doi:10.1121/1.2029328</a> <a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/qntsmkrqkjbkpaxn35nuxutuaa">fatcat:qntsmkrqkjbkpaxn35nuxutuaa</a> </span>
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20180718233450/https://pure.uva.nl/ws/files/2127712/29696_132501y.pdf" title="fulltext PDF download" data-goatcounter-click="serp-fulltext" data-goatcounter-title="serp-fulltext"> <button class="ui simple right pointing dropdown compact black labeled icon button serp-button"> <i class="icon ia-icon"></i> Web Archive [PDF] </button> </a> <a target="_blank" rel="external noopener noreferrer" href="https://doi.org/10.1121/1.2029328"> <button class="ui left aligned compact blue labeled icon button serp-button"> <i class="external alternate icon"></i> Publisher / doi.org </button> </a>
Showing results 1-15 of 462