57,911 Hits in 5.9 sec

Efficient-FedRec: Efficient Federated Learning Framework for Privacy-Preserving News Recommendation [article]

Jingwei Yi, Fangzhao Wu, Chuhan Wu, Ruixuan Liu, Guangzhong Sun, Xing Xie
2021 arXiv   pre-print
Instead of training and communicating the whole model, we decompose the news recommendation model into a large news model maintained on the server and a lightweight user model shared by both server and  ...  The server updates its global user model with the aggregated gradients, and further updates its news model to infer updated news representations.  ...  For example, Wu et al. propose PLM-NR to empower news modeling by applying pre-trained language models.  ... 
arXiv:2109.05446v2 fatcat:64wwolcfdfd7dadv3lcdhr5v6i
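The Efficient-FedRec snippet above describes a split in which only the lightweight user model is communicated and the server averages client gradients before updating the global user model. A minimal numpy sketch of that aggregation step (all names, shapes, and the toy per-client gradient function are illustrative assumptions, not the paper's actual method):

```python
import numpy as np

rng = np.random.default_rng(1)
user_model = rng.standard_normal(16) * 0.01        # shared lightweight user model

def client_gradient(user_model, seed):
    # Stand-in for a client's local gradient computed on its private data;
    # only this user-model-sized vector is sent to the server.
    local_rng = np.random.default_rng(seed)
    return user_model - local_rng.standard_normal(16) * 0.01

# Server-side: average the gradients from 8 clients, then apply the update.
client_grads = [client_gradient(user_model, s) for s in range(8)]
agg = np.mean(client_grads, axis=0)                # aggregated gradient
user_model = user_model - 0.1 * agg                # global user model update
print(user_model.shape)                            # (16,)
```

The point of the sketch is the communication pattern: clients exchange vectors the size of the small user model, while the large news model never leaves the server.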

EmPOWER: An Adaptable Writing Intervention

Carly Dinnes
2020 The Nebraska Educator  
EmPOWER is a six-stage writing intervention designed by speech-language pathologists to improve the expository writings of school-aged children with language learning and executive function disabilities  ...  have led to the development of a variety of recommendations for scaffolding writing development such as providing students with models of writing to review, more opportunities to write, and the option  ...  Thus, the purpose of this analysis is to (a) examine EmPOWER, a structured approach to writing intervention, (b) compare EmPOWER to a new writing model to identify areas of the writing process that are  ... 
doi:10.32873/unl.dc.ne001 fatcat:xeovlvyaszeslpcl22n3kgjvsq

NewsBERT: Distilling Pre-trained Language Model for Intelligent News Application [article]

Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Qi Liu
2021 arXiv   pre-print
Pre-trained language models (PLMs) like BERT have made great progress in NLP.  ...  However, existing language models are pre-trained and distilled on a general corpus like Wikipedia, which has some gaps with the news domain and may be suboptimal for news intelligence.  ...  In addition, the corpus for pre-training and distilling these language models usually has some domain shift from news texts.  ... 
arXiv:2102.04887v2 fatcat:j757cp3gqrbe7pvzg74yudk47u

Learning Word and Sub-word Vectors for Amharic (Less Resourced Language)

Abebawu Eshetu, Getenesh Teshome, Tewodros Abebe
2020 International Journal of Advanced Engineering Research and Science  
The availability of pre-trained word embedding models (also known as word vectors) empowered many tasks in natural language processing, leading to state-of-the-art performance.  ...  We also introduced a new word analogy dataset to evaluate word vectors for the Amharic language.  ...  ., 2013) also provides pre-trained embeddings trained on Wikipedia. While word vectors with a vocabulary of about 3K Amharic words are included with Al-Rfou et al.  ... 
doi:10.22161/ijaers.78.39 fatcat:qwabdbo3cnc5jjcpqjrb3qte5a

PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision [article]

Chuhan Wu, Fangzhao Wu, Tao Qi, Jianxun Lian, Yongfeng Huang, Xing Xie
2020 arXiv   pre-print
Motivated by pre-trained language models which are pre-trained on a large-scale unlabeled corpus to empower many downstream tasks, in this paper we propose to pre-train user models from large-scale unlabeled  ...  The pre-trained user models are fine-tuned in downstream tasks to learn task-specific user representations.  ...  Okura et al. (2017) proposed to use a GRU network for news recommendation, which models users from their clicked news.  ... 
arXiv:2010.01494v1 fatcat:jbfh6ajm2rhvvisyadbxunn6me

Tiny-NewsRec: Efficient and Effective PLM-based News Recommendation [article]

Yang Yu, Fangzhao Wu, Chuhan Wu, Jingwei Yi, Tao Qi, Qi Liu
2021 arXiv   pre-print
Recently, pre-trained language models (PLMs) have demonstrated great capability in natural language understanding and the potential to improve news modeling for news recommendation.  ...  In order to reduce the domain gap between general corpora and the news data, we propose a self-supervised domain-specific post-training method to adapt the generally pre-trained language models to the  ...  Pre-trained language models (PLMs) are powerful in text modeling and have empowered various NLP tasks (Devlin et al. 2019; Liu et al. 2019).  ... 
arXiv:2112.00944v1 fatcat:q4xpsnqv35bp7mazeocezzlute

Plurilingualism in a Constructively Aligned and Decolonized TESOL Curriculum

Dulani Suraweera
2022 The TESL Canada Journal  
pedagogies in a TESOL curriculum for the purpose of training future EAL teachers to empower their adult EAL learners globally.  ...  Drawing on the literature of plurilingualism, decolonization of knowledge production, and curriculum design, this article discusses how plurilingual approaches can be combined with critical and transformative  ... 
doi:10.18806/tesl.v38i2.1355 fatcat:a5hm4wkjdvfpnhkznoxin7766q

Editorial The Hartford Consensus to improve survivability in mass casualty events: Process to policy

Lenworth Jacobs, MD, MPH, FACS, Karyl J. Burns, RN, PhD
2014 American Journal of Disaster Medicine  
First Care Provider Training The First Care Provider model empowers community members to take life-saving actions.  ...  The new model must train first responders to identify the FCP, conduct a rapid threat assessment, appropriately gauge the FCP's skill level, provide clear assignments to the FCP, and utilize the FCP as a  ... 
doi:10.5055/ajdm.2014.0143 pmid:24715646 fatcat:auqyoocwizhv3lhjqzbz4nqbwa

Metacognitive Skills of Junior High School Students in a Pandemic Period Based on the Enriched Virtual Model of PjBL

Elisa Rohimatun Nafi'ah, Elly Purwanti, Fendy Hardian Permana, Ahmad Fauzi
2022 Journal of Education Technology  
Data collection was done using pre-tests, post-tests, and assignments; One-Way ANCOVA was then used to analyze the metacognitive skills data.  ...  Moreover, the development of science and technology in the 21st century provides new challenges for the world of education, requiring students to have cognitive and metacognitive skills.  ...  Therefore, teachers need to apply specific learning models optimally to empower students' metacognition. One learning model that can empower metacognitive skills is project-based learning (PjBL).  ... 
doi:10.23887/jet.v6i1.41470 fatcat:5g7gdiejqjghzonlpytzylkelu

UserBERT: Contrastive User Model Pre-training [article]

Chuhan Wu, Fangzhao Wu, Yang Yu, Tao Qi, Yongfeng Huang, Xing Xie
2021 arXiv   pre-print
Two self-supervision tasks are incorporated in UserBERT for user model pre-training on unlabeled user behavior data to empower user modeling.  ...  User modeling is critical for personalized web applications. Existing user modeling methods usually train user models from user behaviors with task-specific labeled data.  ...  By pre-training the user model with unlabeled user behaviors via self-supervision, the model can exploit the universal user information conveyed by user behaviors to empower downstream tasks.  ... 
arXiv:2109.01274v1 fatcat:64ijwcctergzjotk3f4ch2hxca

Fastformer: Additive Attention Can Be All You Need [article]

Chuhan Wu, Fangzhao Wu, Tao Qi, Yongfeng Huang, Xing Xie
2021 arXiv   pre-print
In this way, Fastformer can achieve effective context modeling with linear complexity.  ...  on its interaction with global context representations.  ...  ), a fine-grained interest matching method for personalized news recommendation; (3) PLM-NR (Wu et al., 2021a), empowering news recommendation with pre-trained language models.  ... 
arXiv:2108.09084v6 fatcat:yjlyhq7rrrdl7jwqpdnaqblkty
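The Fastformer snippet above claims effective context modeling with linear complexity via additive attention. The sketch below (function and variable names are mine, not from the paper) shows the core idea: each token gets one scalar score against a learned query vector, so pooling a sequence into a global context vector costs O(seq_len · d) rather than the quadratic cost of full self-attention:

```python
import numpy as np

def additive_attention_pool(X, w):
    """Pool token representations X (seq_len, d) into one global context
    vector using a learned query vector w (d,).
    Cost is linear in seq_len: one score per token, then a weighted sum."""
    scores = X @ w / np.sqrt(X.shape[1])      # (seq_len,) one scalar per token
    alpha = np.exp(scores - scores.max())     # numerically stable softmax
    alpha /= alpha.sum()                      # attention weights, sum to 1
    return alpha @ X                          # (d,) weighted sum of tokens

rng = np.random.default_rng(0)
X = rng.standard_normal((128, 64))            # 128 tokens, 64-dim each
w = rng.standard_normal(64)                   # learned query (random here)
g = additive_attention_pool(X, w)
print(g.shape)                                # (64,)
```

In the full model this pooling is applied repeatedly (to queries and keys) and combined with element-wise interactions, but the linear-time pooling step is what replaces the pairwise attention matrix.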

Digital Health-Enabled Community-Centered Care (D-CCC): A Scalable Model to Empower Future Community Health Workers utilizing Human-in-the-Loop AI [article]

Sarah M. Rodrigues, Anil Kanduri, Adeline M. Nyamathi, Nikil Dutt, Pramod P. Khargonekar, Amir M. Rahmani
2021 medRxiv   pre-print
Utilizing an artificial intelligence-enabled closed-loop digital health platform designed for, and with, community health workers, D-CCC enables timely and individualized delivery of interventions by community  ...  restricted community health worker-delivered care and services into an expanded, digitally interconnected and collaborative community-centered health and social care ecosystem which centers around a digitally empowered  ...  The authors would also like to thank Arman Anzanpour for his assistance with the conceptualization and visualization of the figures.  ... 
doi:10.1101/2021.03.03.21252873 fatcat:zjev4pbylfarpmr3s6emdkegri

Lesson Study for Professional Development of English Language Teachers: Key Takeaways from International Practices

Özgehan Uştuk, İrem Çomoğlu
2019 Journal on Efficiency and Responsibility in Education and Science  
Given the empowering dimension of the lesson study model, in terms of both content and form, it can be adopted as a model for effective and sustainable language teacher professional development.  ...  Considering the deficiencies of the current language teacher professional development practices in Turkey, this paper provides a systematic review of lesson study as a professional development model for  ...  In tandem with this background, the authors argue that this model can be discussed within the context of a recent reformist movement in Turkey that specifically aims to empower foreign language teachers  ... 
doi:10.7160/eriesj.2019.120202 fatcat:krkf3uzywfe3tb7j4kcd4wqrdm

The impact of virtual trips on the development of Arabic language listening skills among third grade students in Jordan

Nail M. Alhajya, Sumaia S. Alzaghamim, Yousef M. Arouri
2018 Journal of Technology and Science Education  
Additional implications and future recommendations were discussed.  ...  This study aimed at investigating the impact of virtual trips on developing Arabic language listening skills among third grade students in Jordan.  ...  Another recommendation is to conduct further quasi-experimental studies to detect the impact of virtual trips on the development of various Arabic language skills such as reading, writing, anthem, and  ... 
doi:10.3926/jotse.331 fatcat:istyvmlidrbpbcdw6bfxa76ldm

SocialML: machine learning for social media video creators [article]

Tomasz Trzcinski, Adam Bielski, Paweł Cyrta, Matthew Zak
2018 arXiv   pre-print
videos with over three billion views per month.  ...  In this work, we present a comprehensive overview of machine learning-empowered tools we developed for video creators at Group Nine Media - one of the major social media companies that creates short-form  ...  We trained a model for binary classification by fine-tuning the last layer of a ResNet50 [1] model pre-trained on the ImageNet dataset [2].  ... 
arXiv:1802.02204v1 fatcat:kexuowksarhazp47zpcxbi6qzm
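The SocialML snippet mentions fine-tuning only the last layer of an ImageNet-pre-trained ResNet50. The sketch below illustrates that pattern in miniature with numpy: a frozen random projection stands in for the pre-trained backbone (no real ResNet50 or ImageNet here; all data and names are synthetic), and only a logistic classification head is trained by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(42)

# Frozen "backbone": a fixed projection standing in for pre-trained
# ResNet50 features. It receives no gradient updates.
backbone = rng.standard_normal((2048, 8)) / np.sqrt(2048)

def features(x):
    # Frozen forward pass: linear projection + ReLU, never updated.
    return np.maximum(x @ backbone, 0.0)

# Synthetic binary task: label depends on the first feature dimension.
X_raw = rng.standard_normal((200, 2048))
F = features(X_raw)
y = (F[:, 0] > F[:, 0].mean()).astype(float)

w = np.zeros(8)                                    # trainable head only
b = 0.0
for _ in range(500):                               # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))         # sigmoid predictions
    w -= 0.5 * (F.T @ (p - y) / len(y))            # logistic-loss gradients
    b -= 0.5 * (p - y).mean()

acc = ((1.0 / (1.0 + np.exp(-(F @ w + b))) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

The design point is that the trainable parameter count (here 9 values) is tiny compared with the frozen backbone, which is what makes last-layer fine-tuning cheap on small labeled datasets.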
Showing results 1 — 15 out of 57,911 results