
Understanding Long Programming Languages with Structure-Aware Sparse Attention [article]

Tingting Liu, Chengyu Wang, Cen Chen, Ming Gao, Aoying Zhou
2022 arXiv   pre-print
To solve this problem, in this paper, we present SASA, a Structure-Aware Sparse Attention mechanism, which reduces the complexity and improves performance for long code understanding tasks.  ...  The key components in SASA are top-k sparse attention and Abstract Syntax Tree (AST)-based structure-aware attention.  ...  U1911203 and 61877018, and Alibaba Group through the Alibaba Innovation Research Program.  ... 
arXiv:2205.13730v1 fatcat:u7wab6judjecpasfr6nmfcjp54
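
A minimal sketch of the top-k sparse attention component this entry names, written in NumPy for concreteness; the function name, shapes, and the value of k are illustrative assumptions, not the authors' implementation.

import numpy as np

def topk_sparse_attention(Q, K, V, k=4):
    """Each query attends only to the k keys with the highest scores."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                       # (n_q, n_k) dense scores
    # Threshold at the k-th largest score per row (ties may keep a few extra).
    kth = np.partition(scores, -k, axis=-1)[:, -k:].min(axis=-1, keepdims=True)
    masked = np.where(scores >= kth, scores, -np.inf)   # drop all other keys
    w = np.exp(masked - masked.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

rng = np.random.default_rng(0)
n, d = 32, 16
out = topk_sparse_attention(rng.normal(size=(n, d)),
                            rng.normal(size=(n, d)),
                            rng.normal(size=(n, d)))
print(out.shape)  # (32, 16)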

Graph Conditioned Sparse-Attention for Improved Source Code Understanding [article]

Junyan Cheng, Iordanis Fostiropoulos, Barry Boehm
2021 arXiv   pre-print
In this work, we propose the conditioning of a source code snippet with its graph modality by using the graph adjacency matrix as an attention mask for a sparse self-attention mechanism and the use of  ...  Source code can have long-range dependencies that require larger sequence lengths to model effectively.  ...  Transformers have succeeded in multiple natural language understanding tasks, and have also been evaluated on the understanding of programming languages (Ahmad et al. 2020) .  ... 
arXiv:2112.00663v2 fatcat:ymvcjyxq4rathb76jrwmw5bv6q
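
The adjacency-mask idea this entry describes can be sketched as follows: attention scores are computed as usual, then every pair of positions not connected in the code graph is masked out. The function name and shapes are assumptions for illustration, not the paper's code.

import numpy as np

def graph_masked_attention(X, A, Wq, Wk, Wv):
    """Self-attention over token features X, restricted to the edges of A.

    A is a binary adjacency matrix; position i may attend to j only if
    A[i, j] == 1, so self-loops should be set on the diagonal."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores = np.where(A > 0, scores, -np.inf)  # the graph acts as the mask
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V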

The role of neuroscience in the remediation of students with dyslexia

Guinevere F. Eden, Louisa Moats
2002 Nature Neuroscience  
Reading and reading failure Unlike oral language, which is learned naturally from infancy, reading is a skill that is acquired at an older age, through instruction and with effort.  ...  If implemented appropriately, commercial programs can be effective in identifying dyslexia.  ...  Acknowledgments The authors are supported by the National Institute of Child Health and Human Development (NICHD). This article has been reprinted with permission from the Center for the Study of Learning  ... 
doi:10.1038/nn946 pmid:12403991 fatcat:be45sy5c65hytipedsa225ceva

DKPLM: Decomposable Knowledge-Enhanced Pre-trained Language Model for Natural Language Understanding

Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang
2022 Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference
Knowledge-Enhanced Pre-trained Language Models (KEPLMs) are pre-trained models with relation triples injected from knowledge graphs to improve language understanding abilities. Experiments show that our  ...  model outperforms other KEPLMs significantly over zero-shot knowledge probing tasks and multiple knowledge-aware language understanding tasks.  ...  Acknowledgements This work is supported by the Alibaba Group through the Alibaba Research Intern Program.  ... 
doi:10.1609/aaai.v36i10.21425 fatcat:4pookzwbbfg4pon55xfwzxvnp4
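
One generic way to "inject" a relation triple into a token sequence, shown here for illustration only: mix a triple embedding into the embeddings of the matching entity mention. This is not DKPLM's actual decomposable architecture; inject_triple, alpha, and all shapes are assumed names.

import numpy as np

def inject_triple(token_emb, mention_span, triple_emb, alpha=0.5):
    """Blend a knowledge-graph triple embedding into an entity mention."""
    out = token_emb.copy()
    s, e = mention_span                 # [s, e) token indices of the mention
    out[s:e] = (1 - alpha) * out[s:e] + alpha * triple_emb
    return out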

On the proper role of linguistically-oriented deep net analysis in linguistic theorizing [article]

Marco Baroni
2022 arXiv   pre-print
that unbiased learners can learn natural languages with enough data, or whether human abilities to acquire language given sparse stimulus implies a strong innate human learning bias" (Papadimitriou and  ...  This sparse grammar-aware circuit is complemented by a distributed system that can fill in the number feature based on purely sequential heuristics.  ... 
arXiv:2106.08694v2 fatcat:awcfoyt5zbf5fl3wlf7kecggaq

Code Summarization with Structure-induced Transformer [article]

Hongqiu Wu and Hai Zhao and Min Zhang
2021 arXiv   pre-print
Code summarization (CS) is becoming a promising area in language understanding, which aims to automatically generate sensible human language for programming languages in the form of source code  ...  It is well known that programming languages are highly structured.  ...  Sparse SAN To further validate our structure-based approach, we compare the performance of structure-induced attention with other sparse attention patterns, window attention in Longformer, ETC (Beltagy et  ... 
arXiv:2012.14710v2 fatcat:qshoiyhanbbpdp4k7ktgaopmv4
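
The window attention this entry compares against, as popularized by Longformer, is just a band-shaped mask: position i may attend only to positions within a fixed distance w. A minimal sketch, with the window size as an assumption:

import numpy as np

def window_mask(n, w):
    """Boolean mask where position i may attend to positions j with |i - j| <= w."""
    idx = np.arange(n)
    return np.abs(idx[:, None] - idx[None, :]) <= w

print(window_mask(6, 1).astype(int)[0])  # [1 1 0 0 0 0]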

Early intervention of Autism Spectrum Disorder: translating research into practice

Preeti Kandasamy, Harshini Manohar
2018 Indian Journal of Mental Health and Neurosciences
Though interventional research in ASD is accelerating globally, evidence from the Indian setting is sparse.  ...  Improving awareness among parents, medical professionals and stakeholders is the first step towards translating research into practice.  ...  This ray of hope prepares them to understand and engage in behavioral interventions, speech and language therapy, and occupational therapy as indicated in the short and long term.  ... 
doi:10.32746/ijmhns.2018.v1.i1.3 fatcat:rquflbstcrgwljlkm7zggkq7vq

Starting small: Building preschool teacher knowledge that supports early literacy development

Anne E. Cunningham, Jamie Zibulsky, Mia D. Callahan
2009 Reading and Writing
Recommendations for strengthening professional development programs and developing more robust measures of preschool teacher knowledge are proposed.  ...  literacy skills in general, and awareness of the structure of the English language more specifically.  ...  students' knowledge of spoken and written language structures.  ... 
doi:10.1007/s11145-009-9164-z fatcat:j6r2snlmv5c2hlduv37j7kz26e

Table Pre-training: A Survey on Model Architectures, Pre-training Objectives, and Downstream Tasks [article]

Haoyu Dong, Zhoujun Cheng, Xinyi He, Mengyu Zhou, Anda Zhou, Fan Zhou, Ao Liu, Shi Han, Dongmei Zhang
2022 arXiv   pre-print
And to best leverage the characteristics of (semi-)structured tables, various tabular language models, particularly with specially-designed attention mechanisms, have been explored.  ...  Since tables usually appear and interact with free-form text, table pre-training usually takes the form of table-text joint pre-training, which attracts significant research interest from multiple domains  ...  Downstream Tasks As shown in Figure 2, tasks of table understanding often have intersections with domains like NL, programming language, and computer vision, and thus prefer different capabilities of  ... 
arXiv:2201.09745v4 fatcat:fckxlk6przhsthnyhozehw3dz4
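
For concreteness, one example of a specially-designed attention pattern for tables: a cell attends only to cells in its own row or column. This is an illustrative pattern under assumed names, not the mechanism of any specific surveyed model.

import numpy as np

def row_col_mask(rows, cols):
    """Boolean mask over a rows*cols grid of cells, flattened row-major."""
    r = np.repeat(np.arange(rows), cols)   # row index of each flattened cell
    c = np.tile(np.arange(cols), rows)     # column index of each flattened cell
    return (r[:, None] == r[None, :]) | (c[:, None] == c[None, :])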

Knowledge Extraction in Low-Resource Scenarios: Survey and Perspective [article]

Shumin Deng, Ningyu Zhang, Hui Chen, Feiyu Xiong, Jeff Z. Pan, Huajun Chen
2022 arXiv   pre-print
We hope that our survey can help both the academic and industrial community to better understand this field, inspire more ideas and boost broader applications.  ...  Knowledge Extraction (KE), which aims to extract structural information from unstructured texts, often suffers from data scarcity and emerging unseen types, i.e., low-resource scenarios.  ...  with structured constraints.  ... 
arXiv:2202.08063v1 fatcat:2q64tx2mzne53gt24adi6ymj7a

Research and Application of Machine Learning in Automatic Program Generation

Xiaojiang Zhang, Ying Jiang
2020 Chinese Journal of Electronics
With the development of artificial intelligence, machine learning has been applied in more and more domains.  ...  Decision trees, language models, and recurrent neural networks have been applied in code generation, code completion and code knowledge mining.  ...  [22] used recurrent neural networks in code prediction tasks and introduced a sparse attention mechanism that effectively captures the long-term dependencies between code in such dynamic programming  ... 
doi:10.1049/cje.2020.10.006 fatcat:tntz2klabnapnb7kh6vhzrkpp4
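
A rough sketch of RNN-based next-token prediction over code tokens, the setting this entry describes; the model class, layer sizes, and vocabulary size are generic assumptions, not the cited system.

import torch
import torch.nn as nn

class CodeTokenLM(nn.Module):
    def __init__(self, vocab_size, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.rnn = nn.GRU(emb, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):             # tokens: (batch, seq) of token ids
        h, _ = self.rnn(self.embed(tokens))
        return self.head(h)                # logits predicting the next token

model = CodeTokenLM(vocab_size=1000)
print(model(torch.randint(0, 1000, (2, 16))).shape)  # torch.Size([2, 16, 1000])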

Table-to-text Generation by Structure-aware Seq2seq Learning [article]

Tianyu Liu, Kexiang Wang, Lei Sha, Baobao Chang, Zhifang Sui
2017 arXiv   pre-print
To encode both the content and the structure of a table, we propose a novel structure-aware seq2seq architecture which consists of a field-gating encoder and a description generator with dual attention.  ...  The attention visualizations and case studies show that our model is capable of generating coherent and informative descriptions based on a comprehensive understanding of both the content and the structure  ...  Acknowledgments Our work is supported by the National Key Research and Development Program of China under Grant No. 2017YFB1002101 and project 61772040 supported by NSFC.  ... 
arXiv:1711.09724v1 fatcat:7cxxndksp5euvf6gemzpuxezya
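
A simplified reading of the dual attention mentioned above: compute one attention distribution over cell contents and another over field names, multiply them elementwise, and renormalize. Names and shapes are assumptions, not the paper's exact formulation.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def dual_attention(word_scores, field_scores, values):
    """Combine content-level and field-level attention over table cells."""
    w = softmax(word_scores) * softmax(field_scores)
    w /= w.sum()                           # renormalize the joint weights
    return w @ values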

Reviews and Descriptions of Tables and Books

1992 Mathematics of Computation  
(c) There is insufficient material to provide a detailed understanding of the program.  ...  Since the language draws from many previous languages, several equivalent programming "paradigms" coexist. The choice of the wrong technique can make a program quite slow.  ... 
doi:10.1090/s0025-5718-92-99745-6 fatcat:zl3qkv43kvgodh55rh4s6koh5i

A Survey on Table Question Answering: Recent Advances [article]

Nengzheng Jin, Joanna Siebert, Dongfang Li, Qingcai Chen
2022 arXiv   pre-print
Generally speaking, table QA tasks can be traced back to querying relational databases with natural language, in which the tables are relatively structured.  ...  One of the classical semantic parsing tasks is text2sql, which converts natural language utterances into structured query language (SQL).  ... 
arXiv:2207.05270v1 fatcat:xjjpppms3rfuhmwyu3mjwuhtxm
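
For concreteness, a text2sql instance pairs a natural language question with the structured query it should map to; the schema and query below are made up purely for illustration.

example = {
    "question": "How many papers on sparse attention were published in 2022?",
    "sql": "SELECT COUNT(*) FROM papers "
           "WHERE topic = 'sparse attention' AND year = 2022;",
}
print(example["sql"])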

Language and Literacy: Issues and Considerations

Maria C. Hartman, Onudeah D. Nicolarakis, Ye Wang
2019 Education Sciences  
The outcomes of this discussion have instructional implications and proffer guidelines for teacher preparation programs. The article concludes with directions for further research.  ...  This article provides background on the major perspectives involving the development of English language and literacy with respect to the evolving demography of d/Deaf and hard-of-hearing children and  ...  Another area of need in teacher preparation programs relates to a deeper understanding of and facility with varieties of assessment and assessment tools.  ... 
doi:10.3390/educsci9030180 fatcat:ac72pskksvflhfepq3ogl3o7l4
Showing results 1 — 15 out of 28,599 results