GW25-e5248 PPARγ agonist Pioglitazone Suppresses AngII-induced Inflammation in Cardiac Fibroblast Cells via a TLR4-dependent Signaling Pathway. Gao Dengfeng, Zhe Meng, Zhongwei Liu, Xin Dong.
doi:10.1016/j.jacc.2014.06.071
Mispronunciation detection and diagnosis (MDD) technology is a key component of computer-assisted pronunciation training (CAPT) systems. In the field of assessing the pronunciation quality of constrained speech, the given transcriptions can play the role of a teacher. Conventional methods have fully utilized the prior texts for model construction or for improving system performance, e.g. forced alignment and extended recognition networks. Recently, some end-to-end based methods attempt to incorporate the prior texts into model training and preliminarily show their effectiveness. However, previous studies mostly consider applying a raw attention mechanism to fuse audio representations with text representations, without taking possible text-pronunciation mismatch into account. In this paper, we present a gating strategy that assigns more importance to the relevant audio features while suppressing irrelevant text information. Moreover, given the transcriptions, we design an extra contrastive loss to reduce the gap between the learning objective of phoneme recognition and MDD. We conducted experiments on two publicly available datasets (TIMIT and L2-Arctic), and our best model improved the F1 score from 57.51% to 61.75% compared to the baselines. Besides, we provide a detailed analysis to shed light on the effectiveness of the gating mechanism and contrastive learning for MDD.
arXiv:2206.07289v1
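The gating idea described in this abstract can be illustrated with a minimal numpy sketch: a learned sigmoid gate decides how much of the text representation passes through when fusing it with the audio representation. The weight shapes, the residual-style combination, and all names here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(audio, text, W, b):
    """Fuse an audio frame vector with a text (phoneme) embedding.

    A gate computed from both representations decides how much text
    information to let through; when the text mismatches the actual
    pronunciation, a small gate value suppresses the text features.
    """
    g = sigmoid(W @ np.concatenate([audio, text]) + b)  # gate values in (0, 1)
    return audio + g * text  # audio always passes; text is gated

# Toy usage with a fixed seed so the example is deterministic.
rng = np.random.default_rng(0)
audio = rng.standard_normal(4)
text = rng.standard_normal(4)
W = rng.standard_normal((4, 8)) * 0.1
b = np.zeros(4)
fused = gated_fusion(audio, text, W, b)
print(fused.shape)  # (4,)
```

In a real model the gate would be trained end to end together with the contrastive loss; this sketch only shows the fusion arithmetic.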
Proceedings 2003 VLDB Conference
As with relational data, XML data changes over time with the creation, modification, and deletion of XML documents. Expressing queries on time-varying (relational or XML) data is more difficult than writing queries on nontemporal data. In this paper, we present a temporal XML query language, τXQuery, in which we add valid time support to XQuery by minimally extending the syntax and semantics of XQuery. We adopt a stratum approach which maps a τXQuery query to a conventional XQuery. The paper focuses on how to perform this mapping, in particular, on mapping sequenced queries, which are by far the most challenging. The critical issue in supporting sequenced queries (in any query language) is time-slicing the input data while retaining period timestamping. Timestamps are distributed throughout an XML document, rather than uniformly in tuples, complicating the temporal slicing while also providing opportunities for optimization. We propose four optimizations of our initial maximally-fragmented time-slicing approach: selected node slicing, copy-based per-expression slicing, in-place per-expression slicing, and idiomatic slicing, each of which reduces the number of constant periods over which the query is evaluated. While performance tradeoffs clearly depend on the underlying XQuery engine, we argue that there are queries that favor each of the five approaches. τXQuery utilizes the data model of XQuery. The few reserved words added to XQuery indicate three different kinds of valid time queries. Representational queries have the same semantics as XQuery, ensuring that τXQuery is upward compatible with XQuery. New syntax for current and sequenced queries makes these queries easier to write. We carefully made τXQuery compatible with XQuery to ensure smooth migration from nontemporal applications to temporal applications; this compatibility also simplifies the semantics and its implementation. To implement τXQuery, we adopt the stratum approach, in which a stratum accepts τXQuery expressions and maps each to a semantically equivalent XQuery expression. This XQuery expression is passed to an XQuery engine. Once the XQuery engine obtains the result, the stratum possibly performs some additional processing and returns the result to the user. The advantage of this approach is that we can exploit existing techniques in an XQuery engine, such as query optimization and query evaluation. The stratum
doi:10.1016/b978-012722442-8/50062-8 dblp:conf/vldb/GaoS03
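The maximally fragmented time-slicing idea — splitting the timeline at every timestamp boundary so that the data is constant within each resulting period — can be sketched over plain (begin, end) periods. This is a toy illustration of the concept, not the paper's XML machinery.

```python
from itertools import chain

def constant_periods(intervals):
    """Split the timeline into maximal periods over which the set of
    valid elements does not change ("constant periods").

    `intervals` is a list of (begin, end) valid-time periods with end
    exclusive.  Sequenced evaluation can then run the nontemporal
    query once per constant period and stamp the result with it.
    """
    points = sorted(set(chain.from_iterable(intervals)))
    candidates = list(zip(points, points[1:]))
    # Keep only periods covered by at least one input interval.
    return [(b, e) for (b, e) in candidates
            if any(ib <= b and e <= ie for ib, ie in intervals)]

print(constant_periods([(1, 10), (5, 15)]))
# → [(1, 5), (5, 10), (10, 15)]
```

The four optimizations surveyed in the paper all aim to reduce how many such periods the rewritten XQuery must evaluate.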
Objective. We sought to investigate whether the peroxisome proliferator-activated receptor-γ (PPAR-γ) ligand pioglitazone can attenuate vascular fibrosis in spontaneously hypertensive rats (SHRs) and to explore the possible molecular mechanisms. Methods. SHRs (8-week-old males) were randomly divided into 3 groups (n = 8 each) for treatment: pioglitazone (10 mg/kg/day), hydralazine (25 mg/kg/day), or saline. Normal male Wistar Kyoto (WKY) rats (n = 8) served as normal controls. Twelve weeks later, we evaluated the effect of pioglitazone on vascular fibrosis by Masson's trichrome and immunohistochemical staining of collagen III and by real-time RT-PCR analysis of collagen I, collagen III, and fibronectin mRNA. Vascular expression of PPAR-γ, connective tissue growth factor (CTGF), and transforming growth factor-β (TGF-β) was evaluated by immunohistochemical staining, western blot analysis, and real-time RT-PCR. Results. Pioglitazone and hydralazine treatment significantly decreased systolic blood pressure in SHRs. Masson's trichrome staining, immunohistochemical staining for collagen III, and real-time RT-PCR analysis of collagen I, collagen III, and fibronectin mRNA indicated that pioglitazone significantly inhibited extracellular matrix production in the aorta. Compared with WKY rats, SHRs showed significantly increased vascular CTGF expression. Pioglitazone treatment significantly increased PPAR-γ expression and inhibited CTGF expression but had no effect on TGF-β expression. Conclusions. The results indicate that pioglitazone attenuated vascular fibrosis in SHRs by inhibiting CTGF expression through a TGF-β-independent mechanism.
doi:10.1155/2012/856426 pmid:22550475 pmcid:PMC3324923
The VLDB Journal
Joins are arguably the most important relational operators. Poor implementations are tantamount to computing the Cartesian product of the input relations. In a temporal database, the problem is more acute for two reasons. First, conventional techniques are designed for the evaluation of joins with equality predicates rather than the inequality predicates prevalent in valid-time queries. Second, the presence of temporally varying data dramatically increases the size of a database. These factors indicate that specialized techniques are needed to efficiently evaluate temporal joins. We address this need for efficient join evaluation in temporal databases. Our purpose is twofold. We first survey all previously proposed temporal join operators. While many temporal join operators have been defined in previous work, this work has been done largely in isolation from competing proposals, with little, if any, comparison of the various operators. We then address evaluation algorithms, comparing the applicability of various algorithms to the temporal join operators and describing a performance study involving algorithms for one important operator, the temporal equijoin. Our focus, with respect to implementation, is on non-index-based join algorithms. Such algorithms do not rely on auxiliary access paths but may exploit sort orderings to achieve efficiency.
doi:10.1007/s00778-003-0111-3
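For concreteness, the temporal equijoin studied in the performance section matches tuples with equal join keys whose valid-time periods overlap and stamps each result with the intersection of the two periods. The nested-loop sketch below is the naive baseline such specialized algorithms improve on; the tuple layout and names are illustrative.

```python
def temporal_equijoin(r, s):
    """Temporal equijoin over (key, payload, begin, end) tuples,
    with end exclusive.  A pair joins when the keys are equal and
    the valid-time periods overlap; the result carries the
    intersection of the two periods.
    """
    out = []
    for k1, v1, b1, e1 in r:
        for k2, v2, b2, e2 in s:
            b, e = max(b1, b2), min(e1, e2)
            if k1 == k2 and b < e:  # equal keys, overlapping periods
                out.append((k1, v1, v2, b, e))
    return out

r = [("a", "r1", 0, 10)]
s = [("a", "s1", 5, 20), ("a", "s2", 12, 15), ("b", "s3", 0, 10)]
print(temporal_equijoin(r, s))
# → [('a', 'r1', 's1', 5, 10)]
```

The inequality comparisons on `begin`/`end` are exactly what defeats conventional equality-based join techniques.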
Bacteria-based self-healing concrete is a construction material used to repair cracks in concrete, in which bacterial spores are immobilized in bacteria carriers. However, the currently available bacteria carriers are not always suitable, due to complicated procedures or high cost. To develop a more suitable bacteria carrier and to improve the anti-crack capability of self-healing concrete, in this study we evaluate the feasibility of using rubber particles as a novel bacteria carrier in self-healing concrete. Two types of self-healing concrete are prepared with rubber particles of different sizes to quantify the crack-healing effect. In addition, the fluidity and mechanical properties of the self-healing rubber concrete are compared with those of plain concrete and normal rubber concrete. The experimental results show that the self-healing rubber concrete with a particle size of 1~3 mm has a better healing capacity than the self-healing rubber concrete with a particle size of 0.2~0.4 mm, and the width of the completely healed crack is 0.86 mm. The self-healing rubber concrete has a higher slump than the plain concrete and normal rubber concrete. According to the strength tests, the compressive strengths of the self-healing rubber concrete are low early on but exceed those of the corresponding normal rubber concrete at 28 days. Moreover, the self-healing rubber concrete has higher splitting tensile strengths than the plain concrete and a better anti-crack capability. The results of a comparison with two other representative bacterial carriers indicate that rubber particles have the potential to become a widely used bacteria carrier for practical engineering applications in self-healing concrete.
doi:10.3390/ma12142313 pmid:31331051 pmcid:PMC6678105
Qiao and Gao, Medicine (2016) 95:51, www.md-journal.com: ... count in 15 of 19 patients ranged from 2.5 to 12.5 ×10⁹/L, with a mean of 8.1 ×10⁹/L.
doi:10.1097/md.0000000000005080 pmid:28002315 pmcid:PMC5181799
This paper introduces multiresolution analyses with composite dilations (AB-MRAs) and addresses frame multiresolution analyses with composite dilations in the setting of reducing subspaces of L2(ℝn) (AB-RMRAs). We prove that an AB-MRA can induce an AB-RMRA on a given reducing subspace L2(S)∨. For a general expansive matrix, we obtain characterizations for a scaling function to generate an AB-RMRA, and the main theorems generalize the classical results. Finally, some examples are provided to illustrate the general theory.
doi:10.1155/2011/850850
We show how to extend the temporal support of SQL to the Turing-complete portion of SQL, that of persistent stored modules (PSM). Our approach requires minor new syntax beyond that already in SQL/Temporal to define and to invoke PSM procedures and functions, thereby extending the current, sequenced, and non-sequenced semantics of queries to such routines. Temporal upward compatibility (existing applications work as before when one or more tables are rendered temporal) is ensured. We provide a transformation that converts Temporal SQL/PSM to conventional SQL/PSM. To support sequenced evaluation of stored functions and procedures, we define two different slicing approaches, maximal slicing and per-statement slicing. We compare these approaches empirically using a comprehensive benchmark and provide a heuristic for choosing between them.
doi:10.1109/icde.2012.70 dblp:conf/icde/SnodgrassGZT12
Hypothyroidism is a risk factor for heart failure (HF) in the general population. However, the relationship between hypothyroidism and clinical outcomes in patients with established HF is still inconclusive. We conducted a systematic review and meta-analysis to clarify the association of hypothyroidism with all-cause mortality as well as cardiac death and/or hospitalization in patients with HF. We searched the MEDLINE (via PubMed), EMBASE, and Scopus databases for studies of hypothyroidism and clinical outcomes in patients with HF published up to the end of January 2015. Random-effects models were used to estimate summary relative risk (RR) statistics. We included 13 articles that reported RR estimates and 95% confidence intervals (95% CIs) for hypothyroidism and outcomes in patients with HF. For the association of hypothyroidism with all-cause mortality and with cardiac death and/or hospitalization, the pooled RRs were 1.44 (95% CI: 1.29-1.61) and 1.37 (95% CI: 1.22-1.55), respectively. However, the association disappeared on adjustment for B-type natriuretic peptide level (RR 1.17, 95% CI: 0.90-1.52) and in studies of patients with mean age <65 years (RR 1.23, 95% CI: 0.88-1.76). We found hypothyroidism associated with increased all-cause mortality as well as cardiac death and/or hospitalization in patients with HF. Further diagnostic and therapeutic procedures for hypothyroidism may be needed for patients with HF.
doi:10.1097/md.0000000000001159 pmid:26222845 pmcid:PMC4554113
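The random-effects pooling of study-level RRs can be sketched as follows. The abstract does not name its exact estimator, so the DerSimonian-Laird method used here is an assumption (it is the most common choice); the standard error of log(RR) is recovered from each 95% CI.

```python
import math

def pool_random_effects(rrs, cis):
    """Pool study-level relative risks under a DerSimonian-Laird
    random-effects model (assumed; the paper does not state its
    estimator).  Each CI is a (lower, upper) 95% interval, and
    SE(log RR) = (log(hi) - log(lo)) / (2 * 1.96).
    """
    y = [math.log(r) for r in rrs]                        # log effect sizes
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / s**2 for s in se]                            # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)               # between-study variance
    wr = [1 / (s**2 + tau2) for s in se]                  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return math.exp(pooled)

# Toy data: three hypothetical studies (not the paper's 13 articles).
print(pool_random_effects([1.3, 1.5, 1.6],
                          [(1.1, 1.54), (1.2, 1.88), (1.25, 2.05)]))
```

The pooled RR always lies between the smallest and largest study RRs; the between-study variance τ² only shifts weight toward smaller studies.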
Background. The main purpose of this study was to explore the predictive value of the systemic immune-inflammation index (SII), a novel clinical marker, in heart failure (HF) patients. Methods. Critically ill patients with HF were identified from the Medical Information Mart for Intensive Care III (MIMIC III) database. Patients were divided into three groups according to tertiles of SII (group 1, group 2, group 3). We used Kaplan-Meier curves and Cox proportional hazards regression models to evaluate the association between the SII and all-cause mortality in HF. Subgroup analysis was used to verify the predictive effect of the SII on mortality. Results. This study included 9107 patients with a diagnosis of HF from the MIMIC III database. After 30, 60, 180, and 365 days of follow-up, 25.60%, 32.10%, 41.30%, and 47.50% of the patients in group 3 had died. Using the Kaplan-Meier curve, we observed that patients with higher SII values had a shorter survival time (log-rank p < 0.001). The Cox proportional hazards regression model, adjusted for all possible confounders, indicated that the higher SII group had a higher mortality (30-day: HR = 1.304, 95% CI = 1.161-1.465; 60-day: HR = 1.266, 95% CI = 1.120-1.418; 180-day: HR = 1.274, 95% CI = 1.163-1.395; 365-day: HR = 1.255, 95% CI = 1.155-1.364). Conclusions. SII values could be used as a predictor of prognosis in critically ill patients with HF.
doi:10.1155/2022/3455372 pmid:35634435 pmcid:PMC9135558
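The abstract does not restate the index itself; by its usual definition, the SII combines three routine blood counts, which a one-line sketch makes explicit.

```python
def systemic_immune_inflammation_index(platelets, neutrophils, lymphocytes):
    """SII = platelet count x neutrophil count / lymphocyte count,
    all in 10^9 cells/L.  This is the usual definition of the index;
    the abstract itself does not restate the formula."""
    return platelets * neutrophils / lymphocytes

# Example counts (x10^9/L): platelets 250, neutrophils 6.0, lymphocytes 1.5.
print(systemic_immune_inflammation_index(250, 6.0, 1.5))  # → 1000.0
```

Tertile grouping, as in the study, then simply sorts patients by this single number.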
A data warehouse infrastructure needs to support the requirements of (daytime) ad hoc query response time and (nighttime) batch workload completion time. The following tasks need to be finished in a batch window: (1) apply one day's delta data to the base tables; (2) refresh MQTs (Materialized Query Tables) for ad hoc queries and batch workloads; (3) run batch queries. Tools are available to optimize each step; however, many factors need to be considered to improve the overall performance of a data warehouse (i.e., meeting the batch window deadline and ad hoc query response time). We have prototyped a Data Warehouse Operation Advisor to systematically study each component contributing to the batch window problem, and then perform global optimization to achieve the desired results.
dblp:conf/vldb/LiGBNMNOF07
Joins are among the most frequently executed operations. Several fast join algorithms have been developed and extensively studied; these can be categorized as sort-merge, hash-based, and index-based algorithms. While all three types of algorithms exhibit excellent performance over most data, ameliorating the performance degradation in the presence of skew has been investigated only for hash-based algorithms. However, for sort-merge join, even a small amount of skew present in realistic data can result in a significant performance hit on a commercial DBMS. This paper examines the negative ramifications of skew in sort-merge join and proposes several refinements that deal effectively with data skew. Experiments show that some of these algorithms also impose virtually no penalty in the absence of data skew and are thus suitable for replacing existing sort-merge implementations. We also show how sort-merge band join performance is significantly enhanced with these refinements.
doi:10.1145/564691.564711 dblp:conf/sigmod/LiGS02
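The source of the skew problem is visible in a minimal sort-merge equijoin sketch: runs of equal keys on both inputs must be joined with a cross product, so a heavily duplicated key blows up that inner step. This toy version gathers each run explicitly; it illustrates the baseline behavior, not one of the paper's proposed refinements.

```python
def sort_merge_join(r, s):
    """Sort-merge equijoin on (key, payload) tuples.  Runs of equal
    keys (the skewed case) are gathered on both sides and joined
    with an explicit cross product, whose cost is quadratic in the
    run lengths -- exactly where data skew hurts.
    """
    r, s = sorted(r), sorted(s)
    out, i, j = [], 0, 0
    while i < len(r) and j < len(s):
        if r[i][0] < s[j][0]:
            i += 1
        elif r[i][0] > s[j][0]:
            j += 1
        else:
            k = r[i][0]
            i2 = i
            while i2 < len(r) and r[i2][0] == k:  # run of k in r
                i2 += 1
            j2 = j
            while j2 < len(s) and s[j2][0] == k:  # run of k in s
                j2 += 1
            out.extend((k, rv, sv)
                       for _, rv in r[i:i2] for _, sv in s[j:j2])
            i, j = i2, j2
    return out

r = [(1, "a"), (2, "b"), (2, "c")]
s = [(2, "x"), (2, "y"), (3, "z")]
print(sort_merge_join(r, s))
# → [(2, 'b', 'x'), (2, 'b', 'y'), (2, 'c', 'x'), (2, 'c', 'y')]
```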
Motion monitoring by flexible strain or pressure sensors has been in the spotlight in the field of wearable electronics. Based on the triboelectric effect, the energy generated from body contact and compression during daily movement can be used both for reflecting motion status and for energy recollection. Here, we report a stretchable pressure sensor based on the triboelectric effect and dots-distributed metallic electrodes, adopting a contact-separation mode. The dots-distributed-electrode-based triboelectric nanogenerator (D-TENG) can be easily integrated with the body and clothing, such as on the skin and under the foot, to sense a broad range of activity-related strain information. The D-TENGs enable accurate detection of a broad range of pressures, from ~5 kPa to ~50 kPa, with open-circuit voltage variation from several volts to tens of volts, and thus allow monitoring of daily body activities such as joint bending, walking, and running. These devices maintain stable, high-level signal outputs even after thousands of cycles of measurement, proving their good stability. Simultaneously, the mechanical energy produced by body motions can also be recollected by the D-TENG sensor for energy harvesting. Under constant tapping by a finger (39.59 kPa), the induced voltage is sufficient to light up 15 LEDs. The stretchable D-TENG sensor thus shows great potential in motion monitoring and mechanical energy harvesting.
doi:10.1109/ojnano.2020.3019425
A query optimizer compares alternative plans in its search space to find the best plan for a given query. Depending on the search space and the enumeration algorithm, optimizers vary in their compilation time and the quality of the execution plan they can generate. We build a compilation time estimator that provides a quantified estimate of the optimizer compilation time for a given query. Such an estimator is useful for automatically choosing the right level of optimization in commercial database systems. In addition, compilation time estimates can be quite helpful for mid-query reoptimization, for monitoring the progress of workload analysis tools where a large number of queries need to be compiled (but not executed), and for judicious design and tuning of an optimizer. Previous attempts to estimate optimizer compilation complexity used the number of possible binary join sequences as the metric and overlooked the fact that each join sequence often translates into a different number of join plans because of the presence of "physical" properties. We use the number of plans (instead of join sequences) to estimate query compilation time and employ two novel ideas: (1) reusing an optimizer's join enumerator to obtain an accurate number of join sequences, but bypassing plan generation to save estimation overhead; (2) maintaining a small number of "interesting" properties to facilitate plan counting. We prototyped our approach in a commercial database system, and our experimental results show that we can achieve good compilation time estimates (less than 30% error, on average) for complex real queries, using a small fraction (within 3%) of the actual compilation time.
doi:10.1145/872757.872803 dblp:conf/sigmod/IlyasRLGL03
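To see why join-sequence counts were an attractive (if coarse) complexity metric: for a query in which every join order is feasible (a clique query graph), the number of binary join trees has a closed form. The sketch below computes it; the formula is standard combinatorics, not taken from this paper, and it ignores exactly what the paper emphasizes, namely that each sequence can yield many plans due to physical properties.

```python
from math import comb, factorial

def binary_join_sequences(n):
    """Number of distinct binary join trees over n base relations
    when every join order is feasible: n! leaf orderings times
    Catalan(n-1) tree shapes, i.e. (2(n-1))! / (n-1)!."""
    catalan = comb(2 * (n - 1), n - 1) // n  # Catalan(n-1)
    return factorial(n) * catalan

for n in (2, 3, 4, 5):
    print(n, binary_join_sequences(n))
# → 2 relations: 2 trees; 3: 12; 4: 120; 5: 1680
```

The rapid growth of this count is one reason commercial optimizers offer multiple optimization levels, which the estimator described above helps choose among.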