A copy of this work was preserved in the Wayback Machine; the capture dates from 2005.
Lecture Notes in Computer Science
In this paper, starting from a collection of training examples, we show how to produce a very compact set of classification rules. The induction idea is a clustering principle based on Kohonen's self-organizing algorithms. The function to optimize when aggregating examples into rules is a classificatory quality measure called impurity level, which was previously employed in our system FAN. The rule conditions obtained in this way are densely populated areas in the attribute space. The main goal of our system, in addition to its accuracy, is the high quality of the explanations it can attach to its classification decisions.
doi:10.1007/bfb0098207
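The induction scheme sketched in this abstract (self-organize the examples with a Kohonen map, then turn densely populated units into rules labeled by majority class, scored by an impurity level) could look roughly like the following. This is a minimal illustrative sketch: the map size, learning schedule, and the exact impurity definition are assumptions, not the FAN formulation.

```python
import numpy as np

def train_som(X, n_units=4, epochs=50, lr=0.5, seed=0):
    """Train a tiny 1-D Kohonen self-organizing map on the rows of X."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for t in range(epochs):
        alpha = lr * (1 - t / epochs)                       # decaying learning rate
        radius = max(1, int(n_units / 2 * (1 - t / epochs)))  # shrinking neighbourhood
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))     # best-matching unit
            for j in range(n_units):
                if abs(j - bmu) <= radius:
                    W[j] += alpha * (x - W[j])
    return W

def units_to_rules(W, X, y):
    """Label each unit with the majority class of the examples it attracts.
    Impurity here = fraction of mapped examples NOT of that class
    (an illustrative stand-in for the paper's impurity level)."""
    bmus = np.argmin(((X[:, None, :] - W[None]) ** 2).sum(-1), axis=1)
    rules = []
    for j in range(len(W)):
        ys = y[bmus == j]
        if len(ys) == 0:
            continue
        classes, counts = np.unique(ys, return_counts=True)
        k = np.argmax(counts)
        rules.append((W[j], classes[k], 1 - counts[k] / len(ys)))
    return rules
```

Each resulting triple (prototype, class, impurity) plays the role of one compact rule: "examples near this prototype belong to this class".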
Lecture Notes in Computer Science
Case-based information systems can be seen as lazy machine learning algorithms: they select a number of training instances and then classify unseen cases as the most similar stored instance. One of the main disadvantages of these systems is the high number of patterns retained. In this paper, a new method for extracting just a small set of paradigms from a set of training examples is presented. Additionally, we provide the set of attributes describing the representative examples that are relevant for classification purposes. Our algorithm computes the Kohonen self-organizing maps attached to the training set and then computes the coverage of each map node. Finally, a heuristic procedure selects both the paradigms and the dimensions (or attributes) to be considered when measuring similarity in future classification tasks.
doi:10.1007/bfb0098210
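The coverage-then-select step described above could be sketched as follows, assuming the map nodes and their class labels are already available. The coverage definition (training examples that fall to a node as nearest neighbour and share its class) and the top-k selection heuristic are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def node_coverage(nodes, X, y, node_labels):
    """Coverage of a map node: number of training examples whose nearest
    node it is AND whose class matches the node's label."""
    nearest = np.argmin(((X[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)
    return np.array([np.sum((nearest == j) & (y == node_labels[j]))
                     for j in range(len(nodes))])

def select_paradigms(nodes, node_labels, coverage, k):
    """Keep only the k best-covered nodes as the stored paradigms."""
    keep = np.argsort(coverage)[::-1][:k]
    return nodes[keep], node_labels[keep]

def classify(paradigms, labels, x):
    """Lazy 1-NN classification against the retained paradigms."""
    return labels[np.argmin(((paradigms - x) ** 2).sum(axis=1))]
```

Discarding low-coverage nodes is what shrinks the stored case base while keeping the nodes that actually explain the training data.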
High Performance Computing Clusters (HPCCs) are common platforms for solving both up-to-date challenges and high-dimensional problems faced by IT service providers. Nonetheless, the use of HPCCs carries a substantial and growing economic and environmental impact, owing to the large amount of energy they need to operate. In this paper, a two-stage holistic optimisation mechanism is proposed to manage HPCCs in an eco-efficient manner. The first stage logically optimises the resources of the cluster through reactive and proactive strategies, while the second stage optimises hardware allocation by leveraging a genetic fuzzy system tailored to the underlying equipment. The model finds optimal trade-offs among quality of service, direct/indirect operating costs, and environmental impact through multiobjective evolutionary algorithms meeting the preferences of the administrator. Experimentation was carried out using both actual workloads from the Scientific Modelling Cluster of the University of Oviedo and synthetically generated workloads, showing statistical evidence supporting the adoption of the new mechanism.
doi:10.3390/en12112129
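The trade-off search mentioned here rests on Pareto dominance among candidate configurations. A minimal sketch of non-dominated filtering follows; the tuple layout (QoS penalty, cost, environmental impact, all minimized) is an assumption, and the evolutionary machinery of the paper is not reproduced.

```python
def dominates(a, b):
    """a dominates b if it is at least as good everywhere and strictly
    better in at least one objective (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective tuples,
    e.g. (qos_penalty, cost, co2)."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]
```

A multiobjective evolutionary algorithm repeatedly applies this kind of filter to its population; the administrator then picks one point from the final front according to their preferences.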
Except for a few cases, nowadays it is very common to find a camera embedded in a consumer-grade laptop, notebook, mobile internet device (MID), mobile phone or handheld game console. Some of them also have a Graphics Processing Unit (GPU) to handle 3D graphics and other related tasks. This trend will probably continue in the near future, and the camera+GPU pair will become more and more frequent in the market. Because of this, the proposal of this work is to use these resources to build a low-cost software-based 3D Human Interface Device (3D HID) able to run on this kind of device in real time without degrading overall performance. This is achieved by implementing a parallel version of an existing optical flow algorithm that runs fully on the GPU without using it at full power. In this way, usual graphic processes coexist with optical flow computations. To the best of the authors' knowledge, this approach (a software-based 3D HID that runs fully on a GPU) is found neither in academic research nor in commercial product prototypes. Indeed, this is the salient contribution of this paper. The performance of the proposal is good enough to achieve real time on low-grade computers.
doi:10.3233/ica-2011-0384
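The abstract does not say which optical flow algorithm is parallelized, so as an illustration only, here is the classic single-window Lucas-Kanade least-squares estimate in NumPy, a CPU stand-in for the kind of per-patch computation that maps well onto a GPU.

```python
import numpy as np

def lucas_kanade_flow(I1, I2):
    """Estimate one (vx, vy) translation between two small grayscale
    patches via the Lucas-Kanade least-squares equations:
    minimize || Ix*vx + Iy*vy + It ||."""
    I1 = I1.astype(float)
    I2 = I2.astype(float)
    Ix = np.gradient(I1, axis=1)      # spatial derivative, x (columns)
    Iy = np.gradient(I1, axis=0)      # spatial derivative, y (rows)
    It = I2 - I1                      # temporal derivative
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                          # array([vx, vy])
```

On a GPU, each output pixel's window solves this tiny 2x2 system independently, which is why the computation coexists cheaply with ordinary rendering work.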
doi:10.1006/ijhc.2002.1002
Lecture Notes in Computer Science
There we show that machine learning systems like C4.5 [Quinlan, 93] or our Abanico [Ranilla, Bahamonde, 95] can be improved on noisy well-known problems. ... The learning algorithms used here to produce rules to be self-organized by our algorithm were C4.5 [Quinlan, 93] and Abanico [Ranilla, Bahamonde, 95]. ...
doi:10.1007/bfb0032513
As data and supercomputing centres increase their performance to improve service quality and target more ambitious challenges every day, their carbon footprint also continues to grow and has already reached the magnitude of the aviation industry. In addition, high power consumption is becoming a remarkable bottleneck for the expansion of these infrastructures in economic terms, due to the unavailability of sufficient energy sources. A substantial part of the problem is caused by the current energy consumption of High Performance Computing (HPC) clusters. To alleviate this situation, we present in this work EECluster, a tool that integrates with multiple open-source Resource Management Systems to significantly reduce the carbon footprint of clusters by improving their energy efficiency. EECluster implements a dynamic power management mechanism based on Computational Intelligence techniques, learning a set of rules through multi-criteria evolutionary algorithms. This approach enables cluster operators to find the optimal balance between a reduction in cluster energy consumption, service quality, and the number of reconfigurations. Experimental studies using both synthetic and actual workloads from a real-world cluster support the adoption of this tool to reduce the carbon footprint of HPC clusters.
doi:10.3390/en9030197
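The rules such a dynamic power manager learns decide when nodes should be switched on or off. The sketch below shows the *shape* of one such rule; the thresholds and feature names are purely illustrative, not the ones EECluster's evolutionary algorithm actually learns.

```python
def power_decision(idle_secs, pending_jobs, free_slots,
                   idle_off_threshold=600):
    """Toy on/off rule of the kind a cluster power manager might learn.
    All thresholds here are illustrative assumptions."""
    if pending_jobs > free_slots:
        return "power_on_node"        # demand exceeds ready capacity
    if idle_secs > idle_off_threshold and pending_jobs == 0:
        return "power_off_node"       # sustained idleness, empty queue
    return "keep"                     # avoid needless reconfigurations
```

The multi-criteria part of the problem is visible even in this toy: a lower idle threshold saves more energy but causes more reconfigurations and risks queueing delays, which is exactly the balance the tool optimizes.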
This work proposes an approach to collaborative tag recommendation based on a machine learning system for probabilistic regression. The goal of the method is to support users of current social network systems by providing a ranking of new meaningful tags for a resource. The system provides a ranked tag set and feeds on different posts depending on the resource for which the recommendation is requested and on the user who requests it. Different kinds of collaboration among users and resources are introduced. That collaboration adds to the training set additional posts carefully selected according to the interaction among users and/or resources. Furthermore, a selection of posts using scoring measures is also proposed, including a penalization of the oldest posts. The performance of these approaches is evaluated with F1 computed over at most the first five tags of the ranking, which is the evaluation measure proposed in the ECML PKDD Discovery Challenge 2009. The experiments were carried out over two different kinds of data sets from the BibSonomy folksonomy, core and non-core, reaching a performance of 26.25% for the former and 6.98% for the latter.
dblp:conf/pkdd/MontanesQDR09
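The evaluation measure described (F1 restricted to the first five recommended tags) can be written down directly; this is a plain sketch of F1@5, not code from the challenge.

```python
def f1_at_k(recommended, relevant, k=5):
    """F1 over the top-k recommended tags, in the style of the
    ECML PKDD Discovery Challenge 2009 evaluation (sketch)."""
    top = list(recommended)[:k]
    hits = len(set(top) & set(relevant))
    if hits == 0:
        return 0.0
    precision = hits / len(top)
    recall = hits / len(relevant)
    return 2 * precision * recall / (precision + recall)
```

Because only five positions count, a recommender is rewarded for putting its best guesses first rather than for recalling every plausible tag.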
The purpose of this research is to describe how virtual learning environments that integrate 3D simulations favour students' communication, interaction and collaboration. The research uses a qualitative methodology based on systemic observation. The results show that, through the experience in 3D simulation environments, students become active agents in the construction of their learning process. 3D simulation environments with pedagogical mediation favour students' communication and interaction in the context of the proposed pedagogical sequence, promoting student learning and generating two types of communication: interaction between a participant and a 3D object, and interaction between participants, which develops more spontaneously and in greater quantity.
doi:10.35362/rie722102
This special issue collects research papers selected among those presented at the second minisymposium "HPC applied to Computational Problems in Science and Engineering", held in June 2010 in Almeria, Spain. The workshop was a special event organized within the framework of the "10th International Conference on Computational and Mathematical Methods in Science and Engineering". The papers in the issue can be classified into three main groups: parallel linear algebra algorithms, parallelization of applications, and tools and environments for parallel programming. These papers deal with important computations in the linear algebra domain, for example using graphics processors to accelerate the computation of the matrix inverse, or a parallel Python library for nonlinear systems. Science and engineering are an endless source of complexity. Therefore, it is not a surprise that so many researchers dedicate their time to developing parallel solvers or specific hardware designs for these applications. Along this line, a heterogeneous sample of problems is analyzed in the issue, for example a heterogeneous parallel solution for the fast multipole method, the analysis of magnetic resonance imaging using GPUs, or a many-core solution for real-time massive convolution for audio applications.
doi:10.1007/s11227-011-0630-4
The objective of this study was to analyze the chemical composition, in vitro ruminal fermentation, and intestinal digestibility of discarded samples of four Brassica vegetables: Brussels sprouts (BS), white cabbage, Savoy cabbage, and red cabbage, and to assess the effects of including increasing amounts of BS in the concentrate of a dairy sheep diet on in vitro fermentation, CH4 production, and in situ degradation of the diets. All cabbages had low dry matter content (DM; <16.5%), but their DM had high crude protein (19.5–24.8%) and sugar (27.2–41.4%) contents and low neutral detergent fiber (17.5–28%), and was rapidly and extensively fermented in the rumen. Rumen degradability of protein at 12 h of in situ incubation was greater than 91.5% for all cabbages, and in vitro intestinal digestibility of protein ranged from 61.4 to 90.2%. Replacing barley, corn, and soybean meal with 24% dried BS in the concentrate of a diet for dairy sheep (40:60 alfalfa hay:concentrate) increased in vitro diet fermentation and in situ degradability of DM and protein, and reduced the in vitro CH4/total volatile fatty acid ratio. In vivo trials are necessary to confirm these results.
doi:10.3390/ani9090588 pmid:31438498 pmcid:PMC6770265
Understanding the interactions between hydrogen producers and consumers in the rumen ecosystem is important for ruminant production and methane mitigation. The present study explored the relationships between rumen protozoa, methanogens and fermentation characteristics. A total of six donor sheep harbouring (F, faunated) or not (D, defaunated) protozoa in their rumens (D animals were kept without protozoa for a period of a few months (D−) or for more than 2 years (D+)) were used in in vitro and in vivo experiments. In vitro, the absence of protozoa decreased NH3 and butyrate production and had no effect on methane. In contrast, the liquid-associated bacterial and methanogen fraction of D+ inocula produced more methane than D− and F inocula (P<0.05). In vivo fermentation parameters of donor animals showed the same trend for NH3 and butyrate and showed that D+ animals were high methane emitters, while D− were the lowest (−35%). The concentration of dissolved dihydrogen measured after feeding followed the opposite trend. Methane emissions did not correlate with the relative abundance of methanogens in the rumen measured by quantitative PCR, but there was a trend towards a higher methanogen concentration in the solid-associated population of D+ animals compared with D− animals. In contrast, PCR-denaturing gradient gel electrophoresis profiles of the methanogens' methyl coenzyme-M reductase A gene showed a clear clustering in liquid-associated fractions for all three groups of donors but fewer differences in solid-associated fractions. These results show that the absence of protozoa may affect the methanogen community and methane emissions in wethers differently. Abbreviations: D, defaunated; D−, medium-term defaunation (6–12 weeks); D+, long-term defaunation (more than 2 years); D−centr, inocula obtained from centrifugation of rumen fluids from defaunated animals (6–12 weeks); D+centr, inocula obtained from centrifugation of rumen fluids from defaunated animals (more than 2 years); DGGE, denaturing gradient gel electrophoresis; DMD, DM degradation; F, faunated; Fcentr, inocula obtained from centrifugation of rumen fluids from faunated animals; VFA, volatile fatty acid.
doi:10.1017/s0007114511002935 pmid:21762544
It is well known that checkerboard partitioning can exploit more concurrency than striped partitioning, because the matrix computation can be divided among more processors than in the case of striping. In this work we analyze the performance of the Neville method when a checkerboard partitioning is used, focusing on the special case of block-cyclic checkerboard partitioning. This method is an alternative to Gaussian elimination and has been proved to be very useful for some classes of matrices, such as totally positive matrices. The performance of this parallel system is measured in terms of efficiency (the fraction of time for which a processor is usefully employed), which in our model is close to one when the optimum block size is used. We have also executed our algorithms on a parallel PC cluster, observing that both efficiencies (theoretical and empirical) are quite similar.
doi:10.1016/j.laa.2003.11.028
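For readers unfamiliar with the Neville method: unlike Gaussian elimination, it zeroes each column using combinations of *adjacent* rows, which is numerically safe without pivoting for totally positive matrices. A minimal sequential sketch follows; the paper's contribution, the block-cyclic parallel distribution of this computation, is not reproduced here.

```python
import numpy as np

def neville_elimination(A):
    """Sequential Neville elimination: reduce A to upper triangular form
    by subtracting a multiple of the row immediately above, working
    bottom-up in each column. Assumes no zero 'pivots' appear, which
    holds for totally positive matrices."""
    U = A.astype(float).copy()
    n = len(U)
    for j in range(n - 1):
        for i in range(n - 1, j, -1):          # bottom-up within column j
            if U[i, j] != 0.0:
                U[i, :] -= (U[i, j] / U[i - 1, j]) * U[i - 1, :]
    return U
```

Because every update only touches two adjacent rows, the method has a regular data-flow pattern, which is what makes block-cyclic checkerboard distributions attractive for it.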
The potential of broccoli wastes (florets and stems) as ruminant feed was analyzed using in vitro and in situ techniques. Both stems and florets had high moisture content (90.6 and 86.1%, respectively), but the stems contained (% dry matter) lower levels (p < 0.05) of crude protein (CP; 23.2 vs. 30.8%) and ether extract (2.91 vs. 6.15%) and tended to have greater sugar content (p = 0.071; 33.4 vs. 19.6%) than florets. Stems had greater in vitro dry matter rumen degradability (45.3%; 24 h incubation) and lower in vitro CP intestinal digestibility (82.7%) compared with florets (42.2 and 90.1%, respectively). Rumen degradability of protein was high (>85%) for both fractions. In a second experiment, diets including different proportions of broccoli were formulated and fermented in vitro. The replacement of 24% of conventional feeds (wheat, soybean meal and wheat bran) in a concentrate by dried broccoli increased the amount of organic matter fermented in vitro and the NH3-N concentrations of a mixed diet including 40% of the concentrate. Including dried broccoli in the diet produced only small modifications in the volatile fatty acid profile and did not affect CH4 emission.
doi:10.3390/ani10111989 pmid:33137999