A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
Enterprise risk management (ERM) is an integrated approach to managing the risks of companies. Despite the wide adoption of ERM into companies' organizational processes, there are neither clear standards for ERM nor well-grounded theories about its outcomes. This paper gives an overview of scientific research on the topic of ERM by comparing recent academic papers that focus on ERM in the context of performance evaluation or effectiveness, categorizing and evaluating each source against its limitations. The limitations are used to develop a unified view on the question of how ERM influences the performance of organizations. This also involves aspects of measuring the current status of ERM within companies on the one side and the effects of ERM on the other side. This paper shows that there is diversity in the scientific literature on how to measure performance in the ERM context. The authors identify reasons for this and suggest approaches to solve the problem by identifying best-practice approaches and a generic framework on how to use them to improve ERM assessment in practice as well as research.
doaj:41014a4429d847faadd235d792a05835
Publication in the conference proceedings of EUSIPCO, Aalborg, Denmark, 2010
doi:10.5281/zenodo.41880
In this paper the effect of unreliable buffer memory on Turbo equalization is investigated. Under the assumption that the resulting bit errors are uniformly and independently distributed at the receive and LLR buffer outputs, the effect of unreliable memory on the channel capacity is analyzed. It is further shown that the extrinsic information transfer and bit error rate (BER) performance of a conventional Turbo Equalizer are heavily degraded. Based on these observations a fault tolerant (FT) Turbo Equalizer is derived, which accounts for the memory error characteristics by using a modified transition metric in the involved MAP equalizer and decoder. It is demonstrated that the FT Turbo Equalizer can effectively compensate memory errors and thus yields a significantly improved BER performance.
doi:10.1109/vetecf.2011.6092880
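The reliability loss caused by independently flipped stored bits can be illustrated on a single hard bit. The following is a minimal sketch (not the paper's capacity analysis): the log-likelihood ratio of a bit after it passes through a binary symmetric channel with flip probability p, as one might model one cell of an unreliable buffer:

```python
from math import exp, log

def bsc_degraded_llr(llr: float, p: float) -> float:
    """LLR of a bit after storage in a memory cell that flips it with
    probability p (binary symmetric channel model).

    With L = log(P(x=0)/P(x=1)), the stored bit y satisfies
    P(y=0) = (1-p)*P(x=0) + p*P(x=1), which gives the expression below.
    """
    return log(((1.0 - p) * exp(llr) + p) / (p * exp(llr) + (1.0 - p)))
```

For p = 0 the LLR is unchanged, and for p = 0.5 all reliability information is lost (the LLR collapses to 0), consistent with the heavy degradation reported above for an unprotected buffer.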
Enterprise Risk Management (ERM) includes processes and methodologies for organizations to manage companies' risks. Due to the increase in environmental complexity, a standardized approach to managing risks and opportunities is not only useful, but absolutely essential for business continuation – the way organizations deal with such risks plays a key role for success and can be seen in the company's overall performance. In a previous paper the authors suggested a model for ERM assessment. In order to prove this model, this paper presents a case study of a production company with a working ERM to evaluate the model based on a real example. The results demonstrate that the suggested model to assess ERM and its performance is practically usable by organizations and might be further extended in future studies.
doaj:582b1fd4fce04208a7f20147af82ad74
2008 IEEE International Symposium on Wireless Communication Systems
A block processing approach for decoding of convolutional codes is proposed. The approach is based on the fact that for Scarce-State-Transition decoding and syndrome decoding it is possible to determine the probability of a certain trellis state before the actual decoding happens. This allows the separation of the received sequence into independent blocks with known initial and final states, thus making overlapping or modifications of the encoder or the information stream unnecessary. The proposed scheme offers potential for both parallelization and reduction of power consumption.
doi:10.1109/iswcs.2008.4726115
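The independence property can be illustrated with a toy hard-decision Viterbi decoder (not the Scarce-State-Transition or syndrome decoder of the paper): assuming the rate-1/2 (7,5)_8 code and that the trellis states at the block boundaries are known, each block of the received sequence can be decoded on its own, e.g. on separate cores.

```python
def conv_encode(bits, state=(0, 0)):
    """Rate-1/2 convolutional encoder, generators (7,5)_8; returns the
    code bits and the final encoder state (= last two input bits)."""
    out = []
    s1, s2 = state
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out, (s1, s2)

def viterbi_block(rx, start, end):
    """Hard-decision Viterbi over one block with known initial and
    final trellis states; returns the decoded information bits."""
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    INF = float("inf")
    metric = {s: (0 if s == start else INF) for s in states}
    paths = {s: [] for s in states}
    for t in range(0, len(rx), 2):
        r0, r1 = rx[t], rx[t + 1]
        new_metric = {s: INF for s in states}
        new_paths = {s: [] for s in states}
        for (s1, s2), m in metric.items():
            if m == INF:
                continue
            for u in (0, 1):
                branch = (u ^ s1 ^ s2 != r0) + (u ^ s2 != r1)
                nxt = (u, s1)
                if m + branch < new_metric[nxt]:
                    new_metric[nxt] = m + branch
                    new_paths[nxt] = paths[(s1, s2)] + [u]
        metric, paths = new_metric, new_paths
    return paths[end]

# Split the received sequence at a position where the state is known
# and decode both halves independently -- no overlap is needed.
info = [1, 0, 1, 1, 0, 0, 1, 0]
first, mid_state = conv_encode(info[:4], (0, 0))
second, end_state = conv_encode(info[4:], mid_state)
decoded = viterbi_block(first, (0, 0), mid_state) + \
          viterbi_block(second, mid_state, end_state)
```

Because each call to `viterbi_block` touches only its own block and boundary states, the calls are trivially parallelizable, which is the property the paper exploits.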
Organizations today have to position themselves correctly in the market in order to survive in the competitive landscape. Corporate strategy often shapes the way the operational business is executed and therefore plays an important role for the internal structure of an organization. But is the opposite also true? The goal of this study is to identify relationships between the internal organization of a company and the overall goals and strategic settings of the enterprise. This research empirically compares the use of standardized management systems, the developed levels of knowledge management, and the strategic goals of car manufacturers. The main question is how the use of certain methodologies and best-practice approaches (TQM, ISO-based management systems, etc.) influences the strategies of an organization. The results show that there is a correlation between the internal organization and the strategies of corporations: internal methodologies and processes are defined according to strategic decisions of an organization, while internal processes and methodologies in turn have an effect on the strategy as well. This study also suggests a holistic framework to assess this effect on strategy.
doaj:9231069a0ded4c81b0cf2c31b4a6c252
A new syndrome trellis based decoding approach for Turbo coded data is described in this paper. Based on estimating error symbols instead of code symbols, it inherently features options for reducing the computational complexity of the decoding process. After deriving the required transition metrics, numerical results in terms of block error rate and required equivalent iterations are presented to demonstrate the efficiency of the approach.
doi:10.1109/isit.2012.6283938
In this paper syndrome former trellis construction for punctured convolutional codes (PCCs) is discussed. The class of (np − 1)/np-rate PCCs derived from 1/n-rate mother codes is considered. It is shown that a syndrome former trellis of the PCC with the same complexity as the encoder trellis of the mother code can always be constructed. An explicit construction of the syndrome former trellis can be beneficial when realizing adaptive complexity-reduced decoding, or decoding with reduced state transitions, based on the syndrome decoding approach.
doi:10.1109/isita.2010.5650115
Decoding of Turbo Codes requires buffer memories to store the received values and the extrinsic information that is exchanged between the constituent decoders. In this paper, the effect of unreliable buffer memories on the decoding performance is analyzed. The buffer is modeled as a discrete memoryless channel which introduces spatially independent and uniform bit errors on the binary representation of the stored values. This leads to a strong performance degradation if a conventional Turbo decoding algorithm is employed. It is however shown that suitable modification of the quantizer, the index assignment, and the transition metrics of the MAP algorithm can effectively compensate for these errors.
doi:10.1109/istc.2012.6325236
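The buffer model can be sketched as follows (the quantizer parameters and the natural two's-complement index assignment here are generic placeholders, not the modified designs of the paper): a stored LLR is quantized to a b-bit index, and each index bit is flipped independently with probability p.

```python
import random

BITS, STEP = 6, 0.25  # assumed quantizer parameters, not from the paper

def quantize(llr, bits=BITS, step=STEP):
    """Uniform quantizer; returns the b-bit two's-complement index."""
    q = round(llr / step)
    q = max(-(1 << (bits - 1)), min((1 << (bits - 1)) - 1, q))
    return q & ((1 << bits) - 1)

def store(index, p, bits=BITS, rng=random):
    """Discrete memoryless buffer: flips each stored bit independently
    with probability p."""
    for b in range(bits):
        if rng.random() < p:
            index ^= 1 << b
    return index

def dequantize(index, bits=BITS, step=STEP):
    """Map the two's-complement index back to an LLR value."""
    if index >= 1 << (bits - 1):
        index -= 1 << bits
    return index * step
```

With p = 0 the buffer is transparent; already a moderate p corrupts high-order bits of small LLRs, which is why the paper modifies the quantizer, the index assignment, and the MAP transition metrics rather than treating the buffer as error-free.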
2010 6th International Symposium on Turbo Codes & Iterative Information Processing
Turbo equalization is a powerful method to iteratively detect and decode convolutionally encoded data that is corrupted by intersymbol interference (ISI) and Gaussian noise. It is based on the exchange of reliability information between the equalizer and the decoder, which is typically some sort of maximum a posteriori (MAP) decoder. While the number of remaining errors in the received sequence decreases during the iteration process, the computational effort for decoding remains unchanged in each iteration. In this paper a syndrome based MAP decoder is proposed that is capable of reducing the computational decoding effort during the iteration process without significantly influencing the convergence behavior.
doi:10.1109/istc.2010.5613804
Humanized mice have been critical for HIV-1 research, but are inefficient at mucosal HIV-1 transmission. We present a fetal tissue-independent model called CD34T+ with enhanced human leukocyte levels in the blood and improved T cell homing to the gut-associated lymphoid tissue. CD34T+ mice are highly permissive to intra-rectal HIV-1 infection, which can be prevented by infusion of broadly neutralizing antibodies. Therefore, CD34T+ mice provide a novel platform for mucosal HIV-1 transmission and prevention studies.
doi:10.1101/2020.08.31.274167
This paper investigates the implementation of a trellis based syndrome decoder on symmetric multiprocessor (SMP) platforms. Two advantages of the proposed approach are exposed: First, compared to conventional parallel Viterbi decoder implementations, the syndrome decoder achieves a higher parallel efficiency in terms of speedup on the SMP platform. This is realized by reducing the computational overhead of the parallel Viterbi algorithm implementation. Second, it offers an adaptive complexity, i.e. the number of decoding operations decreases with improving transmission conditions. This property can be exploited to reduce the average energy consumption of a radio receiver. Measurement results are shown for two SMP platforms: ARM's MPCore and Intel's Core i7.
doi:10.1109/iswcs.2010.5624395
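Parallel efficiency as used above is the measured speedup divided by the number of cores; a minimal helper (the timing numbers below are made up for illustration):

```python
def parallel_efficiency(t_serial, t_parallel, cores):
    """Speedup and parallel efficiency of a multicore run."""
    speedup = t_serial / t_parallel
    return speedup, speedup / cores

# Hypothetical timings: a decoder taking 10 ms serially and 2.9 ms on
# 4 cores reaches a speedup of ~3.45, i.e. ~86% parallel efficiency.
speedup, eff = parallel_efficiency(10.0, 2.9, 4)
```

An efficiency near 1.0 means the parallel overhead (here, the overlap or synchronization a conventional parallel Viterbi implementation needs) is negligible.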
In this paper a syndrome based Low SNR early termination (Low SNR ET) scheme for Turbo decoding is presented. The scheme is based only on hard decision binary computations and is therefore easy to realize. While Low SNR ET is a general principle which can be applied in various scenarios, the focus of this work is on its application in the Long Term Evolution (LTE) system. It is shown that Low SNR ET is particularly effective for typical LTE scenarios and that the proposed scheme can reduce the decoding iterations by about one third.
doi:10.1109/iswcs.2011.6125303
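A hard-decision stopping test can be sketched as follows (a plausible construction, not necessarily the paper's exact rule): re-encode the hard decisions of the information bits with a constituent recursive systematic encoder and count mismatches against the hard decisions of the received parity. An all-zero syndrome suggests convergence; a syndrome weight that stops shrinking suggests a low-SNR block on which further iterations are wasted.

```python
def rsc_parity(info):
    """Parity bits of a rate-1/2 RSC encoder with feedback 7, forward 5
    (octal). The LTE constituent code uses 13/15; this smaller code
    keeps the sketch short."""
    s1, s2 = 0, 0
    parity = []
    for u in info:
        a = u ^ s1 ^ s2          # feedback bit
        parity.append(a ^ s2)    # forward polynomial 1 + D^2
        s1, s2 = a, s1
    return parity

def syndrome_weight(info_hard, parity_hard):
    """Number of positions where re-encoded parity disagrees with the
    hard-decided received parity -- a pure binary computation."""
    return sum(p != q for p, q in zip(rsc_parity(info_hard), parity_hard))

def stop_early(weights, patience=2):
    """Terminate if the code constraints are satisfied (weight 0) or if
    the syndrome weight has not improved for `patience` iterations."""
    if weights and weights[-1] == 0:
        return True
    return len(weights) > patience and \
        min(weights[-patience:]) >= min(weights[:-patience])
```

The decoder would append the current syndrome weight to `weights` after each half-iteration and call `stop_early`, so no soft-value processing is needed for the termination decision.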
Several studies have revealed considerable differences between self-assessments and proxy ratings (Geldmacher 2008). ...
doi:10.1007/s10389-009-0294-1
Humanized mice are critical for HIV-1 research, but humanized mice generated from cord blood are inefficient at mucosal HIV-1 transmission. Most mucosal HIV-1 transmission studies in mice require fetal tissue engraftment, the use of which is highly restricted or prohibited. We present a fetal tissue-independent model called CD34T+ with enhanced human leukocyte levels in the blood and improved T cell homing to the gut-associated lymphoid tissue. CD34T+ mice are highly permissive to intra-rectal HIV-1 infection and also show normal env diversification in vivo despite high viral replication. Moreover, mucosal infection in CD34T+ mice can be prevented by infusion of broadly neutralizing antibodies. CD34T+ mice can be rapidly and easily generated using only cord blood cells and do not require any complicated surgical procedures for the humanization process. Therefore, CD34T+ mice provide a novel platform for mucosal HIV-1 transmission studies as well as rapid in vivo testing of novel prevention molecules against HIV-1.
doi:10.3390/vaccines9030198 pmid:33673566