Algorithm and Architecture for a Low-Power Content-Addressable Memory Based on Sparse Clustered Networks

Hooman Jarollahi, Vincent Gripon, Naoya Onizawa, Warren J. Gross
2015, IEEE Transactions on Very Large Scale Integration (VLSI) Systems
We propose a low-power content-addressable memory (CAM) employing a new algorithm for associativity between the input tag and the corresponding address of the output data. The proposed architecture is based on a recently developed sparse clustered network using binary connections that, on average, eliminates most of the parallel comparisons performed during a search. Therefore, the dynamic energy consumption of the proposed design is significantly lower compared with that of a conventional low-power CAM design. Given an input tag, the proposed architecture computes a few possibilities for the location of the matched tag and performs the comparisons on them to locate a single valid match. TSMC 65-nm CMOS technology was used for simulation purposes. Following a selection of design parameters, such as the number of CAM entries, the energy consumption and the search delay of the proposed design are 8% and 26%, respectively, of those of the conventional NAND architecture, with a 10% area overhead. A design methodology based on silicon area and power budgets, and on performance requirements, is discussed.

Index Terms: Associative memory, content-addressable memory (CAM), low-power computing, recurrent neural networks, sparse clustered networks (SCNs).

Hooman Jarollahi was a Visiting Scholar with the Research Institute of Electrical Communication, Tohoku University, Sendai, Japan, from 2012 to 2013. His current research interests include the design and hardware implementation of energy-efficient and application-specific VLSI systems, such as associative memories and content-addressable memories. Mr. Jarollahi was a recipient of the Teledyne DALSA Award in 2010, for which he presented a patented architecture of a power- and area-efficient SRAM.
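The key idea in the abstract, using a sparse clustered network to narrow a search down to a few candidate locations before doing exact comparisons, can be illustrated in software. The sketch below is only an illustration of the principle, not the paper's hardware architecture: the class name, field sizes, and set-based "binary connections" are all assumptions chosen for clarity.

```python
from collections import defaultdict

class SCNCam:
    """Toy sketch of SCN-style CAM lookup (illustrative, not the paper's
    circuit). The tag is split into sub-fields; each (cluster, value)
    pair keeps a binary connection set of stored addresses. A search
    intersects those sets, leaving only a few candidates to compare
    exactly, which is what eliminates most parallel comparisons."""

    def __init__(self, clusters=4, bits_per_cluster=4):
        self.c = clusters
        self.b = bits_per_cluster
        self.conn = defaultdict(set)   # (cluster, sub-tag value) -> {addresses}
        self.table = {}                # address -> full stored tag

    def _fields(self, tag):
        # Split the tag into c sub-fields of b bits each.
        mask = (1 << self.b) - 1
        return [(i, (tag >> (i * self.b)) & mask) for i in range(self.c)]

    def write(self, tag, addr):
        self.table[addr] = tag
        for key in self._fields(tag):
            self.conn[key].add(addr)

    def search(self, tag):
        # SCN "decoding": intersect the connection sets of the active neurons.
        cands = None
        for key in self._fields(tag):
            s = self.conn.get(key, set())
            cands = s.copy() if cands is None else cands & s
            if not cands:
                return None
        # Exact comparison only on the few surviving candidates.
        for a in sorted(cands):
            if self.table[a] == tag:
                return a
        return None
```

In a sparsely filled table the intersection usually shrinks to one or two addresses, so only those entries need a full tag comparison; the hardware analogue is activating only a few match lines instead of all of them.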
doi:10.1109/TVLSI.2014.2316733