cache

(redirected from Cache hit)
Related to Cache hit: Cache miss

Synonyms for cache

verb
  • to put or keep out of sight
  • save up as for future use

noun
  • a hidden storage space (for money or provisions or weapons)
  • a secret store of valuables or money
  • (computer science) RAM memory that is set aside as a specialized buffer storage that is continually updated
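
As a hedged illustration of the computer-science sense above, the Python sketch below models a small cache kept in front of a slower backing store: a lookup that finds the key already cached is a cache hit, otherwise it is a miss and the value is fetched and kept for next time. The class and variable names are illustrative assumptions, not taken from any particular library.

# Minimal sketch of the computer-science sense of "cache": fast storage
# kept in front of a slower backing store. All names are illustrative.
class SimpleCache:
    def __init__(self, backing_store):
        self.backing_store = backing_store  # slower source of truth
        self.store = {}                     # fast in-memory copies
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self.store:               # cache hit: data already cached
            self.hits += 1
            return self.store[key]
        self.misses += 1                    # cache miss: go to the backing store
        value = self.backing_store[key]
        self.store[key] = value             # keep a copy for future requests
        return value

cache = SimpleCache(backing_store={"a": 1, "b": 2})
cache.get("a")                   # miss: first access fetches from the backing store
cache.get("a")                   # hit: the same data is already cached
print(cache.hits, cache.misses)  # 1 1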

References in periodicals archive
(3) Our simulation results show that in the case of a network operator and multiple content providers, our approach can improve the cache hit rate while balancing the interests of ICPs.
Equation 4 shows the improvement factor of the memory hierarchy whenever there is a cache hit for a one-level cache memory.
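
The excerpt's Equation 4 is not reproduced on this page; as a hedged sketch only, the usual single-level formulation computes the average access time of the hierarchy from the hit rate and then the improvement factor over main memory alone. The numbers below are illustrative assumptions, not values from the cited work.

# Hedged sketch of a standard one-level cache model (not the paper's Equation 4).
# Illustrative assumptions: 1 ns cache hit, 100 ns main-memory access, 95% hit rate.
hit_time = 1.0        # ns, access time on a cache hit
memory_time = 100.0   # ns, access time when main memory must be used
hit_rate = 0.95

average_access_time = hit_rate * hit_time + (1 - hit_rate) * memory_time
improvement_factor = memory_time / average_access_time  # speedup over no cache

print(average_access_time)  # 5.95 ns
print(improvement_factor)   # about 16.8x
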
If a cache hit occurs, the reference bit of the requested data unit is set to 1.
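
Setting a reference bit on a hit is typical of clock-style (second-chance) replacement, where a later sweep clears the bits and evicts entries whose bit is still 0. The sketch below is a hedged illustration of that behaviour only; the dictionary layout and function names are assumptions, not taken from the cited work.

# Hedged sketch: on a cache hit, mark the requested data unit as recently
# referenced, as in clock-style (second-chance) replacement. Names are assumed.
cache = {}  # key -> {"value": ..., "ref_bit": 0 or 1}

def access(key, fetch):
    entry = cache.get(key)
    if entry is not None:      # cache hit
        entry["ref_bit"] = 1   # reference bit of the requested data unit set to 1
        return entry["value"]
    value = fetch(key)         # cache miss: load the data unit
    cache[key] = {"value": value, "ref_bit": 0}
    return value

access("x", fetch=str.upper)   # miss: inserted with ref_bit 0
access("x", fetch=str.upper)   # hit: ref_bit set to 1
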
The cache hit ratios of various replacement policies with pre-fetching are shown in Figure 5.
Thus, when a response varies on the User-agent header we can only get a cache hit for clients running the exact same version of the browser, on the same operating system.
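
Concretely, an HTTP cache that honours Vary: User-Agent includes the request's User-Agent value in its cache key, so any difference in browser version or operating system yields a different key and therefore a miss. The sketch below is a hedged illustration; the function and header values are assumptions.

# Hedged sketch: a cache key for a response with "Vary: User-Agent" includes the
# request's User-Agent string, so only an identical string can produce a hit.
def cache_key(url, request_headers, vary_headers=("User-Agent",)):
    varied = tuple(request_headers.get(h, "") for h in vary_headers)
    return (url,) + varied

key_a = cache_key("/page", {"User-Agent": "Firefox/119.0 (Windows NT 10.0)"})
key_b = cache_key("/page", {"User-Agent": "Firefox/120.0 (Windows NT 10.0)"})
print(key_a == key_b)  # False: a minor version difference already prevents a cache hit
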
This coordination complicates cooperative caching; not only must clients coordinate to provide traditional caching functions such as block lookup, replacement, and consistency, but they must also coordinate to manage the size of the cooperative cache so that local cache hit rates are not affected.
Despite significantly increasing cache contention and reducing overall cache space, prefetching into the primary cache resulted in higher cache hit rates, which proved to be the dominant performance factor.
Hence if we had an infinitely large cache, every reuse would result in a cache hit. In practice, however, reuse does not necessarily result in a cache hit.
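
A hedged sketch of why reuse does not guarantee a hit in a finite cache: with an LRU cache that holds only two blocks, a reused block can already have been evicted by the accesses in between, whereas an infinitely large cache would still hold it. The class below is illustrative, not from the cited work.

# Hedged sketch: reuse that misses in a finite cache. With capacity 2, block "a"
# is evicted before it is reused, so the reuse is a miss rather than a hit.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()

    def access(self, block):
        if block in self.blocks:             # hit: refresh recency
            self.blocks.move_to_end(block)
            return "hit"
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)  # evict the least recently used block
        self.blocks[block] = True
        return "miss"

cache = LRUCache(capacity=2)
print([cache.access(b) for b in ["a", "b", "c", "a"]])
# ['miss', 'miss', 'miss', 'miss'] -- the reuse of "a" misses because it was
# already evicted; an infinitely large cache would have turned it into a hit.
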
This effect was not detected in Wilkinson and Neimat [1990] because that study used a probabilistic cache model that assumed that cache hit probabilities were independent of cache size.
In 2014, Prime+Probe was developed into a more powerful attack, Flush+Reload [7], which requires memory deduplication to infer cache hits and cache misses.
The read operation will create either a cache hit or a cache miss event.
Among these, video segments that are requested frequently enough to exceed the threshold (NOR_sth) are shifted to the LTB, because the most frequently requested (popular) segments are maintained in the LTB for a long time to improve the cache hit rate for cached segments.
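
A hedged sketch of that promotion rule: segments whose request count exceeds the threshold (written NOR_sth in the excerpt) are moved from a short-term buffer to the long-term buffer (LTB) so that popular segments stay cached longer. The buffer names follow the excerpt; the structure and the threshold value are assumptions.

# Hedged sketch: segments requested more often than the threshold NOR_sth are
# promoted from a short-term buffer (STB) to the long-term buffer (LTB).
NOR_sth = 3   # request-count threshold (illustrative value)
stb = {}      # short-term buffer: segment -> request count
ltb = set()   # long-term buffer holding popular segments

def request_segment(segment):
    if segment in ltb:                 # popular segment already kept in the LTB
        return "LTB hit"
    count = stb.get(segment, 0) + 1
    stb[segment] = count
    if count > NOR_sth:                # requested frequently enough: promote it
        ltb.add(segment)
        del stb[segment]
        return "promoted to LTB"
    return "STB hit" if count > 1 else "miss"

print([request_segment("seg1") for _ in range(5)])
# ['miss', 'STB hit', 'STB hit', 'promoted to LTB', 'LTB hit']
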
So, more local clients served by a caching name server means a higher cache hit rate and therefore a stronger flattening effect on the domain name distribution.
Experimental results demonstrate that class prediction can deliver a class cache hit ratio of up to 54% using a modest cache size of 64 KB on the client, whereas a 16 KB cache delivers a hit ratio of 37%.
In the case of a cache hit, there is no need to wait for the next IR, and hence the query latency is reduced.