Cache Locality
Cache locality (more precisely, locality of reference) refers to the tendency of a program to access the same memory locations repeatedly (temporal locality) or locations close together in memory (spatial locality), allowing the data to be served from the CPU's high-speed cache rather than main memory. In financial software, designing algorithms that respect cache locality can yield large performance gains, since a first-level cache hit costs a few cycles while a main-memory access can cost hundreds.
By organizing data structures to be cache-friendly, such as using arrays instead of linked lists, developers minimize cache misses that would otherwise require waiting for slower main memory. This is critical for trading systems where every nanosecond counts.
Good cache locality keeps the CPU executing instructions rather than stalling while data arrives from RAM, which is essential for low-latency market analysis.