Question

My copy of Hacker's Delight is at home, and the web resources I've found aren't clear on this detail.

I have written the following 8-way LRU, using the well-known "row-column square" algorithm. (Is there a better name?)

#include <stdint.h>

/* 8-way LRU using the "row-column" (matrix) scheme:
   bit j of rows[i] is set when line i has been used more recently than line j. */
typedef union {
  uint8_t rows[8];   /* one row per line */
  uint64_t cols;     /* the same 64 bits, viewed whole for column masking */
} lru_state;

void lru_init(lru_state *this) {
  this->cols = 0;
}

/* Mark line `used` as most recently used. */
void lru_up(lru_state *this, int used /* in 0..7 */) {
  this->rows[used] = 0xff;                          /* row `used`: more recent than everything */
  this->cols &= ~(0x0101010101010101ULL << used);   /* clear column `used` in every row
                                                       (ULL keeps the shift unsigned) */
}

/* Return the least-recently-used line: the one whose row is all zeroes. */
int lru_get(lru_state *this) {
  int i;
  for (i = 1; i < 8; i++) {
    if (0 == this->rows[i]) return i;
  }
  return 0;   /* no other row is zero, so row 0 must be the all-zero one */
}

I would like confirmation of my assumption that the least-recently-used row will be all zeroes. It seems to work, but I don't have the math to prove it to my satisfaction.

So, is this right? Or do I need to compute the minimum Hamming weight for each row?
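
For concreteness, here is a minimal driver, a sketch that assumes the definitions above, showing the behaviour I am asking about:

#include <stdio.h>

/* Minimal driver (a sketch, assuming the lru_state code above): use the lines
   in a known order and see which one lru_get reports as least recently used. */
int main(void) {
  lru_state s;
  int i;

  lru_init(&s);
  for (i = 0; i < 8; i++) lru_up(&s, i);                /* touch lines 0..7 in order */
  printf("LRU after 0..7: %d\n", lru_get(&s));          /* prints 0 */
  lru_up(&s, 0);                                        /* touch line 0 again */
  printf("LRU after re-touching 0: %d\n", lru_get(&s)); /* prints 1 */
  return 0;
}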

Solution

Your assumption is correct.

We can prove it by contradiction:

Suppose the LRU candidate, c (the row with the most zero bits), has a bit set in position x.

A set bit in position x means that line x has not been used after line c. Every zero bit of c, on the other hand, marks a line that was used after c, and therefore also after x, so the corresponding column has been cleared in row x as well. Row x therefore has every zero bit that c has, plus a zero in position x itself (the diagonal bit is cleared whenever x is used, and the row is all zeroes anyway if x has never been used). That contradicts the choice of c as the row with the most zero bits, so c cannot have any bit set: the least-recently-used row is all zeroes.
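
If you also want a mechanical cross-check, here is a small randomized test, a sketch that assumes the lru_state code from the question and compares lru_get against explicit per-line timestamps:

#include <stdio.h>
#include <stdlib.h>

/* Randomized check (a sketch, assuming the lru_state code from the question is
   pasted above): drive lru_up with random lines and verify that lru_get always
   names a line whose last use is no later than any other line's. */
int main(void) {
  lru_state s;
  long last_use[8];   /* step at which each line was last used; -1 = never */
  long step;
  int i, used, victim, trial;

  srand(12345);       /* fixed seed, so any failure is reproducible */
  for (trial = 0; trial < 1000; trial++) {
    lru_init(&s);
    for (i = 0; i < 8; i++) last_use[i] = -1;
    for (step = 0; step < 100; step++) {
      used = rand() % 8;
      lru_up(&s, used);
      last_use[used] = step;
      victim = lru_get(&s);
      for (i = 0; i < 8; i++) {
        if (last_use[i] < last_use[victim]) {
          printf("FAIL: line %d is older than reported LRU line %d\n", i, victim);
          return 1;
        }
      }
    }
  }
  printf("OK: the all-zero row always named a least-recently-used line\n");
  return 0;
}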
