I'm using a struct to define a memory area like so:
typedef struct {
    size_t size;  // size of the memory area that follows this header
    int magic;    // used to detect block header corruption
    bool free;    // whether the area is free
} block;
To find the next block, I just need to do p + p->size + sizeof(block) (where p is a pointer to a block), right?
But I'm getting a strange result :/
With p = 0x106c74, p->size = 0xff4, and sizeof(block) = 0xc, I expected 0x107c74 (since 0x106c74 + 0xff4 + 0xc = 0x106c74 + 0x1000). But when I check in GDB I get 0x112c74.
Any ideas? GCC optimizations are disabled. Am I missing something about C++ arithmetic?