Yes, there are many ways to do this. The simplest is to keep a bitmap for each block size you want to track (there is nothing saying you MUST use powers of two; you could use logarithmic steps or any arbitrary set of sizes, depending on your OS's specific requirements). The other way, as mentioned, is to use a tree. With a bitmap per level, a 4 MB region always costs you 1024 entries for 4 KB pages, 512 entries for 8 KB pages, and so on, even if you already know the entire 4 MB is in use, so it wastes a lot of memory. Using a tree, you can do away with some of that redundancy at the added expense of more code.
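As a very rough sketch of the bitmap-per-level idea (assuming a single 4 MB region and power-of-two sizes; the names here are just for illustration):

Code: Select all
/* One bitmap per block size for a single 4 MB region (sketch only).
 * 4 MB / 4 KB = 1024 blocks, 4 MB / 8 KB = 512, and so on.
 * 1 bit per block: set = in use, clear = free. */
#include <stdint.h>

static uint32_t map_4k [1024 / 32];   /* 1024 bits */
static uint32_t map_8k [ 512 / 32];   /*  512 bits */
static uint32_t map_16k[ 256 / 32];   /*  256 bits */
/* ...one array per level you decide to track... */

static inline void set_used(uint32_t *map, unsigned block)
{
    map[block / 32] |= 1u << (block % 32);
}

static inline int is_used(const uint32_t *map, unsigned block)
{
    return (map[block / 32] >> (block % 32)) & 1u;
}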
Now, the benefit of the first method is simplicity, but it is wasteful and still not the fastest to search for free entries (just because a 4 MB area is marked used doesn't mean it has no 4 KB areas free). Depending on your OS's requirements for speed/size you can do a few tricks. For example, use plain bitmaps as in the first approach, except with 2 bits per entry for everything except your lowest level: the first bit means partially used, the second bit means fully used, neither bit set means completely free, and you can give both bits set any other meaning you find useful. The benefit is that if you're searching for, say, 4 KB, and 4 KB is your smallest size, you can scan the 4 MB entries to see whether any smaller sizes are free beneath them, and only then scan the smaller sizes until you find something or find it isn't available. This is MUCH more efficient speed-wise than just searching ALL the 4 KB blocks, but it doubles the size of your bitmapped area. For an example (x = fully used, o = free space at or under that entry):
Code: Select all
4k  x x x x x x x x x x o o o o o o
8k  x x x x x o o o
16k x x o o
Typically, if you are allocating 4 KB, the allocator would search through all the 4 KB blocks, so in the diagram above it would do 11 compares before finding a free one. With 2 bits per block at the upper levels, it would instead check the first 16 KB block (fully used), the second (fully used), and then the third, which says it has something free under it. It then checks that block's two 8 KB children: the first is fully used, the second has something free. Finally it checks the first 4 KB block under that, which is free. That is 6 compares rather than 11; now imagine this after the OS has been running for a long time and you are trying to allocate memory, and you can see the speed benefits could be large. And since the bottom layer (the 4 KB chunks) has nothing under it, it can stay 1-bit encoded.
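To make that concrete, here is a minimal sketch of the top-down search in C. It assumes a single 4 MB region with 4 KB / 8 KB / 16 KB levels, and it stores each 2-bit state in a whole byte just to keep the code short; all the names are made up for illustration:

Code: Select all
#include <stdint.h>

#define BLOCKS_4K  1024                 /* 4 MB / 4 KB  */
#define BLOCKS_8K  (BLOCKS_4K / 2)
#define BLOCKS_16K (BLOCKS_4K / 4)

/* States for the upper levels (2 bits each in a real allocator). */
enum { FREE = 0, PARTIAL = 1, FULL = 2 };

static uint8_t map_4k [BLOCKS_4K];      /* 0 = free, nonzero = used   */
static uint8_t map_8k [BLOCKS_8K];      /* FREE / PARTIAL / FULL      */
static uint8_t map_16k[BLOCKS_16K];

/* Find a free 4 KB block by walking down the levels,
 * skipping any 16 KB / 8 KB region marked FULL. */
static int find_free_4k(void)
{
    for (int i = 0; i < BLOCKS_16K; i++) {
        if (map_16k[i] == FULL)
            continue;                    /* nothing free under this 16 KB block */
        for (int j = i * 2; j < i * 2 + 2; j++) {    /* its two 8 KB children */
            if (map_8k[j] == FULL)
                continue;
            for (int k = j * 2; k < j * 2 + 2; k++)  /* its two 4 KB children */
                if (!map_4k[k])
                    return k;            /* index of a free 4 KB block */
        }
    }
    return -1;                           /* nothing free */
}

In a real allocator you would pack the states into 2 bits each and update the parent levels on every allocate/free; the point of the sketch is only the search order.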
Now, take the first 16 KB chunk in the example: if you encoded this as a binary tree (or any tree, for that matter), that node would be marked fully used and have no children, so you save the memory that would otherwise describe everything under it. The catch is the memory needed to encode the tree itself, and how you allocate that memory is the challenge; but if you can figure it out, you will save space and gain speed, especially when the system is taxed the most (almost all memory in use). It takes up more memory when the system has lots of free memory and the least space when the system is using almost all of it, so it's much better for the worst case, although allocation and deallocation speeds suffer due to the extra complexity of having to build/destroy tree nodes each time.
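Purely as an illustration of that trade-off (the node layout and the node allocator here are my own assumptions, not a recommendation), the tree variant could look something like this, where a fully used or fully free region keeps no children and adjacent halves collapse back into the parent:

Code: Select all
enum node_state { N_FREE, N_PARTIAL, N_FULL };

/* A node whose region is entirely used or entirely free has no
 * children at all; only partially used regions keep their halves. */
struct node {
    enum node_state state;
    struct node *left;   /* lower half, NULL unless state == N_PARTIAL */
    struct node *right;  /* upper half, NULL unless state == N_PARTIAL */
};

/* Placeholder: return the node to whatever node allocator you use --
 * building that allocator is the hard part mentioned above. */
static void free_node(struct node *n)
{
    (void)n;
}

/* When both halves end up in the same fully used or fully free state,
 * drop the children and record the state in the parent instead. */
static void try_collapse(struct node *n)
{
    if (n->left && n->right &&
        n->left->state == n->right->state &&
        n->left->state != N_PARTIAL) {
        n->state = n->left->state;
        free_node(n->left);
        free_node(n->right);
        n->left = n->right = NULL;
    }
}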