Well, I have a problem understanding the structure of a GDT entry. I have read the article about the GDT and the GDT tutorial, and also the Wikipedia article, but it is just as was already said:
The actual structure of descriptors is a little messy for backwards compatibility with the 286's GDT.
My problem is mainly: why is the limit split across two bit fields and the base across three? What values do I have to put into the different bytes? Setting the flags and the other bits shouldn't be difficult.
For example, I want a segment starting at 1 GB and ending at 4 GB. 1024 bytes = 1 KB, 1024 KB = 1 MB, 1024 MB = 1 GB, hence 1024³ bytes = 1 GB = 1073741824 bytes = 0x40000000 bytes.
The length would then be 3 GB = 3221225472 bytes, but if the granularity bit is set, the limit is counted in 4 KB steps, so 3221225472 divided by 4096 is 786432, or 0xC0000 in hex. ^^
Now, ignoring the flag and attribute bits, what would the entry look like? I'm sorry, but I'm really confused.
The other problem is that I don't understand the code example well enough to deduce the structure from it.
Thank you!