GDT entry?!

Posted: Sat Sep 14, 2013 5:20 pm
by terrabyte
Hello community!

Well, I have a problem understanding the structure of a GDT entry. I have read the article about the GDT and the GDT tutorial, and also the Wikipedia article, but as it already says:
The actual structure of descriptors is a little messy for backwards compatibility with the 286's GDT.
:D
My problem is mainly: why are the limit bits split into two fields and the base bits into three? What values do I have to put into the different bytes? Setting the flags and other bits shouldn't be difficult.
For example, I want a segment starting at 1 GB and ending at 4 GB. 1024 bytes = 1 KB, 1024 KB = 1 MB, 1024 MB = 1 GB, hence 1024³ bytes = 1 GB = 1073741824 bytes = 0x40000000 bytes.
3 GB would be 3221225472 bytes, but if the granularity bit is set, the limit is counted in 4 KB steps, so 3221225472 divided by 4096 is 786432, or 0xC0000 in hex.^^
Now, ignoring the flag and attribute bits, what would the entry look like? I'm sorry, but I'm really confused :P
The other problem is that I don't understand the code example well enough to deduce the structure from it :oops:

Thank you!

Re: GDT entry?!

Posted: Sat Sep 14, 2013 6:15 pm
by sortie
I would recommend that you read the Intel/AMD CPU manuals; they describe the data structure in great detail.

Re: GDT entry?!

Posted: Sun Sep 15, 2013 9:52 am
by dozniak
terrabyte wrote: why are there two times limit-bits and three times base-bits??
for backwards compatibility with the 286's GDT.

Re: GDT entry?!

Posted: Sun Sep 15, 2013 12:46 pm
by terrabyte
Well, "why" is probably not the correct word...

But ok, thank you, I'll "rtfm" ;)