I'm thinking about using UTF-16 as the internal character encoding of my OS, but after a lot of experimenting I've found that gcc doesn't cooperate, which is frustrating.

In Visual Studio the 'L' prefix declares a string literal as UTF-16, e.g. L"abc", but it doesn't mean the same thing in gcc.
I found that gcc has a 'u' prefix that is supposed to play the role 'L' plays in VS, e.g. u"abc".
But I still can't get it to work, even with -std=gnu99. What's more, I'd rather not depend on that flag, so I hope to write a macro to replace 'u'.
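Here is roughly the test I tried on my Linux host (just an illustration, not kernel code):

    #include <stdio.h>
    #include <wchar.h>

    int main(void)
    {
        /* L"..." is accepted, but its elements are wchar_t, which on my
           Linux/gcc toolchain is 4 bytes wide rather than the 2 I want. */
        const wchar_t *w = L"abc";
        printf("sizeof(wchar_t) = %u\n", (unsigned)sizeof(wchar_t));

        /* This is what I actually want (16-bit units), but gcc 4.4.1
           rejects it even with -std=gnu99: */
        /* const unsigned short *s = u"abc"; */

        (void)w;
        return 0;
    }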
I'd like to understand how 'u' and 'L' work, and whether the conversion happens at compile time (in which case 'u' is worth considering) or at run time (in which case forget it; I don't want such a function running in my kernel). I'm not good with the preprocessor, so I hope someone can help.
I'm using gcc 4.4.1.
Problem: how does 'u' work? And how can I write a macro that converts an ASCII literal to UTF-16 at compile time? (The sketch below shows the layout I'm after.)
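To be concrete about what the macro should produce, here is what I can already write by hand (u16char_t is just a name I made up, not something gcc provides):

    /* My own typedef for a 16-bit code unit. */
    typedef unsigned short u16char_t;

    /* For plain ASCII I can spell the 16-bit units out by hand, with a
       terminating 0.  This is the in-memory layout I want, but writing
       every string like this is painful. */
    static const u16char_t hello[] = { 'h', 'e', 'l', 'l', 'o', 0 };

I'd like a macro, say UTF16("hello") (name made up), that expands to something equivalent at compile time.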
Thanks.