Re: Setting up a linear frame buffer without a BIOS
Posted: Tue May 26, 2015 4:48 am
Antti wrote: I wonder how it is possible to even have a "painfully slow" console.

By doing this on a USB photo frame.

glauxosdev wrote: So it is time to support them. Implementing UTF-8 is not hard, I did it in a single day.

I have UTF-8 support in the GUI library. I just don't think the console is a specialized "GUI app".

glauxosdev wrote: You will never see the beauty of Greek characters in your "console".

I can in the GUI.

rdos wrote: I have UTF-8 support in the GUI library. I just don't think the console is a specialized "GUI app".

No, it isn't. But I have to tell you for the nth time that a "console" has to have UTF-8 support (or any other Unicode encoding, for that matter). Really, just stop.

It means "All I know is that I know nothing". Adopt this quote to your own life and stop thinking you are the superior one here.Sokrates wrote:Ἓν οἶδα, ὅτι οὐδὲν οἶδα.
glauxosdev wrote: No, it isn't. But I have to tell you for the nth time that a "console" has to have UTF-8 support (or any other Unicode encoding, for that matter). Really, just stop.

You should stop. Keep your opinion about what someone else's console should and should not do to yourself.

rdos wrote: OTOH, I don't support UTF-8 filenames anyway, so the whole issue is non-relevant.

So your GUI can be fully localized, except file names, which must be written in Latin characters even if the local language is something like Arabic, which has no standard representation in Latin characters.

rdos wrote: You want the commands to be the same in different languages, and you also want the OUTPUT from the commands to be (because otherwise you cannot parse results programmatically).

You want the error messages to be in the user's language. You should not be parsing error messages programmatically; that's what return values are for.

rdos wrote: Today you can easily expect people that use the console to be knowledgeable in English.

I'm not so sure about that.

rdos wrote: Typical users no longer use the command shell, they expect something better than that.

And I'm sure your GUI that forces all file names to be written in Latin characters will be very popular.

Candy wrote: My view is that we shouldn't localize anything, but do allow the user-generated content to be user-localized. Your app uses standard names for things and it's the same for everyone (so you can google them and so on), but the stuff you make can use unicode names and so on.

My view is that GUI has dominated since the 1990s; studies have shown that things like icons, WYSIWYG and "discoverable" user interfaces are far easier for users; the ability to script GUI apps has been around for at least 20 years now (AppleScript, 1993); and I expect we've all recognised a trend towards mobile devices that don't have a keyboard (and use touch instead, where entering text is clumsy and a GUI is far more suited). With these things in mind, a truly modern OS (e.g. where backward compatibility with ancient/obsolete things like CP/M and Unix isn't important) simply shouldn't have a console in the first place.

rdos wrote: If you send UTF-8 filenames to a real textmode console, it will print some funny characters, that's all.

It might be a valid design decision, but you could still handle UTF-8 strings and simply not render the non-ASCII characters. The band name Motörhead, for example, could be encoded as the byte sequence (in hex): 4D 6F 74 C3 B6 72 68 65 61 64.

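A minimal sketch of that approach (my own illustration, not code from any of the systems discussed in this thread): walk the UTF-8 byte stream, print ASCII bytes as-is, and collapse each multi-byte sequence into a single placeholder character. It assumes well-formed UTF-8 input.

```c
#include <stdio.h>

/* Print a UTF-8 string on an ASCII-only console: ASCII bytes pass
 * through, every multi-byte sequence becomes one '?' placeholder. */
static void put_utf8_as_ascii(const unsigned char *s)
{
    while (*s) {
        if (*s < 0x80) {            /* plain ASCII byte */
            putchar(*s++);
        } else {
            putchar('?');           /* placeholder for the code point */
            s++;                    /* skip the lead byte ...          */
            while ((*s & 0xC0) == 0x80)
                s++;                /* ... and its continuation bytes  */
        }
    }
}

int main(void)
{
    /* "Motörhead" as the byte sequence from the post:
     * 4D 6F 74 C3 B6 72 68 65 61 64 */
    const unsigned char name[] = { 0x4D, 0x6F, 0x74, 0xC3, 0xB6,
                                   0x72, 0x68, 0x65, 0x61, 0x64, 0x00 };
    put_utf8_as_ascii(name);        /* prints "Mot?rhead" */
    putchar('\n');
    return 0;
}
```
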
Brendan wrote: Console should support Unicode, not because it's important to support Unicode, but because OS developers have better things to do than waste time implementing unnecessary "ASCII only" support.

Yes.

Brendan wrote: Basically; console should support Unicode, not because it's important to support Unicode, but because OS developers have better things to do than waste time implementing unnecessary "ASCII only" support.

Sure, but I didn't waste any time when I did this maybe 20 to 25 years ago, when the available video modes were really crappy and I didn't even have a GUI because of it. If I did it today, I might have wasted time, but not back then.

Combuster wrote: The biggest problem comes when you have files existing on a medium, and you can't do anything with them because you can't enter their filenames in any way.

Perhaps it's not an elegant solution, but my non-Unicode OS can read files with Unicode characters from FAT32 on pendrives simply by skipping the Unicode characters greater than 127 during comparisons. When I print the filenames I print an 'x' instead of the (non-ASCII) Unicode characters. This way I can cd into directories-with-unicode-names too.

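A rough reconstruction of that comparison (my own sketch, not bigbob's actual code; the function name is hypothetical): both names are walked byte by byte while any byte above 127 is skipped, so a name typed in plain ASCII can still match a directory entry containing multi-byte UTF-8 sequences.

```c
#include <stdbool.h>

/* Compare a typed name against an on-disk name, ignoring every byte
 * above 127, so typing "Motrhead" still matches "Motörhead" on disk. */
static bool names_match(const unsigned char *typed, const unsigned char *on_disk)
{
    for (;;) {
        while (*typed   >= 128) typed++;    /* skip non-ASCII bytes in the input      */
        while (*on_disk >= 128) on_disk++;  /* skip non-ASCII bytes in the dir entry  */
        if (*typed != *on_disk)
            return false;                   /* mismatch on an ASCII byte              */
        if (*typed == '\0')
            return true;                    /* both names ended together              */
        typed++;
        on_disk++;
    }
}
```
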
bigbob wrote: My non-Unicode OS can read files with Unicode characters from FAT32 on pendrives simply by skipping the Unicode characters greater than 127 during comparisons. When I print the filenames I print an 'x' instead of the (non-ASCII) Unicode characters. This way I can cd into directories-with-unicode-names too.

Is there any resolution for the possible ambiguity? E.g. if you have files named filé and filè, how do you distinguish them once both compare as "fil" and display as "filx"?

alexfru wrote: Is there any resolution for the possible ambiguity? E.g. if you have files named filé and filè, how do you distinguish them once both compare as "fil" and display as "filx"?

Well, that's a problem. The first file will be found in your case.

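To make the collision concrete, here is a small self-contained illustration (again my own sketch, with a hypothetical helper, not code from bigbob's OS): stripping the bytes above 127 maps both names to the same ASCII key, so any lookup keyed on the stripped form can only ever find whichever entry comes first in the directory.

```c
#include <stdio.h>
#include <string.h>

/* Reduce a UTF-8 name to its ASCII-only key by dropping bytes > 127. */
static void strip_non_ascii(const unsigned char *in, char *out)
{
    while (*in) {
        if (*in < 128)
            *out++ = (char)*in;   /* keep ASCII bytes        */
        in++;                     /* drop everything else    */
    }
    *out = '\0';
}

int main(void)
{
    char a[16], b[16];
    strip_non_ascii((const unsigned char *)"fil\xC3\xA9", a);  /* filé */
    strip_non_ascii((const unsigned char *)"fil\xC3\xA8", b);  /* filè */
    printf("%s %s -> collision: %s\n", a, b,
           strcmp(a, b) == 0 ? "yes" : "no");   /* "fil fil -> collision: yes" */
    return 0;
}
```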