
Re: drawing a GUI for my kernel, [ something like desktop

Posted: Tue Dec 30, 2014 9:40 pm
by Brendan
Hi,
tjmonk15 wrote:Maybe you could preface your posts like this with an "If you're looking for A correct way to do this" or something similar, that may make your views/posts more acceptable/approachable.
Maybe; but then maybe my intention is to get the original poster to think about the design of their OS and how a GUI would fit into it, rather than just doing the natural thing (and implementing a GUI without considering the lower level layers or the interface/s between the GUI and the lower level layers and ending up with some prehistoric pixel pounder).
tjmonk15 wrote:Beyond that, for my own info:
Brendan wrote:
  • Nothing prevents the video driver from doing HDR.
Except Full-screen applications that wish to use Cel Shading instead.
There's no reason why you can't do HDR on cel-shaded images.

Note 1: the amount of light that a monitor can put out is limited by its design (mostly by the strength of its backlight). HDR is just a way to bypass that limitation - essentially, allowing "brighter than hardware can support" pixels, and then scaling the brightness down to the range the hardware can support, in a way that tricks people into believing the image actually is "brighter than technically possible" (by mimicking the effect of the human eye's iris). Basically, HDR is a way to bypass the hardware's pixel brightness limits, super-sampling is a way to bypass the hardware's resolution limits, and dithering is a way to bypass the hardware's "number of colours" limits.
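
To illustrate that "scale the brightness down" step, here's a minimal sketch of exposure-based tone mapping; the Reinhard-style curve, the "exposure" knob standing in for the iris, and the 2.2 gamma are assumptions made for the example, not something any particular driver has to use.

Code: Select all

/* Minimal tone mapping sketch: the renderer works with "brighter than the
 * hardware can display" floating point values, and this compresses them
 * into the 0..255 range the monitor actually supports.  The Reinhard-style
 * curve, the exposure value, and the 2.2 gamma are illustrative assumptions. */
#include <math.h>
#include <stdint.h>

static uint8_t tone_map_channel(float hdr_value, float exposure)
{
    float exposed   = hdr_value * exposure;     /* mimic the iris adapting to the scene */
    float ldr       = exposed / (1.0f + exposed); /* compress 0..infinity into 0..1 */
    float corrected = powf(ldr, 1.0f / 2.2f);   /* gamma-correct for the monitor */
    return (uint8_t)(corrected * 255.0f + 0.5f);
}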

Note 2: the other major limitation of current hardware is frame rate. For example, I'm sure we've all seen videos of a moving car where the wheels either look like they aren't rotating or look like they're rotating in the wrong direction. The solution to this problem is motion blur. However, motion blur is either extremely complex and expensive (e.g. keeping track of each pixel's trajectory and, instead of drawing a point, drawing a (not necessarily straight) line from where the pixel was to where it is now) or simple and extremely expensive (e.g. the brute-force approach of generating "n sub-frames" and merging them to form a frame).
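
As a rough illustration of the brute-force approach, here's a sketch that renders "n sub-frames" spread across one frame's time interval and averages them; the render_scene() callback, the fixed resolution, and the RGB float layout are assumptions invented for the example.

Code: Select all

/* Brute-force motion blur sketch: render N sub-frames at different points in
 * time and average them into one output frame.  The render_scene() callback,
 * the RGB float layout, and the fixed resolution are assumptions. */
#include <stdlib.h>

#define WIDTH  640
#define HEIGHT 480
#define FLOATS ((size_t)WIDTH * HEIGHT * 3)

typedef void (*render_fn)(float time, float *rgb_out); /* fills WIDTH*HEIGHT*3 floats */

static void render_motion_blurred(render_fn render_scene, float frame_start,
                                  float frame_length, int sub_frames, float *rgb_out)
{
    float *accum = calloc(FLOATS, sizeof(float));
    float *sub   = malloc(FLOATS * sizeof(float));

    for (int i = 0; i < sub_frames; i++) {
        float t = frame_start + frame_length * (i + 0.5f) / sub_frames;
        render_scene(t, sub);                 /* draw the scene as it was at time t */
        for (size_t p = 0; p < FLOATS; p++)
            accum[p] += sub[p];
    }
    for (size_t p = 0; p < FLOATS; p++)
        rgb_out[p] = accum[p] / sub_frames;   /* the average is the blurred frame */

    free(sub);
    free(accum);
}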
tjmonk15 wrote:
Brendan wrote:
Except full screen applications that couldn't possibly render in real time at a higher resolution than they specify. (That resolution should be user-defined obviously, and be chosen from a list of resolutions that the user's monitor supports at all times)
In general there are two alternatives:
  • "Fixed detail", where frame rate varies as a consequence of performance and scene complexity
  • "Fixed frame rate", where detail varies as a consequence of performance and scene complexity
The first alternative is idiotic and (like a lot of things game developers do) it should be banned outright.

For the second alternative you need to estimate how much work you're able to do in a fixed amount of time (including determining how much of the data you cached last time can be reused to avoid doing the work again) and use that estimate to vary things (like "intermediate resolution", where "too small/too distant" cut-off points are, whether to use textures or solid colour for which polygons, whether to use "slightly changed but maybe close enough" data you cached from the last frame, etc). It's not something that belongs on the application side of the "graphics device abstraction" (unless that "graphics device abstraction" is a leaky abstraction that leaks so badly that you're better off not having a video driver in the first place).
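
As a rough sketch of that "estimate and vary" feedback loop (the proportional rule, the damping, and the single "detail" knob are assumptions made for the example, not a prescription):

Code: Select all

/* "Fixed frame rate, variable detail" feedback sketch: measure how long the
 * last frame took and nudge a detail factor so the next frame fits in the
 * 16.67 ms budget.  The rule and the clamping limits are assumptions. */
#define TARGET_FRAME_TIME_MS 16.667f

static float adjust_detail(float detail, float last_frame_time_ms)
{
    /* Move detail towards "what we could have afforded", with some damping. */
    float error = TARGET_FRAME_TIME_MS / last_frame_time_ms;
    detail *= 0.5f + 0.5f * error;

    /* Never let detail collapse to nothing or grow without bound. */
    if (detail < 0.1f) detail = 0.1f;
    if (detail > 1.0f) detail = 1.0f;
    return detail;
}

/* The driver would use "detail" to pick the intermediate resolution, the
 * "too small/too distant" cut-off points, whether to texture a polygon, etc. */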

Basically, applications/GUIs should send a "description of scene" 60 times per second without caring about how it's rendered. Powerful hardware might render it at 60 frames per second with extremely high quality, and crappy hardware might render it at 60 frames per second with extremely low quality; but the applications/GUIs shouldn't need to know or care which. It should also be possible to send the same raw "description of scene" data to a file and then spend an hour per frame producing an insanely high quality/photo-realistic video (without the applications/GUIs knowing or caring); or to send it to a pool of 100 computers, have each render 1% of the screen, and combine the results before displaying the frame in real time (still without the applications/GUIs knowing or caring); or to do any/all of the above and display the result on a 20*10 grid of 200 monitors, where each monitor is connected to one of 100 completely different video cards (some ATI, some Intel, some NVidia) and may be using one of many different video modes (still without the applications/GUIs knowing or caring).
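
To make "description of scene" slightly more concrete, here's a sketch of what such a command stream might look like; every record type and field below is invented for illustration, since the actual format isn't specified here.

Code: Select all

/* Sketch of a "description of scene" record that an application/GUI might
 * hand to the video driver 60 times per second.  All record types and
 * fields are invented for illustration. */
#include <stdint.h>

enum scene_cmd {
    SCENE_SET_CAMERA,    /* where the viewer is and what they're looking at */
    SCENE_PLACE_MESH,    /* a handle to a cached mesh plus a transform */
    SCENE_PLACE_LIGHT,   /* light position, colour and intensity */
    SCENE_PLACE_TEXT     /* a UTF-8 string plus a position, for GUIs */
};

struct scene_record {
    enum scene_cmd cmd;
    uint32_t       object_id;     /* handle to data the driver already cached */
    float          transform[16]; /* position/orientation/scale, 4x4 matrix */
    float          colour[4];     /* RGBA, in "brighter than the monitor" HDR units */
};

/* Whoever consumes these records - a video driver, a render farm, or an
 * offline renderer writing a file - decides how much detail it can afford;
 * the application never knows or cares. */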


Cheers,

Brendan

Re: drawing a GUI for my kernel, [ something like desktop

Posted: Sat Jan 03, 2015 2:03 pm
by halofreak1990
KemyLand wrote:This thread has become a harsh discussion between me and Gigasoft. Shouldn't we fork a separate thread, as the OP has a right to an answer (which only Brendan has given :? )?
I suggest that if you want proper answers to be given without a harsh discussion, you should stop attacking Gigasoft for having a different opinion than yours and for pointing out flaws in your reasoning.

Re: drawing a GUI for my kernel, [ something like desktop

Posted: Sat Jan 03, 2015 2:49 pm
by KemyLand
halofreak1990 wrote:
KemyLand wrote:This thread has become a harsh discussion between me and Gigasoft. Shouldn't we fork a separate thread, as the OP has a right to an answer (which only Brendan has given :? )?
I suggest that if you want proper answers to be given without a harsh discussion, you should stop attacking Gigasoft for having a different opinion than yours and for pointing out flaws in your reasoning.
What are you saying? I was trying to give a practical and impartial solution to this issue, and you come along days later, when the thread is already dead, to put all the blame on me? Both of us were making rational, strong points. We were having a theoretical discussion; I just saw that the OP wasn't getting his answer, and, like every discussion of that kind, it was eventually becoming pointless and harsh. I should remark again that only Brendan gave an elaborate answer to the OP's question.

I fully respect Gigasoft's opinions and ideas, but I don't agree with them. Think about it carefully: if either Gigasoft or I had started a flamewar, the moderators would have responded. That didn't happen. I'm pretty sure both Gigasoft and I understand the benefits and flaws of our two models. His proposal is perfect for embedded systems, but high-tech mainframes driving a 10x20 grid of monitors wouldn't benefit in any way. Remember that we, the programmers, are forced to follow what the hardware manufacturers specify. There's no other way. Modern video devices refresh the screen at a constant rate; that tradition comes from the VGA/EGA/CGA.

Are you saying I don't point out my own faults? Read each of our posts carefully again. I pointed out several of my model's faults, such as the fact that it requires extra buffers and moves/copies. If you really want to restart a dead topic, at least do it with a correct, impartial, and collaborative mindset [-X .

Re: drawing a GUI for my kernel, [ something like desktop

Posted: Sun Jan 04, 2015 6:16 am
by Combuster
KemyLand wrote:I'm pretty sure both Gigasoft and I understand the benefits and flaws of our two models.
I personally doubt that statement holds, especially on your side:
The wiki wrote:There are a lot of people trying to help, but you'll find that some people post arbitrary suggestions based on their own lack of understanding. Again, if you don't understand why, you are better to question the suggestion for elaboration and understand what's really happening.
The problem is that if a moderator silenced anybody, there would be no lessons learned. I'd rather have someone explain the canonical (and functionally 100% accurate, mind you!) way of doing things to you than tell you you're wrong and simply deny you the better answer.

Beyond that, there's no point in adding this level of rudeness to your posts. There's always a better answer possible and you should accept that.