Brendan wrote: For movies it's just going to be an application telling the video driver "Play this file" (but I haven't thought about or designed the file format for this yet).
I'm not talking about pre-recorded videos; I'm talking about things like Pixar's render farms that are actually generating those videos.
Brendan wrote: Games use custom shaders because the graphics API sucks. It's not necessary. It might be "desirable" in some cases, but that doesn't mean it's worth the pain.
It's absolutely necessary. Custom shaders are just as important as custom CPU-side programs. You would call me insane if I proposed only giving access to the CPU through pre-existing code to process pre-existing file formats.
Brendan wrote: There's never a need to synchronise user input and physics with the frame rate.
There is absolutely a need to synchronize user input and physics with the frame rate. There needs to be consistent latency between input and its processing, physics needs to run at a consistent rate to produce consistent results, and rendering needs to happen at a consistent rate to avoid jitter and tearing. Of course these rates don't have to be the same, but they do have to be synchronized with each other. Many games already separate them: input is processed as often as is reasonable, video runs at the monitor's refresh rate, and physics runs at a lower, fixed rate for simulation consistency.
Brendan wrote: (and are the reason why gamers want "500 frames per second" just to get user input polled more frequently; and the reason why most games fail to use multiple threads effectively).
These gamers are wrong. Correctly-written games take into account all the input received since the last frame whether they're running at 15, 20, 30, or 60 frames per second. Polling any faster than the video frame rate is literally impossible to perceive, and anyone who claims to notice the difference is experiencing the placebo effect.
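To spell out the structure I'm describing: this is the standard fixed-timestep loop. Here's a minimal sketch in C, where poll_input(), step_physics(), render() and now_seconds() are hypothetical placeholders for whatever the game and OS actually provide:
[code]
/* Minimal fixed-timestep loop: input gathered once per rendered frame,
 * physics advanced in fixed 10 ms steps, rendering interpolating between
 * the last two physics states. All extern functions are placeholders. */
#include <stdbool.h>

#define PHYSICS_DT 0.01                      /* fixed physics step: 100 Hz */

extern double now_seconds(void);             /* monotonic clock */
extern void   poll_input(void);              /* everything received since last frame */
extern void   step_physics(double dt);       /* advance the simulation by dt */
extern void   render(double alpha);          /* draw, blending previous/current state */
extern bool   keep_running(void);

void game_loop(void)
{
    double previous = now_seconds();
    double accumulator = 0.0;

    while (keep_running()) {
        double current = now_seconds();
        accumulator += current - previous;
        previous = current;

        poll_input();                        /* all input since the last frame, once */

        while (accumulator >= PHYSICS_DT) {  /* physics at a fixed, consistent rate */
            step_physics(PHYSICS_DT);
            accumulator -= PHYSICS_DT;
        }

        render(accumulator / PHYSICS_DT);    /* video at whatever rate the display allows */
    }
}
[/code]
None of the three rates has to equal the others; they just all get fed on a consistent schedule.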
Brendan wrote: By shifting the renderer into the video driver the renderer gets far more control over what is loaded into VRAM than ever before.
The video driver doesn't have the correct information to utilize that control. It doesn't know the optimal format for the data nor which data is likely to be used next; a renderer that's part of a game does.
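To make that concrete, here's a rough sketch in C of the kind of residency decision only the game-side renderer can make, because it knows where the player is and where they're going. The fields and the scoring are invented for this example; they are not part of any real driver API:
[code]
/* Illustrative only: the inputs to the decision come from game-side knowledge
 * that a generic video driver never sees. */
struct asset {
    float distance_to_camera;  /* from the game's own scene/spatial data      */
    int   needed_in_next_area; /* game logic knows where the player is headed */
    int   resident_in_vram;    /* currently uploaded?                         */
};

/* Higher score = keep resident / prefetch sooner; lower score = evict first. */
float residency_score(const struct asset *a)
{
    float score = 1.0f / (1.0f + a->distance_to_camera); /* nearby objects matter more */
    if (a->needed_in_next_area)
        score += 1.0f;  /* the game can predict this; a generic driver cannot */
    return score;
}
[/code]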
Brendan wrote: Games like (e.g.) Minecraft only need to tell the video driver when a block is placed or removed. Not only is this far simpler for the game but allows the video driver to optimise in ways "generic code for wildly different video hardware" can never hope to do.
...
when games specify what they want (via. a description of their scene) instead of how they want things done, there's no way to scale games up to new hardware without the games specifying what they want??
...
The game just tells the video driver the volume of the liquid, its colour/transparency and how reflective the surface is. Things like "rippling" will be done elsewhere via. control points (part of physics, not rendering).
So you're going to include support specifically for voxel-based games in your API? Your standardized scene description format cannot possibly anticipate all the types of information games will want to specify in the future, nor can it possibly provide an optimal format for all the different things games care about specifying today.
Take my water example and explain how that will work: will you design a "Concise Way to Describe Bodies of Water" that covers all possible situations, so that games can scale from "transparent textured plane" to "ripples and reflections using a shader", taking into account all the possible ways games might want to do that, especially when there's not enough bandwidth to do ripples as a physics process and they therefore must be done in the shader? How will this solution work when you also have to specify a "Concise Way to Describe Piles of Dirt" and a "Concise Way to Describe Clouds" and a "Concise Way to Describe Plants" and a "Concise Way to Describe Alien Creatures" and a "Concise Way to Describe Stars" and a "Concise Way to Describe Cars" and a "Concise Way to Describe Spaceships", all so that detail can automatically be added when new hardware is available?
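To put a rough number on the bandwidth point (the grid size and update rate below are my own illustrative assumptions, not figures from this thread):
[code]
/* Back-of-the-envelope numbers for doing ripples as control points instead
 * of in a shader. */
#include <stdio.h>

int main(void)
{
    const double points  = 1024.0 * 1024.0; /* control points for one water surface  */
    const double bytes   = points * 3 * 4;  /* x,y,z as 32-bit floats per point      */
    const double hz      = 60.0;            /* updates per second to read as ripples */
    const double per_sec = bytes * hz;

    /* Prints roughly 720 MiB/s -- far more than "control points over a network"
     * can carry, while a shader needs only a handful of wave parameters per frame. */
    printf("%.0f MiB/s just to animate one body of water\n",
           per_sec / (1024.0 * 1024.0));
    return 0;
}
[/code]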
Even if you could design good formats for every possible thing that might take advantage of new hardware, your format will never be optimal for what each particular game is doing, because it has to account for every possibility. Letting games design their own formats is much better, and you don't even have the "waaaah, everybody's using different image file formats" problem in that case.
This is the reason custom shaders are no more undesirable than custom CPU programs. What happens when somebody comes along with a new rendering technique that enables a new property of rendered objects that you hadn't incorporated before?
Brendan wrote: Sure; and the "minimum requirements" listed by every PC game is just a suggestion, and all modern games will run on ancient "fixed function pipeline" hardware because the API provides the minimum necessary to be independent of the specific features of the hardware...

...
Working the same as it does on old hardware is not the same as taking advantage of new hardware's capabilities.
I never said the old OpenGL/DirectX APIs did this. They don't, because they added higher-level features without including fallback support for them on older hardware (DirectX did a little bit better at this, though). I said Vulkan/DX12 can do this now, because the API is so much simpler that writing fallback support is also simpler. I also said domain-specific rendering engines have an easier time of scaling down the requirements for older hardware.
On the scaling-up side, we differ in our goals. I care about artistic integrity, and think games and movies should look the way their designers intended, and that designers should have the freedom to control what that look is, whether it's purposefully blocky, shaded and outlined like a cartoon, or whatever more subtle variation on reality they care to come up with. Thus, the only thing old games should do on new hardware is run faster and at higher resolution.
You don't care about any of that, and think the video driver should redesign every game's look on every hardware update for the sake of "realism." I think that kind of automatic reinterpretation should only apply to applications that want it, like CAD tools or scientific imaging tools, which can use a specific library for that purpose; that library only needs to be updated once when a new generation of hardware comes out, not again for every driver.
Brendan wrote: the idiotic/unnecessary complexity of the modern API without trying to cope with "N previous iterations" on top of that.
We agree on this point. We just disagree on which direction to go to solve it: you want a generic high-level renderer, I want Vulkan/DX12 with many third-party renderers on top.
Brendan wrote: Rusky wrote: I'm not talking about already-written games and tools, I'm talking about people wanting to continue to create similar games and tools on your new-and-improved OS.
Did you have any relevant examples that are significant enough for me to care?
I've been giving them this whole time. Any games that don't want to be "as realistic as possible" and actually have their own art style are out, as you already said. That includes, as part of a very large list, nearly every game made by or for Nintendo, Square Enix, Capcom, etc...
Brendan wrote: Often impostors aren't "good enough" and do need to be redrawn, but it's far less often than "every object redrawn every frame".
Please note that this is not something I invented. "Static impostors" have been used in games for a very long time, and (some) game developers are doing auto-generated/dynamic impostors with OpenGL/DirectX.
Impostors are a trick to improve performance when necessary, not the primary way things are drawn in games today. Parallelizing a game across a network by converting everything to impostors is not an improvement; it's throwing everything in the toilet for the sake of forcing your idea to work.
The point is, in the worst case (which happens a lot) the camera is moving and rotating and changing the perspective on everything visible, so you need to be prepared to handle that case even if it doesn't happen all the time. And if your render state is strewn across the network, you have absolutely no hope of handling it; games will just show impostor artifacts and lowered quality all the time.
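For reference, the usual test for when an impostor must be regenerated is an angular-error threshold against the viewpoint it was captured from. This sketch (C, made-up names, arbitrary 5-degree tolerance) shows why a constantly moving, rotating camera defeats the trick when regeneration has to round-trip over a network:
[code]
/* Illustrative sketch: names and the tolerance are made up for this example. */
#include <math.h>
#include <stdbool.h>

struct vec3 { float x, y, z; };

static struct vec3 sub(struct vec3 a, struct vec3 b)
{
    return (struct vec3){ a.x - b.x, a.y - b.y, a.z - b.z };
}

static float dot(struct vec3 a, struct vec3 b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

static struct vec3 normalize(struct vec3 v)
{
    float len = sqrtf(dot(v, v));
    return (struct vec3){ v.x / len, v.y / len, v.z / len };
}

/* Reuse the impostor until the view direction to the object has drifted more
 * than ~5 degrees from the direction it was captured at. With a camera that
 * moves and rotates every frame this fires constantly, and if regenerating the
 * impostor means a network round-trip, the replacement is stale on arrival. */
bool impostor_needs_redraw(struct vec3 capture_cam_pos, struct vec3 current_cam_pos,
                           struct vec3 object_pos)
{
    struct vec3 then = normalize(sub(object_pos, capture_cam_pos));
    struct vec3 now  = normalize(sub(object_pos, current_cam_pos));
    return dot(then, now) < cosf(5.0f * 3.14159265f / 180.0f);
}
[/code]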