SPICE: lots of theoretical wankery that may someday be an OS

Discussions on more advanced topics such as monolithic vs micro-kernels, transactional memory models, and paging vs segmentation should go here. Use this forum to expand and improve the wiki!
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: SPICE: lots of theoretical wankery that may someday be a

Post by iansjack »

MessiahAndrw wrote:commodorejohn - Talking about alternatives and how things could be better is great. We all love great debates and new ideas. However, if you do criticize something - a standard practice, someone's project, current ways of doing things - you really need to focus on presenting an alternative, rather than just talking about why current methods are bad. If you do not do this, what you are saying can easily be misinterpreted as either an offensive attack or simply whining, and that's when things generally get bad on this forum and your thread gets locked. I hope that doesn't become the case, so please think through what you're about to post. If you focus your discussions around 'here's an idea that I think could make things better' rather than 'here's a list of things that suck' people will respond a lot more positively.
Spot on. Whining about what is wrong is easy; defining a solution, less so. If you are going to criticize something that serves many people very well then you should have an alternative to hand - preferably not one that has already been tried and found wanting.
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Rusky »

Brendan: You've forgotten the law of leaky abstractions. Interfaces always dictate their implementation to some degree - a particular method of implementation will be more efficient with some interfaces, but less efficient or counter-productive with others. You will never be able to standardize on a single interface that will work optimally for every implementation strategy you'll want to use.

For example, the interface of sending descriptions rather than framebuffers was required for good performance on early X setups, but it's impossible to implement efficiently for today's requirements. On the other hand, the interface of sending framebuffers could not have satisfied early X requirements of low memory usage and server-side rendering on low-bandwidth networks, but it's essential for today's requirements of low-latency, high-performance animation, rendering, video, and games.

You cannot describe a single interface that would enable implementations to satisfy both those sets of requirements. What they did instead was to extend the interface they had to keep backwards compatibility until that was no longer necessary and they had the resources to make a new one without the legacy baggage. There is no problem with that at all, and that is what should continue to happen in the future - Wayland is designed to be extensible so it will continue to be extended for new requirements until its fundamental design is no longer useful.

The same law of leaky abstractions applies to 3rd party dependencies. You can build a single standard database interface into the OS and have everything implement it, but for each implementation you're still stuck between "works well with the standard interface" and "has to be included separately in every application that uses it." The first is great when it works, but even at the interface level there are things that can't or shouldn't be included in the OS yet should still be shared between applications for the sake of performance and security. It is in fact important to be able to choose incompatible interfaces, and because you'd be an idiot to design a standard without testing it in the real world, that is how standards should be created in the first place.

The correct solution is not to include a copy of every library with every program that uses it, but to properly track versions to avoid dependency hell while still allowing for the optimizations and security of shared libraries. You don't have to give up whole program optimization or simple package management to do this- architecture-specific optimization should be done at install time anyway, and graph sorting algorithms are introductory CS subjects.
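
To make the "graph sorting" remark concrete: a minimal sketch (in C++, with a made-up package list - not anything from a real package manager, and cycle detection omitted) of topologically sorting a dependency graph so that every package is installed after the packages it depends on.

#include <iostream>
#include <map>
#include <set>
#include <string>
#include <vector>

// Hypothetical dependency graph: package -> packages it depends on.
std::map<std::string, std::vector<std::string>> deps = {
    {"editor",  {"gui", "libtext"}},
    {"gui",     {"libdraw"}},
    {"libtext", {}},
    {"libdraw", {}},
};

// Depth-first topological sort: a package is appended to the install
// order only after everything it depends on has been appended.
void visit(const std::string& pkg, std::set<std::string>& seen,
           std::vector<std::string>& order) {
    if (seen.count(pkg)) return;
    seen.insert(pkg);
    for (const auto& dep : deps[pkg]) visit(dep, seen, order);
    order.push_back(pkg);
}

int main() {
    std::set<std::string> seen;
    std::vector<std::string> order;
    for (const auto& entry : deps) visit(entry.first, seen, order);
    for (const auto& pkg : order) std::cout << pkg << '\n';  // install order
}

Real package managers layer version constraints and conflict resolution on top of this, but the ordering problem itself really is that small.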

The biggest success story here is the web - despite its horrific pile of technologies, hacks, and glue, its incompatible implementations, and its dead-ended interface and implementation designs, it has still exploded. Browsers have just added stuff however they felt like, sometimes eventually standardized it, and left web developers to deal with the differences through JavaScript polyfills, but guess what? Not only are massive applications and platforms built on the web, you see multiple companies offering computers with nothing but web browsers - ChromeOS and FirefoxOS!
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Rusky »

commodorejohn wrote:And are the things I said obvious? Darn right they are. Unfortunately, they seem to be the kind of obvious that people frequently miss or ignore despite their obviousness - so I think they bear repeating.
That's usually a sign that it's not actually obvious, because it's missing something that is obvious if you have any experience building real software.
commodorejohn wrote:Will it? Seriously, I'm still trying to wrap my head around this viewpoint after years of hearing it treated as gospel. Please, explain to me how, say, a text editor can present a security flaw that doesn't spring from the developer doing something incredibly bone-headed like making it Internet-facing for no good reason.

(For that matter, could someone relate a network security flaw that didn't spring from something incredibly, obviously stupid like using buffers without bounds-checking on an outward-facing connection? I'm really curious.)
This is not a case of mere logic, this is a case of what actually happens. What significant program have you used that hasn't had security patches? Of course network vulnerabilities are higher profile, but anything dealing with user data (so, just about everything useful) is an attack target. Console games get exploited to run arbitrary code on consoles. Word processors get macro viruses. Image file formats have been used as attack vectors! Do you really think developers are just dumb and ignore the "incredibly, obviously stupid" buffer overflows in their code? Or is it maybe just a little bit less obvious than you claim?
commodorejohn wrote:Who said anything about throwing out wisdom and experience? I'm perfectly willing to take lessons from these into consideration. What I'm not interested in is trying to kludge a system that's already half made up of kludges into being something more elegant and modern when it would be simpler to just start from scratch.
The problem is you're not seeing the lessons as what they are, and instead seeing them as garbage, because you haven't thought the requirements through. It's this kind of pie-in-the-sky rewrite-the-universe-and-it-will-be-perfect thinking that leads to the legitimate problems you're complaining about.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Brendan »

Hi,
Rusky wrote:
commodorejohn wrote:Will it? Seriously, I'm still trying to wrap my head around this viewpoint after years of hearing it treated as gospel. Please, explain to me how, say, a text editor can present a security flaw that doesn't spring from the developer doing something incredibly bone-headed like making it Internet-facing for no good reason.

(For that matter, could someone relate a network security flaw that didn't spring from something incredibly, obviously stupid like using buffers without bounds-checking on an outward-facing connection? I'm really curious.)
This is not a case of mere logic, this is a case of what actually happens. What significant program have you used that hasn't had security patches? Of course network vulnerabilities are higher profile, but anything dealing with user data (so, just about everything useful) is an attack target. Console games get exploited to run arbitrary code on consoles. Word processors get macro viruses. Image file formats have been used as attack vectors! Do you really think developers are just dumb and ignore the "incredibly, obviously stupid" buffer overflows in their code? Or is it maybe just a little bit less obvious than you claim?
It's nice to see both of you (Rusky and commodorejohn) agreeing with each other and admitting that, for current programmers using current tools (e.g. C/C++), the risk of security problems is excessive.

I also agree with both of you - the languages are too complex for people (including experienced programmers) to get things right, and the tools aren't able to detect an adequate amount of "programmer error", resulting in far too many bugs and far too many security vulnerabilities.

The difference is in how we each plan to solve the complexity and "mistake detection" problems - commodorejohn seems to be going for "interpreted" (which could solve the complexity problem and should solve most "mistake detection" problems, but implies severe performance penalties).
commodorejohn wrote:Who said anything about throwing out wisdom and experience? I'm perfectly willing to take lessons from these into consideration. What I'm not interested in is trying to kludge a system that's already half made up of kludges into being something more elegant and modern when it would be simpler to just start from scratch.
The first lesson to learn is that there's a "well trodden path" that leads to the status quo, and if you attempt to leave that path (e.g. go in a direction that isn't "pointless wheel reinvention") you can expect much higher resistance. 8)


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Rusky »

There's a difference between leaving the well-trodden path to do something new and useful and leaving the well-trodden path with your fingers in your ears singing lalala because you got frustrated with something. Just because a solution has problems doesn't mean the problems it solves won't exist when you rewrite the universe.

For example, you can't just remove all the pieces of C++ you don't like and say "look, a perfect language nobody will have problems learning and writing secure software in!" without providing replacement solutions for the use cases currently handled by templates, macros, pointers, RAII, etc. It doesn't have to (and shouldn't) be a point-by-point checklist of replacement features, but you have to solve the problems somehow or you'll be no better overall.
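
To pin down what "the use case currently handled by RAII" even is, here's an illustrative C++ sketch (the FileHandle type is made up for this post, not from any real codebase): the resource's lifetime is tied to a scope, so cleanup happens on every exit path, including exceptions.

#include <cstdio>
#include <stdexcept>

// Hypothetical RAII wrapper: the file is closed automatically when the
// object goes out of scope, no matter how the scope is exited.
class FileHandle {
public:
    explicit FileHandle(const char* path) : f_(std::fopen(path, "r")) {
        if (!f_) throw std::runtime_error("open failed");
    }
    ~FileHandle() { std::fclose(f_); }
    FileHandle(const FileHandle&) = delete;             // no accidental copies,
    FileHandle& operator=(const FileHandle&) = delete;  // hence no double close
    std::FILE* get() const { return f_; }
private:
    std::FILE* f_;
};

void parse(const char* path) {
    FileHandle file(path);    // resource acquired here
    // ... read via file.get() ...
}                             // resource released here, with no cleanup code

A replacement language that drops destructors still has to answer how that cleanup happens deterministically for files, locks, and sockets; garbage collection alone doesn't cover it.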

For another example, you can't just arbitrarily declare that all application data shall and must be in one directory, with no installation steps allowed. You don't have to give up on making the process of getting and using software simpler, but you can't ignore the use cases that you make impossible with your utopian rainbow magic vision.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Brendan »

Hi,
Rusky wrote:There's a difference between leaving the well-trodden path to do something new and useful and leaving the well-trodden path with your fingers in your ears singing lalala because you got frustrated with something. Just because a solution has problems doesn't mean the problems it solves won't exist when you rewrite the universe.
Agreed; but I don't see anyone doing that.
Rusky wrote:For example, you can't just remove all the pieces of C++ you don't like and say "look, a perfect language nobody will have problems learning and writing secure software in!" without providing replacement solutions for the use cases currently handled by templates, macros, pointers, RAII, etc. It doesn't have to (and shouldn't) be a point-by-point checklist of replacement features, but you have to solve the problems somehow or you'll be no better overall.
When I was a teenager I got appendicitis, and had my appendix removed. The doctors didn't replace my appendix with a "replacement solution" because my appendix solved nothing to begin with. You could say that a human's appendix is a "design anti-feature that provides no solution to anything".

I wouldn't assume that C++ has no "design anti-features that provide no solution to anything" - features that could simply be removed without providing any replacement. In fact I'd be tempted to suggest that the majority of people who prefer C over C++ would consider most of the C++ features to be things that can be removed without providing any replacement solution.
Rusky wrote:For another example, you can't just arbitrarily declare that all application data shall and must be in one directory, with no installation steps allowed. You don't have to give up on making the process of getting and using software simpler, but you can't ignore the use cases that you make impossible with your utopian rainbow magic vision.
Why can't you declare that all application data (excluding the user's data) shall and must be in one directory, with no installation steps allowed? I can't think of any significant problem that this would cause, or think of any significant feature that this would prevent.
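
As a purely hypothetical sketch of what that could look like (the names are invented; the idea roughly mirrors application bundles on other systems, such as Mac OS X .app directories or ROX AppDirs), with the user's own documents kept elsewhere:

/apps/MyEditor/
    MyEditor            (the executable itself)
    libtext.so          (any private libraries it ships with)
    icons/
    docs/
    settings.defaults   (per-user settings would live in the user's area)

"Installing" is then just copying that one directory, and uninstalling is deleting it.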


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Rusky »

You should read some of the history of C++. Literally every feature was added because of a particular use case that it solved. That's arguably not the best way to do it, but it does tell you something about whether any of its features are vestigial...

People in this thread, and others with the same attitude, complain about things on a general level without offering real solutions. They say things like "ridiculously complex yet unbelievably primitive system architecture" and "complications that benefit nobody that the designer failed to avoid" without considering that the way they use a tool is not the only valid way. You can tell they've had frustrating experiences with the things they're complaining about, but they immediately jump from "it didn't do what I want" to "it's an overly complicated piece of garbage" rather than "I should probably learn more about this tool" or "I wonder how this could be improved."

They complain that the X11 interface wasn't good enough to take into account future requirements, without so much as vaguely outlining an alternative design. They complain that C++ is too complicated, while ignoring the real benefits it has from some of the very features they complain about. They complain that program installation is too complicated, while ignoring all the different configurations people need to install software in. They complain that Gnome apps have too many dependencies, while ignoring the features their alternatives lack. They complain that computers today are so much more complicated than the C64 or Amiga or whatever, but they ignore how many more things they're used for. They complain that the web is a terrible platform for apps, but they ignore that nothing else has as seamless an experience in distribution, updating, sharing, etc.

I have the same frustrations. C++ really is more complicated than it needs to be. Linux's directory structure really does have some dark corners. Computers really do have lots more edge cases and dark holes to trip people up. The web really is a pretty gross place for apps. But I don't make huge exaggerations and declare the whole of the software universe unfit for consumption, when the opposite evidence is staring me in the face with everything I touch in modern civilization. People really do get important work done in C++ without bringing the whole world crashing down. People really do use Linux productively without compiling anything from source or getting confused by the interface. People really do learn how to use computers and have fun doing it. People really do make useful web apps for important use cases.
embryo

Re: SPICE: lots of theoretical wankery that may someday be a

Post by embryo »

Brendan wrote:When I was a teenager I got appendicitis, and had my appendix removed. The doctors didn't replace my appendix with a "replacement solution" because my appendix solved nothing to begin with.
In fact, the appendix is part of the human body's complex immune system. The consequences of its absence are not very obvious, but on average people without it are somewhat less immune.

So it is better to think about the original purpose of a software component instead of declaring it useless.
embryo

Re: SPICE: lots of theoretical wankery that may someday be a

Post by embryo »

Rusky wrote:People in this thread, and others with the same attitude, complain about things on a general level without offering real solutions.
But the complaints are a motivation for us to propose those "real solutions".

Have we succeeded in proposing solutions?
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: SPICE: lots of theoretical wankery that may someday be a

Post by iansjack »

Have we succeeded in proposing solutions?
Ranting about contemporary operating systems, with claims that the individual could do it better (but is too busy doing the washing to make a start), proves nothing.

Before we propose solutions we have to accept that the complaints are valid and that solutions are needed. In the case of, for example, package management I don't believe the complaint is valid. (This is not to say that the proliferation of different library versions is not a problem - but it is that very problem that package management helps to solve. Although you may believe, as I do, that a more efficient solution is the BSD ports model - as adopted by Gentoo - with recompilation from source. This helps to drastically reduce any library variation.)
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Brendan »

Hi,
iansjack wrote:
Have we succeeded in proposing solutions?
Ranting about contemporary operating systems, with claims that the individual could do it better (but is too busy doing the washing to make a start), proves nothing.

Before we propose solutions we have to accept that the complaints are valid and that solutions are needed. In the case of, for example, package management I don't believe the complaint is valid. (This is not to say that the proliferation of different library versions is not a problem - but it is that very problem that package management helps to solve.
It'd be more accurate to say package management helps hide the symptoms of "dependency hell"; and hiding symptoms of a problem isn't something I'll ever be fond of (in general, hiding symptoms is something you do when you fail to solve a problem).
iansjack wrote:Although you may believe, as I do, that a more efficient solution is the BSD ports model - as adopted by Gentoo - with recompilation from source. This helps to drastically reduce any library variation.)
As someone who's been using Gentoo for years, I can guarantee that it solves nothing. After constant breakage caused by updates I simply stopped updating anything. I haven't done "emerge world" for about 3 years now, and while I'm sure there are many security vulnerabilities on my computer that have been fixed since, all of those security vulnerabilities combined pose a far lower risk than the OS's own package management system does.

FreeBSD is different to Linux in that they do have a "base system incorporating kernel, system libraries and all essential applications" (which helps to reduce dependency problems); and there aren't thousands of distributions all breaking compatibility for the sake of a different GUI theme. For this reason, I'd expect package management to be much more reliable on FreeBSD despite the fact that there are far fewer volunteers slaving away behind the scenes in an attempt to shield end-users from the dependency nightmare.

The other problem with Gentoo is that compiling from source is an extremely slow and bloated process. When a new version of (e.g.) KDE hits the source repositories and you make the mistake of updating, you'd better hope you've got a Windows machine to use for the rest of the day (and then most of the next day while you're trying to fix the inevitable breakage). However, to be fair, the severe lack of efficiency involved with "recompilation from source" is not the package management's fault - it's the tools used for recompilation that deserve the majority of the blame, starting with autoconf (a tool whose only purpose is to hide the symptoms of severe design failures throughout the remainder of the tool chain and the environment in which those tools are expected to operate).


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
commodorejohn
Posts: 11
Joined: Sat Jul 05, 2014 9:31 pm
Location: Duluth, MN

Re: SPICE: lots of theoretical wankery that may someday be a

Post by commodorejohn »

Rusky wrote:This is not a case of mere logic, this is a case of what actually happens.
I didn't say it didn't happen; I asked if anybody could point me to a time it happened that wasn't for very stupid reasons.
What significant program have you used that hasn't had security patches?
Programmer's Notepad. FCE Ultra.
Word processors get macro viruses.
Yes, and that would be a prime example of what I'm talking about when I say that it seems like most of these things stem from stupid design decisions - because seriously, how could you not see the abuse potential of a fully programmable and completely unsecured scripting language that can be transparently embedded into ordinary-looking documents and interface with the operating system at large?
Do you really think developers are just dumb and ignore the "incredibly, obviously stupid" buffer overflows in their code?
Since they keep happening (Heartbleed, anyone?), I'd have to go with "yes, apparently so."
The problem is you're not seeing the lessons as what they are, and instead seeing them as garbage, because you haven't thought the requirements through. It's this kind of pie-in-the-sky rewrite-the-universe-and-it-will-be-perfect thinking that leads to the legitimate problems you're complaining about.
If by "you're not seeing the lessons as what they are" you mean "you're not blindly accepting that the way Unix/Linux does things is always and invariably the best and only way to do it," sure. Otherwise, no, that's not what I'm doing at all.

And I'm curious as to how the idea of starting with a more or less clean slate and building a new system from the ground up is the source of the problems I've complained about, when those problems pretty much all stem from trying to kludge an existing arcane, primitive design into successive approximations of an actually modern operating system.
Brendan wrote:The difference is in how we each plan to solve the complexity and "mistake detection" problems - commodorejohn seems to be going for "interpreted" (which could solve the complexity problem and should solve most "mistake detection" problems, but implies severe performance penalties).
Actually, I'm looking at interpreted systems more for the purposes of portability than security - beyond a certain level of catching access violations, I really don't think that it's feasible or wise to expect the language or runtime environment to do the developer's job of debugging for them.

(As for the performance penalties: ten years ago, when I was learning Java in college and marvelling over what a mess it was, I would've agreed completely. Now, though, I really think we've reached a point where a sensibly-designed VM can be performant enough for pretty much anything besides heavy number-crunching or high-end gaming.)
The first lesson to learn is that there's a "well trodden path" that leads to the status quo, and if you attempt to leave that path (e.g. go in a direction that isn't "pointless wheel reinvention") you can expect much higher resistance. 8)
Indeed.
Rusky wrote:Just because a solution has problems doesn't mean the problems it solves won't exist when you rewrite the universe.
Maybe so, but the fact that there will always be problems needing to be solved doesn't mean that you can't arrive at a better solution to certain problems by avoiding the poor design decisions that led to them in the first place.
Rusky wrote:People in this thread, and others with the same attitude, complain about things on a general level without offering real solutions. They say things like "ridiculously complex yet unbelievably primitive system architecture" and "complications that benefit nobody that the designer failed to avoid" without considering that the way they use a tool is not the only valid way. You can tell they've had frustrating experiences with the things they're complaining about, but they immediately jump from "it didn't do what I want" to "it's an overly complicated piece of garbage" rather than "I should probably learn more about this tool" or "I wonder how this could be improved."
Man, I spent upwards of seven years "learning more about [these tools]." All it got me was a fuller comprehension of just what a mess they are. If you want to go "works for me!" or otherwise argue that you don't care, well, fine, that's your affair. But this typical blame-the-user-for-an-overly-convoluted-design thing is the same crap I've already heard a million times, and it didn't change my mind then, either.
Linux's directory structure really does have some dark corners.
Seriously? It's all dark corners.
But I don't make huge exaggerations and declare the whole of the software universe unfit for consumption,
I never did that, and you freaking know it. What I did was suggest that Unix and Unixoids, specifically, are too much of a mess to constitute a viable base for a high-quality modern operating system, on account of their being a mainframe OS kludged up with forty years of legacy cruft. But of course dissing Unix is Nerd Heresy and cannot be tolerated, so naturally that gives everybody else license to read inventive new meanings into the things I actually did say and just generally make stuff up, and to further berate me for only complaining and not offering better ideas - when the time I'm actually able to spare to get over here is taken up pretty much completely with explaining how "no, I didn't actually say that," to the point where I simply haven't had the opportunity to sit down and explain what I think good solutions would be.
Brendan wrote:It'd be more accurate to say package management helps hide the symptoms of "dependency hell"; and hiding symptoms of a problem isn't something I'll ever be fond of (in general, hiding symptoms is something you do when you fail to solve a problem).
This.
Computers: Amiga 1200, DEC VAXStation 4000/60, DEC MicroPDP-11/73
Synthesizers: Roland JX-10/MT-32/D-10, Oberheim Matrix-6, Yamaha DX7/FB-01, Korg MS-20 Mini, Ensoniq Mirage/SQ-80, Sequential Circuits Prophet-600, Hohner String Performer
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: SPICE: lots of theoretical wankery that may someday be a

Post by iansjack »

As someone who's been using Gentoo for years, I can guarantee that it solves nothing. After constant breakage caused by updates I simply stopped updating anything. I haven't done "emerge world" for about 3 years now
I can only say that you are doing something wrong. I've updated my Gentoo at least once a week for the past 5 years or more with no real problems (nothing that couldn't be solved by a quick Google). But it is important to do an "eselect news" before updating @world. And it is a good idea to do a --newuse --deep upgrade every now and then.
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: SPICE: lots of theoretical wankery that may someday be a

Post by iansjack »

I simply haven't had the opportunity to sit down and explain what I think good solutions would be.
Well, stop being defensive, cut the crap and make the time now to do that very thing before posting further. People would probably take your points more seriously if you could demonstrate feasible ideas of ways of overcoming the deficiencies that you perceive in contemporary OSs.

Time to put up or ....
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: SPICE: lots of theoretical wankery that may someday be a

Post by Rusky »

If so many good programmers still let buffer overflows through, you cannot lay the whole blame on them for not seeing the obvious, because it is clearly not obvious or stupid at that point. As Brendan said, C and C++'s model for memory has too many holes.
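
To show how small the mistake usually is, here's a hypothetical C++ fragment loosely in the spirit of length-field bugs like Heartbleed (not the actual OpenSSL code): the attacker supplies a length, and nothing in the language forces anyone to check it.

#include <cstdint>
#include <cstring>

// Hypothetical echo handler. The attacker controls both 'payload' and
// 'claimed_len'; nothing forces them to be consistent with each other.
void handle_echo(const std::uint8_t* payload, std::size_t claimed_len,
                 std::uint8_t* reply) {
    // Bug: copies claimed_len bytes even if the payload is shorter,
    // leaking whatever happens to sit next to it in memory.
    std::memcpy(reply, payload, claimed_len);
}

// The fix is a single comparison - which is exactly why it keeps
// getting missed in review: clamp to the bytes actually received.
void handle_echo_checked(const std::uint8_t* payload, std::size_t actual_len,
                         std::size_t claimed_len, std::uint8_t* reply) {
    std::size_t n = (claimed_len < actual_len) ? claimed_len : actual_len;
    std::memcpy(reply, payload, n);
}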

You still haven't come up with any specific, real criticisms of Unix. There are a significant number of desktop Linux users who don't have any problems keeping their software up to date without breaking things, without compiling anything from source, and without getting confused by the GUI. It really does work remarkably well, despite its flaws, and not just by band-aiding the symptoms - the design does get some things right from the start. In fact, many of Brendan's criticisms are specific to his particular installation of Gentoo.

Let's be clear here: I don't mind criticism of Unix. I do plenty of it myself. My own operating system project really doesn't resemble it at all. What I don't like is unbounded utopian thinking that ignores all the problems that the Unix design (or any existing-and-crufty design) solves (no, not creates-and-band-aids, but solves by its design). That kind of thinking is very tempting for programmers to fall into - "our code right now is just crap, we need to rewrite it and everything will be better." Every concrete suggestion for "improvement" on Unix in this thread has ignored or dismissed the problems it solves, but any rewrite is going to have to deal with the exact same real world as its predecessor, and you haven't even attempted to deal with that in design, let alone in code.

Quit trying to defend yourself point by point, making yourself a martyr who's been attacked for trying to improve things and whose words have been twisted, and then complaining that you don't have time to write a real defense of your ideas. Just explain exactly what problems you have with current systems and how you would solve them instead, and then we can have a real discussion. Until then, nobody's really going to take your complaining seriously, because they simply don't have those problems.