I can't believe it's been a year since I posted in this thread.
It doesn't feel like that long at all, although I think the way I express myself has changed in that time. Come to think of it, the way I think about development has changed a lot over the past few years.
Hi Vania! Your post interested me.
However, I'm not at all convinced of the security of systems based on virtual machines. I really don't see how they'd be any more than a temporary inconvenience to crackers. I recall 9front's lead dev not seeing how they'd offer more isolation than processes, and I could quote some very strong language from OpenBSD's lead dev. Processes themselves make an interesting comparison: by design, there's nothing they can do without the operating system's support, yet there are oversights in both design and implementation, and sometimes we blow holes right through good security practice for reasons of performance or convenience. (I have a story about WebGL on Intel hardware. :s ) It's all the same with VMs.
Last year, moonchild wrote:
eekee wrote:
I have had some ideas for persistence of main memory, but that's really the easy part. Swap, mmap (if always file-backed), and Forth blocks all offer natural methods for persisting whatever is written to them.

It's not necessarily that easy.
Consider crash tolerance in the face of synchronization as one problem. Say process A requests some token from process B, and process B sends the token. Then there's a crash. Process B's state was synced after sending the token, and process A's before receipt. So now, process B remembers giving away the token, but process A doesn't remember receiving it. Process A says ‘can I have a token’ and process B says ‘??? no, I already gave you a token’.
Obviously this is a bit of a contrived example, and there are a number of simple solutions; but the point is that it's not easy to make something like that completely transparent to user code. And if you do make it transparent then you may have to make other compromises like rolling back state that you could otherwise have kept.
Right, thanks. Robust message-passing is certainly going to be interesting. I don't think that's too contrived; the window for the crash is very narrow, but it could happen. (It's like a race condition in that way.) It's also interesting in that the lost token may represent part of a finite resource. I think process B should not consider the token to be in use until process A acknowledges receipt. If A never acknowledges, then the token and the resource it represents should be considered for reuse. Of course, A should absolutely not use the token before it's sent an acknowledgement.
Oh wait... *sigh* Let's say there's a single-use resource, so there's only 1 token which B can give out to 1 process at a time. A requests it, B grants it, but before A can acknowledge, C requests it. Because the token hasn't been acknowledged, it's up for reuse, but then, when A acknowledges it, B has to accept or reject the acknowledgement. Is that the final necessary part of the process or could it get worse? I've run out of brain cells for now.
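Thinking out loud, here's a rough sketch of how I picture B's side of that grant/acknowledge dance, in Go just because it's compact. All the names are invented for illustration, and a real persistent system would also have to journal each of these state changes:

[code]
package main

import "fmt"

// Rough sketch only: names are invented, and a real persistent system
// would have to journal each of these state transitions.

type grantState int

const (
	free    grantState = iota // token not handed out
	granted                   // sent to a requester, not yet acknowledged
	acked                     // receipt confirmed; token genuinely in use
)

type tokenServer struct { // "B" in the discussion above
	state  grantState
	serial int    // bumped on every grant so stale acks can be detected
	holder string // who the current grant went to
}

// grant hands out the token if it's free, or if an earlier grant was
// never acknowledged (that earlier grant is then treated as lost).
func (b *tokenServer) grant(who string) (serial int, ok bool) {
	if b.state == acked {
		return 0, false // token is genuinely in use
	}
	b.state = granted
	b.serial++
	b.holder = who
	return b.serial, true
}

// ack is accepted only if it matches the most recent grant; an ack for
// a superseded grant is rejected and that requester must start over.
func (b *tokenServer) ack(who string, serial int) bool {
	if b.state == granted && b.holder == who && b.serial == serial {
		b.state = acked
		return true
	}
	return false
}

func main() {
	b := &tokenServer{}

	// A requests and B grants, but A never gets to acknowledge.
	aSerial, _ := b.grant("A")

	// C requests; the unacknowledged grant to A is considered reusable.
	cSerial, ok := b.grant("C")
	fmt.Println("grant to C:", ok) // true

	// A's late ack is rejected; C's ack is accepted.
	fmt.Println("A's ack accepted:", b.ack("A", aSerial)) // false
	fmt.Println("C's ack accepted:", b.ack("C", cSerial)) // true
}
[/code]

The serial number is what lets B reject A's late acknowledgement once the token has been handed to C. Whether that still holds up once a crash can interleave with any of these steps is exactly what I'm unsure about.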
I'm not sure I can just say it'll work with cooperative multitasking... I suspect a system crash could reorder process execution... Right! I'm quitting before I get a headache. I still haven't written my reply to Vania.
moonchild wrote:
eekee wrote:
memory leaks and other cruft from programs and the OS never being restarted

Erlang deals with this well.
One for the list of languages to learn. Does it have a notion of garbage-collecting threads? I think Go does, and I vaguely remember Erlang coming up in a Go discussion or two.
moonchild wrote:
eekee wrote:
I would like to deconstruct the whole idea of applications. (Smol bombshell there.) I want components for users to assemble rather than applications which stand apart from the system. [snip] But how to design a good set of app/gui components for users to assemble into applications is another story!

Recommend taking a look at arcan.
Thanks! It's definitely on my list to look at later. I really will have to get around to learning Lua. It shouldn't be hard at all after the languages I've learned in the last few years.
moonchild wrote:
IMO the traditional gui paradigm is super broken. I wrote a bit about this on reddit a few months ago.
I have different responses to different parts of that. Analog input is particularly important for picking items from a list, because the pointer can go straight to the item you want as soon as you see it; stepping the text highlight to the right place instead is something I find disruptive and distracting. On a nicer note, I agree with lumping terminal multiplexers under window management. I'm finding digital input for text editing okay with some caveats: it must be simpler than VI, and preferably written by myself so I know exactly what each command does. But I still find the mouse easier for editing prose. It's the same issue as with lists: the mouse pointer can go exactly where I'm looking without any particular thought. I guess that might not be the same for everyone.