Automatic memory management

Programming, for all ages and all languages.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Automatic memory management

Post by Brendan »

Hi,
embryo2 wrote:
Brendan wrote:What application?
Database, browser, game.

And there are also Notepad, Calculator, or Minesweeper.

For the first list it is important to have tools an application can use to cache its data or code. For the second list, maybe it is better if the OS takes care of the caching. If there are some common libraries (used by Notepad, Calculator, and many other applications) then the library itself can optimize the caching better than the OS can. But it requires some change in the way the libraries are created.

However, some common things are suitable for the OS. And the OS can improve the application's performance even further if it has additional information, such as what an AOT or JIT compiler can provide. It means the OS is destined to include features that help it "manage" the application. Yes, in the end it's a managed environment.
Brendan wrote:From past behaviour it should know only one user logs on, and it should know it's very likely that user is going to use a web browser sooner or later; so it should have pre-fetched most of the files that the web browser would want.
Prefetching "most of the files" is inefficient. The browser can help and tell the OS "I need this and this". But of course, there can be some cooperation.
I really think you don't understand the difference between "pre-fetching" (e.g. before any application is started) and plain old "fetching" (e.g. after a program is started).
embryo2 wrote:
Brendan wrote:It should also pre-fetch the IP addresses for some frequently used domain names
The effect of such a prefetch is negligible. And there should also be some service that frequently refreshes the prefetched data to detect its changes. Such a service requires some memory. The actually prefetched data also requires some memory. The indexes and usage-pattern statistics also require some memory. The final result will be less memory for a 100ms gain in page load speed.
The only difference between free RAM and RAM that's being used to cache data (including pre-fetched data) is that free RAM contributes nothing to performance and is being pointlessly wasted. In both cases, if something more important needs the RAM it can be allocated.
embryo2 wrote:
Brendan wrote:If I was playing a game that gobbles a lot of RAM and then exit the game; the OS should notice there's suddenly a lot of "under-used"/free RAM and start pre-fetching.
Hopefully, it starts the prefetching with some background priority. And also hopefully, I can tell the OS what I want and what I do not want to be prefetched (a bit of extra control over the situation).
IO priorities are standard practice (and are supported and used internally by every major OS). The only real problem is that most programming languages (and/or their standard libraries) are crippled jokes that don't let the programmer explicitly specify priority.
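For illustration, here's a minimal sketch of what "explicitly specify priority" could look like from user space on Linux, which exposes IO priorities through the ioprio_set() system call; older glibc provides no wrapper or header constants, so the values below are copied from the ioprio_set(2) documentation and the whole thing is only a sketch:

Code: Select all

/* Sketch: marking a process's IO as "idle" priority on Linux. There is no
 * portable libc wrapper, so the raw syscall is used and the constants are
 * defined locally to mirror the documented kernel ABI. Illustrative only. */
#define _GNU_SOURCE
#include <sys/syscall.h>
#include <unistd.h>
#include <stdio.h>

/* From the ioprio_set(2) documentation; not exposed by older glibc headers. */
#define IOPRIO_CLASS_IDLE   3
#define IOPRIO_CLASS_SHIFT  13
#define IOPRIO_WHO_PROCESS  1
#define IOPRIO_PRIO_VALUE(cls, data) (((cls) << IOPRIO_CLASS_SHIFT) | (data))

int main(void)
{
    /* 0 = "the calling process"; the IDLE class means "only do this IO when
     * the disk has nothing better to do" -- exactly what a background
     * pre-fetcher wants. */
    if (syscall(SYS_ioprio_set, IOPRIO_WHO_PROCESS, 0,
                IOPRIO_PRIO_VALUE(IOPRIO_CLASS_IDLE, 0)) == -1) {
        perror("ioprio_set");
        return 1;
    }
    /* ... perform low-priority pre-fetch reads here ... */
    return 0;
}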
embryo2 wrote:
Brendan wrote:If I happen to start the web browser while it's pre-fetching the browser's executable the OS just bumps that file up to "high priority".
There are such things as schedulers. The user or the scheduler's developer still knows better what it is for and how to optimize it. And even more, because the scheduler runs not so often (once per day, while I can run a browser many times a day) it can be in a "poorly prefetched" state. But when it starts, its disk usage can break my uninterrupted work in the browser. That's actually the case for some antivirus software and its scheduled updates on my PC (it also wants to do all it can at boot time, when it seems the OS allows you to start working but there's this ugly thing running...).
I have no idea what you're saying here. Are you talking about the scheduler (that schedules CPU time), or any of the many IO schedulers, or a "job scheduler" (like Cron)? Note that none of these run once per day.
embryo2 wrote:
Brendan wrote:Note that Windows has supported something roughly like this since Vista (they call it "SuperFetch")
I know it. And the prefetch is not working "as expected". Maybe there are some other problems with Windows and it's actually not the prefetch's fault, but there's something that starts periodically and slows things down in a very noticeable manner. In the end I just turned prefetching off. At least now I can control what happens and when. But the general direction of prefetching development in different OSes is good.
Pre-fetching is the sort of thing that would take a fair bit of "trial and error" and tuning to get right; but I've never noticed anything periodically slowing down without an obvious cause (e.g. garbage collectors).
embryo2 wrote:
Brendan wrote:You start a text editor, and it notices you're editing a source code of some kind and (because it's been profiling user behaviour in the background when it's not even running) the text editor decides to start pre-fetching the compiler?
In fact I do not use a great mix of applications for development. It's just Eclipse, and most of the time there's nothing else. But yes, sometimes an external watching entity can speed things up. The only problem here is the actual value we can get from it. A speed increase of 100ms when the start time is 1 second is not visible. But if it's something like "from 15 seconds to 3 seconds", then yes, it's interesting. And again, if it buys start time at the cost of interrupting my work with other software, then I prefer not to make that trade.
The advantage could be anything from no speed-up to massive speed-ups; and the disadvantage (assuming it's implemented properly - e.g. low priority IO with the ability to cancel) is a little more power consumption (where power consumption is something the OS could/should take into account when deciding what/if to pre-fetch).

If you've disabled pre-fetch; it should be fairly easy to estimate the performance difference it could make. For example, boot the computer and measure how long it takes to start eclipse (when none of eclipse's files are in the OS's file cache), and then close eclipse and start it a second time (when all of eclipse's files are in the OS's file cache). I'm not sure about eclipse (it's written in Java so there'd be a huge amount of bloat during application start-up that pre-fetching can't avoid); but I do know that things I use are noticeably slower after rebooting (e.g. starting a web browser, my own project's build utility, etc), mostly because Linux is extremely bad at pre-fetching.
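As a rough sketch of that measurement (the default path is only a placeholder; point it at whatever large file your application actually loads), timing the same read twice shows the gap between a cold cache and a warm one, which is roughly the gap a successful pre-fetch could close:

Code: Select all

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>

/* Read an entire file and return how long it took, in seconds. */
static double read_whole_file(const char *path)
{
    char buf[1 << 16];
    struct timespec t0, t1;
    int fd = open(path, O_RDONLY);
    if (fd == -1) {
        perror("open");
        exit(1);
    }

    clock_gettime(CLOCK_MONOTONIC, &t0);
    while (read(fd, buf, sizeof buf) > 0)
        ;                               /* just drag the file through the cache */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    close(fd);

    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(int argc, char **argv)
{
    /* Placeholder default; pass the executable (or any big file) you care about.
     * Run it soon after boot so the first read really is a cold one. */
    const char *path = argc > 1 ? argv[1] : "/opt/eclipse/eclipse";

    printf("cold(ish) read: %.3f s\n", read_whole_file(path));
    printf("warm read:      %.3f s\n", read_whole_file(path));
    return 0;
}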
embryo2 wrote:
Brendan wrote:It's about interactions between many pieces (where swap is just one of the pieces involved). If you only look at swap in isolation then you end up with naive misconceptions.
Well, the initial question was about the swap use cases.
The initial statement was "The swap is a very ugly thing of the past. It ruins the uninterrupted flow of user's work. It just should be removed.".

Note that it's actually common for users to think "Hey, I've got plenty of RAM so I don't need swap space" because they don't understand the role swap plays in a well designed system (and this is made worse by some old and/or poorly designed OSs that don't use swap space and/or pre-fetching very well). We are supposed to be OS developers and not just normal users, so we should know better.
embryo2 wrote:But the cooperation of the applications and the OS is something I really support. And the management of an application's life cycle can be performed optimally if the OS has more information about the application, whether it be some annotations, AOT/JIT output, or whatever else relates to the ability of the OS to introspect the code and its usage.
The OS doesn't need to inspect the application's code. There's always something that can be hooked to gather statistics (the VFS's "open file" code, the OS's DNS service, etc).
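As a purely illustrative user-space stand-in for that idea (a real OS would do the counting inside its VFS), an LD_PRELOAD shim around open() shows how little information is needed to drive pre-fetch decisions:

Code: Select all

/* Sketch: log every path a program opens, as a user-space approximation of
 * "hook the VFS open path and gather statistics". A pre-fetcher could later
 * rank the logged paths by frequency and read the popular ones at boot or
 * idle time.
 * Build: gcc -shared -fPIC -o open-shim.so open-shim.c -ldl
 * Use:   LD_PRELOAD=./open-shim.so some_program                          */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <fcntl.h>
#include <stdarg.h>
#include <stdio.h>

int open(const char *path, int flags, ...)
{
    static int (*real_open)(const char *, int, ...);
    if (!real_open)
        real_open = (int (*)(const char *, int, ...))dlsym(RTLD_NEXT, "open");

    /* One line per open(); written to stderr to keep the shim simple. */
    fprintf(stderr, "open-stats: %s\n", path);

    if (flags & O_CREAT) {
        va_list ap;
        va_start(ap, flags);
        mode_t mode = va_arg(ap, mode_t);
        va_end(ap);
        return real_open(path, flags, mode);
    }
    return real_open(path, flags);
}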


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
azblue
Member
Posts: 147
Joined: Sat Feb 27, 2010 8:55 pm

Re: Automatic memory management

Post by azblue »

I don't understand the contempt for swapping that I've seen from a few OSdevers over the last few weeks.

Consider OS1, which doesn't support swapping, and OS2, which does, running WorkloadA, which requires less ram than the computer has, and WorkloadB, which requires more ram than the computer has.

OS1 runs WorkloadA quickly.
OS2 also runs WorkloadA quickly -- since the demands on ram do not exceed actual ram, the OS isn't swapping anything, and so there is no difference from OS1.

WorkloadB, on the other hand, is a different story. OS1 refuses to run it. Meanwhile, OS2 runs it just fine, albeit slower than WorkloadA.

Is there any reason refusing to do what the user asked is superior to doing it slowly?

Should the OS refuse to boot on monitors under 32 inches, lest the user be bothered by a small screen?

Should the OS refuse to boot on computers lacking SSDs connected via NVMe, lest the user be bothered by slow access to files?

Should the OS refuse to boot with less than 16GB ram, lest the user be bothered by a small amount of ram?

Should the OS refuse to boot on processors under 3 GHz, lest the user be bothered by a slow computer?

If we take "kill the swap file because it's slow" to its logical conclusion, the answer to all these ridiculous questions is yes.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: Automatic memory management

Post by Combuster »

azblue wrote:Is there any reason refusing to do what the user asked is superior to doing it slowly?
Your argument seems to exclude the possibility that any blame is to fall on the developer of the memory-hogging software.
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Automatic memory management

Post by Brendan »

Hi,
azblue wrote:I don't understand the contempt for swapping that I've seen from a few OSdevers over the last few weeks.

Consider OS1, which doesn't support swapping, and OS2, which does, running WorkloadA, which requires less ram than the computer has, and WorkloadB, which requires more ram than the computer has.

OS1 runs WorkloadA quickly.
OS2 also runs WorkloadA quickly -- since the demands on ram do not exceed actual ram, the OS isn't swapping anything, and so there is no difference from OS1.

WorkloadB, on the other hand, is a different story. OS1 refuses to run it. Meanwhile, OS2 runs it just fine, albeit slower than WorkloadA.
It's not quite that simple. For example; OS2 might start WorkloadA a lot faster because it sent unimportant pages to swap space to increase the number of files it can keep in its file system cache.

It's the possibility of increasing performance (even when there is enough RAM) that people don't really understand.
Combuster wrote:
azblue wrote:Is there any reason refusing to do what the user asked is superior to doing it slowly?
Your argument seems to exclude the possibility that any blame is to fall on the developer of the memory-hogging software.
If the user knows an application hogs a huge amount of memory for no sane reason, but the user asks the OS to run that process anyway; is there any reason refusing to do what the user asked is superior to doing it slowly?


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
SpyderTL
Member
Posts: 1074
Joined: Sun Sep 19, 2010 10:05 pm

Re: Automatic memory management

Post by SpyderTL »

azblue wrote:I don't understand the contempt for swapping that I've seen from a few OSdevers
I turn off paging on all of my Windows machines, because I've never needed it. I've never had one application crash due to being out of memory. I have had Windows "warn" me on occasion that I was "low on memory", which I usually just ignore.

I might use paging in Windows if I thought that it would only be used as a last resort, but it always feels like it's paging at the worst time, like right when I'm trying to open a new application.

I disable all prefetch/superfetch functionality as well, for similar reasons. I always feel like I'm waiting on it to finish loading stuff so that it will be faster at some later point, while making me wait right now.

I think the problem is that my usage isn't what you might call "typical". First of all, I shut down my PC when I'm not using it. I like the simplicity of resetting the machine to a known state on every use. So, every time I need to use my PC, I'm cold booting.

Second, I typically only use 3 applications during a "session": Visual Studio, Outlook, and Internet Explorer. All three of these fit in memory with 4GiB or more of RAM, so right off the bat there is no need for swapping anything to disk. This is also the order that I open these applications in, and Visual Studio takes the longest to load of the three, especially when you consider the additional time of opening a project. So, what I DON'T want is for Windows to try to be "helpful" by pre-loading Outlook and Internet Explorer from the (slow) hard drive WHILE I'm waiting for Visual Studio to load my project.

Perception is everything. The reason that every application you use has a progress bar is because, psychologically, people don't like open ended wait states. In reality, progress bars make no difference, and actually take additional CPU cycles to render, but try taking them away and see how people react.

Let me put it another way. Let's say you go to the DMV to get your license. Imagine that you walk in and you are the only person in line, and there is one person behind the counter. That person is helpful, and answers all of your questions and then hands you a license, and then you leave.

Now imagine that you go to the post office. You walk in, and there are 20 people in line, and 20 people behind the desk. But only certain workers can perform certain tasks, so you have to figure out which line to stand in. And every time you ask a question, you have to go to the back of the line and wait until someone is available to give you the answer.

Technically, the second scenario may be more efficient, in that more "people" are served in the same amount of time, but that doesn't matter to the person that is standing in line. They would still rather be the only person in the room.
Project: OZone
Source: GitHub
Current Task: LIB/OBJ file support
"The more they overthink the plumbing, the easier it is to stop up the drain." - Montgomery Scott
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: Automatic memory management

Post by Brendan »

Hi,
SpyderTL wrote:
azblue wrote:I don't understand the contempt for swapping that I've seen from a few OSdevers
I turn off paging on all of my Windows machines, because I've never needed it. I've never had one application crash due to being out of memory. I have had Windows "warn" me on occasion that I was "low on memory", which I usually just ignore.
I think Windows only allows you to disable swap space, and still does all the other stuff (e.g. freeing an unmodified memory-mapped file's pages because it can get them from disk again if necessary, etc).
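The general mechanism is easy to demonstrate on Linux: a clean, file-backed mapping can be discarded and the kernel will simply re-read the pages from the file if they're touched again, with no swap space involved. A small sketch (the file path is just a placeholder):

Code: Select all

#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/etc/hostname", O_RDONLY);   /* placeholder: any small, non-empty file */
    if (fd == -1) { perror("open"); return 1; }

    struct stat st;
    fstat(fd, &st);
    char *p = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }

    printf("first byte: %c\n", p[0]);           /* faulted in from the file */

    /* Tell the kernel these pages aren't needed. For a clean file-backed
     * mapping nothing is written anywhere -- the pages are simply freed,
     * and touching them again just re-reads the file.                    */
    madvise(p, st.st_size, MADV_DONTNEED);

    printf("first byte again: %c\n", p[0]);     /* faulted back in from the file */

    munmap(p, st.st_size);
    close(fd);
    return 0;
}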
SpyderTL wrote:I might use paging in Windows if I thought that it would only be used as a last resort, but it always feels like it's paging at the worst time, like right when I'm trying to open a new application.

I also disable all prefetch/superfetch functionality as well, for similar reasons. I always feel like I'm waiting on it to finish loading stuff so that it will be faster at some later point, while making me wait right now.
There are only 2 cases:
  • It's implemented properly (e.g. with IO priorities, the ability to cancel an "in progress" operation, and the ability to adjust an operation's priority while it's pending or in progress); where it should never make performance worse.
  • It's not implemented properly; and may make performance worse.
The thing is, nobody (at least nobody that I could find, including myself) actually tests it properly (e.g. in a way that eliminates unrecognised biases). They just make random assumptions and enable/disable it based on their own wishful thinking, and congratulate themselves on fiddling with a knob because they feel better after fiddling (even if it made things worse).

Of course if proper testing was done, we'd only find out if Windows implemented it properly (and not how useful it is/isn't when implemented properly).
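To make "implemented properly" a bit more concrete, here's a minimal sketch of the bookkeeping such a pre-fetcher needs: every pending request carries a priority, can be cancelled, and can be promoted when an application suddenly wants the data right now. All of the names are made up for illustration; this isn't any real OS's API:

Code: Select all

#include <stdbool.h>
#include <stddef.h>
#include <string.h>

enum io_priority { IO_PRIO_IDLE = 0, IO_PRIO_NORMAL = 1, IO_PRIO_URGENT = 2 };

struct io_request {
    const char      *path;       /* what to fetch                              */
    enum io_priority prio;       /* may be raised while the request is pending */
    bool             cancelled;  /* set when the data is no longer wanted      */
};

/* The disk thread always takes the most urgent request that hasn't been
 * cancelled; idle-priority pre-fetches only run when nothing else is waiting. */
static struct io_request *next_request(struct io_request *q, size_t n)
{
    struct io_request *best = NULL;
    for (size_t i = 0; i < n; i++) {
        if (q[i].cancelled)
            continue;
        if (!best || q[i].prio > best->prio)
            best = &q[i];
    }
    return best;
}

/* "An application just asked for this file": promote the pending pre-fetch
 * instead of issuing a duplicate read at normal priority.                   */
static void promote(struct io_request *q, size_t n, const char *path)
{
    for (size_t i = 0; i < n; i++)
        if (!q[i].cancelled && strcmp(q[i].path, path) == 0)
            q[i].prio = IO_PRIO_URGENT;
}

/* Giving up on a pre-fetch (e.g. the RAM is wanted for something else). */
static void cancel(struct io_request *req)
{
    req->cancelled = true;
}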
SpyderTL wrote:I think the problem is that my usage isn't what you might call "typical". First of all, I shut down my PC when I'm not using it. I like the simplicity of resetting the machine to a known state on every use. So, every time I need to use my PC, I'm cold booting.
I do this too. I turn my Windows computer on, then do something else while it's booting, and sooner or later (sometimes after half an hour) I start using it (and start wondering why the crappy OS failed to prefetch things I'm likely to use, even though prefetch/superfetch is enabled and even though the computer has about 30 GiB of RAM doing almost nothing).
SpyderTL wrote:Second, I typically only use 3 applications during a "session": Visual Studio, Outlook, and Internet Explorer. All three of these fit in memory with 4GiB or more of RAM, so right off the bat there is no need for swapping anything to disk. This is also the order that I open these applications in, and Visual Studio takes the longest to load of the three, especially when you consider the additional time of opening a project. So, what I DON'T want is for Windows to try to be "helpful" by pre-loading Outlook and Internet Explorer from the (slow) hard drive WHILE I'm waiting for Visual Studio to load my project.
Do you honestly think Windows is so stupid that it'd do a low priority "background prefetch" when there are higher priority "an application needs this ASAP" fetches waiting?
SpyderTL wrote:Let me put it another way. Let's say you go to the DMV to get your license. Imagine that you walk in and you are the only person in line, and there is one person behind the counter. That person is helpful, and answers all of your questions and then hands you a license, and then you leave.

Now imagine that you go to the post office. You walk in, and there are 20 people in line, and 20 people behind the desk. But only certain workers can perform certain tasks, so you have to figure out which line to stand in. And every time you ask a question, you have to go to the back of the line and wait until someone is available to give you the answer.

Technically, the second scenario may be more efficient, in that more "people" are served in the same amount of time, but that doesn't matter to the person that is standing in line. They would still rather be the only person in the room.
Imagine you're walking along the side-walk and a fat man in a chicken costume falls out of the sky, lands on you, and breaks your neck. This is far worse!

Technically, none of the scenarios above have anything at all to do with swap or pre-fetching.

Now imagine you go to a pizza shop and order a large supreme pizza and garlic bread. The pizza shop staff are morons who have been doing nothing for the last 30 minutes, so they start by putting away stuff they used for the previous order to free up some counter space, then they begin making pizza dough because they were too stupid to pre-make any, then they put the tomato sauce on the dough because they weren't smart enough to do that when they had nothing better to do either, then they start putting the other ingredients on the pizza because "We sell at least 50 large supreme pizzas every night" wasn't a big enough hint for these people and they don't have ~5 large supreme pizzas stashed away in the fridge ready to put into the oven at a moment's notice. If they put things away when there's nothing better to do to free up space (equivalent to sending stuff to swap space), and if they pre-made things they know they're going to need (equivalent to pre-fetching) you might have your pizza in 10 minutes instead of having to wait for an hour.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Automatic memory management

Post by onlyonemac »

Brendan wrote:
SpyderTL wrote:Second, I typically only use 3 applications during a "session": Visual Studio, Outlook, and Internet Explorer. All three of these fit in memory with 4GiB or more of RAM, so right off the bat there is no need for swapping anything to disk. This is also the order that I open these applications in, and Visual Studio takes the longest to load of the three, especially when you consider the additional time of opening a project. So, what I DON'T want is for Windows to try to be "helpful" by pre-loading Outlook and Internet Explorer from the (slow) hard drive WHILE I'm waiting for Visual Studio to load my project.
Do you honestly think Windows is so stupid that it'd do a low priority "background prefetch" when there are higher priority "an application needs this ASAP" fetches waiting?
Actually, Windows *is* stupid and *does* slow things down by fetching stuff that isn't needed just yet. That's why a Linux system with an optimized boot process starts quickly, whereas a Windows system takes a long time to start by fetching loads of stuff that the user doesn't need just yet. There isn't really a disk access priority system in Windows.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
Muazzam
Member
Posts: 543
Joined: Mon Jun 16, 2014 5:59 am
Location: Shahpur, Layyah, Pakistan

Re: Automatic memory management

Post by Muazzam »

I'm sorry for the off-topic post, but am I just too dumb to ever get things like garbage collection? I think it's just unnecessarily complex. I don't use memory management at all in my OS. Anyway, what else could you expect from someone with an IQ of 83?
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: Automatic memory management

Post by iansjack »

onlyonemac wrote:That's why a Linux system with an optimized boot process starts quickly, whereas a Windows system takes a long time to start by fetching loads of stuff that the user doesn't need just yet. There isn't really a disk access priority system in Windows.
My Windows 10 installs start far quicker than any of my Linux ones.
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: Automatic memory management

Post by Rusky »

...and for another anecdote, my linux install on real hardware starts an order of magnitude faster than my Windows 10 install in a VM that paravirtualizes everything but the hard drive. :roll:
onlyonemac
Member
Posts: 1146
Joined: Sat Mar 01, 2014 2:59 pm

Re: Automatic memory management

Post by onlyonemac »

iansjack wrote:
onlyonemac wrote:That's why a Linux system with an optimized boot process starts quickly, whereas a Windows system takes a long time to start by fetching loads of stuff that the user doesn't need just yet. There isn't really a disk access priority system in Windows.
My Windows 10 installs start far quicker than any of my Linux ones.
At least once my Linux system has started it's actually ready for use, and not still ridiculously slow because it's trying to impress me by making it look like the web browser can open in half a second (actually the web browser on my Linux system *can* open in half a second, but that's a different matter...).
Rusky wrote:...and for another anecdote, my linux install on real hardware starts an order of magnitude faster than my Windows 10 install in a VM that paravirtualizes everything but the hard drive. :roll:
Probably because the bottleneck in the Windows system is the excessive hard drive activity. Paravirtualise the hard drive, but nothing else, and your Windows system will probably boot *very* quickly.
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing
iansjack
Member
Posts: 4685
Joined: Sat Mar 31, 2012 3:07 am
Location: Chichester, UK

Re: Automatic memory management

Post by iansjack »

Don't you just hate religious arguments? Big-endians and little-endians all over again.
Rusky
Member
Posts: 792
Joined: Wed Jan 06, 2010 7:07 pm

Re: Automatic memory management

Post by Rusky »

onlyonemac wrote:
Rusky wrote:...and for another anecdote, my linux install on real hardware starts an order of magnitude faster than my Windows 10 install in a VM that paravirtualizes everything but the hard drive. :roll:
Probably because the bottleneck in the Windows system is the excessive hard drive activity. Paravirtualise the hard drive, but nothing else, and your Windows system will probably boot *very* quickly.
Err, I may have gotten the terminology backwards. The hard drive is real and passed straight through; all the other devices are virtual.
SpyderTL
Member
Posts: 1074
Joined: Sun Sep 19, 2010 10:05 pm

Re: Automatic memory management

Post by SpyderTL »

muazzam wrote:I'm sorry for the off-topic post, but am I just too dumb to ever get things like garbage collection? I think it's just unnecessarily complex. I don't use memory management at all in my OS. Anyway, what else could you expect from someone with an IQ of 83?
Normally, when you are done using a block of memory, you give it back to the OS memory manager.

With garbage collection, you don't give the memory back when you are done with it. The memory manager "detects" that the memory is no longer needed, and "collects" the memory itself.

Pretty simple concept. Making it work is a little bit more difficult, because your application has to be written slightly differently.
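For anyone who wants to see the "detects and collects" part spelled out, here's a toy mark-and-sweep collector. It's deliberately simplified (the program registers its roots and child pointers explicitly, where a real collector has to discover both on its own), but it shows memory being freed without the program ever calling free():

Code: Select all

#include <stdlib.h>

#define MAX_CHILDREN 4

struct object {
    int            marked;
    struct object *children[MAX_CHILDREN]; /* references this object holds    */
    struct object *next;                   /* every allocation is linked here */
};

static struct object *all_objects;         /* allocation list, for the sweep  */

/* The program allocates but never frees; freeing is the collector's job. */
struct object *gc_alloc(void)
{
    struct object *o = calloc(1, sizeof *o);
    if (!o)
        return NULL;
    o->next = all_objects;
    all_objects = o;
    return o;
}

/* Mark phase: everything reachable from a root is "still needed". */
static void mark(struct object *o)
{
    if (!o || o->marked)
        return;
    o->marked = 1;
    for (int i = 0; i < MAX_CHILDREN; i++)
        mark(o->children[i]);
}

/* Sweep phase: anything left unmarked is garbage and gets freed
 * without the program ever having called free() on it.           */
void gc_collect(struct object **roots, int nroots)
{
    for (int i = 0; i < nroots; i++)
        mark(roots[i]);

    struct object **p = &all_objects;
    while (*p) {
        if (!(*p)->marked) {
            struct object *dead = *p;
            *p = dead->next;
            free(dead);
        } else {
            (*p)->marked = 0;          /* reset for the next collection */
            p = &(*p)->next;
        }
    }
}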
Project: OZone
Source: GitHub
Current Task: LIB/OBJ file support
"The more they overthink the plumbing, the easier it is to stop up the drain." - Montgomery Scott
SpyderTL
Member
Posts: 1074
Joined: Sun Sep 19, 2010 10:05 pm

Re: Automatic memory management

Post by SpyderTL »

Brendan wrote:Do you honestly think Windows is so stupid that it'd do a low priority "background prefetch" when there are higher priority "an application needs this ASAP" fetches waiting?
I honestly think that a non-SSD drive can only read one block at a time, and that once the command is executed, there is no way to interrupt it with a higher priority command. And unless the low priority data is right next to the high priority data, you're going to be waiting on the drive head to find the low priority data, read it, and then find your high priority data.

A good approach to prefetching would be to wait for no activity from the user for, say, 5 minutes, and then start loading stuff. Or better yet, wait until the machine is locked.
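A sketch of that policy on Linux might look like the following. readahead(2) is a real call that only populates the page cache; the idle check and the file paths are made-up placeholders standing in for whatever the input/session layer and usage statistics would actually provide:

Code: Select all

#define _GNU_SOURCE
#include <fcntl.h>
#include <unistd.h>

#define IDLE_THRESHOLD (5 * 60)      /* five minutes, per the suggestion above */

/* Placeholder: a real implementation would ask the input/session layer
 * how long the user has been idle (or whether the screen is locked).    */
static int seconds_since_last_input(void)
{
    return IDLE_THRESHOLD;
}

/* Pull a whole file into the page cache without keeping its contents. */
static void prefetch_file(const char *path)
{
    int fd = open(path, O_RDONLY);
    if (fd == -1)
        return;
    off_t size = lseek(fd, 0, SEEK_END);
    if (size > 0)
        readahead(fd, 0, (size_t)size);
    close(fd);
}

int main(void)
{
    /* Placeholder paths -- in practice these would come from usage statistics. */
    const char *likely_next[] = { "/usr/lib/example-mail-client", "/usr/lib/example-browser" };

    for (;;) {
        if (seconds_since_last_input() >= IDLE_THRESHOLD) {
            for (unsigned i = 0; i < sizeof likely_next / sizeof likely_next[0]; i++)
                prefetch_file(likely_next[i]);
        }
        sleep(30);                   /* re-check every half minute */
    }
}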
Project: OZone
Source: GitHub
Current Task: LIB/OBJ file support
"The more they overthink the plumbing, the easier it is to stop up the drain." - Montgomery Scott