How to avoid patching good programs into ****?
There was this topic about good software that was updated into totally useless crap. I've personally noticed this problem numerous times, in commercial, freeware and open source software.
I beg you not to list examples where that has happened. That is not my intention with this "followup" thread on the issue. Instead, I'd like people to tell whether they know why this is so hard to avoid, what the common reasons are for it to happen, and how to prevent it.
I can personally think of at least a few things. For one, all (or most) of these updates that mess things up seem to be introducing some new features. Maybe it's just a new dialog, or some "improved" flexibility somewhere, but in all cases, it seems that the features introduced aren't "gradual improvements" but instead "radical new features" or "advanced new frameworks" or whatever.
I even dare to suggest that when such "radical" and "advanced" improvements are built piece by piece, without replacing entire parts of the program/system, this problem seems to be much rarer...
Finally, is there anything from the "programming" point of view that could be done? Keeping the old models around as an option doesn't seem like that good an idea in the long run, so are there other approaches?
Re:How to avoid patching good programs into ****?
Seeing as most of those programs were updated to support either:
- Stupid users
- Industries (DRM)
- Making it easier for the author
They all make it harder for me to use. If you make it easy for the music industry to "protect their rights", I have problems even listening to the music and will thus not use the software. I'd rather use a hack around it to be able to listen. This also goes for Adobe PDF: some files just plain open in v5 and not in v6. This means that for some documents, I can no longer print them -> less functionality.
Stupid users: Try MS Office. Newer versions do not replace selected text with the stuff you type, but place the typed text before it. This costs me an entire way of working, since replacing worked in all versions before, and in all other applications too. Some noob might be surprised once, but he'll get the hang of it. It was a nice feature and now it's gone. Same with newer versions of Winamp: they have a special "modern" skin. It is not in any way more usable, it's less recognisable (the old skin is SO well known) and it can be used for exploits.
Making it easier for the author: If the author always uses function X when using the program, making the program always use X makes it easier for him/them. The example is again MS Office, where your words are automatically corrected to whatever language it thinks you're typing, and it then doesn't detect that you were typing a different language (since it auto-changed those words). No US developer is ever going to notice that... Your email addresses and web addresses are automatically made blue, images are thrown about the pages like it's nothing, and all I really want is for it not to crash while I work on my document. During my internship it must've crashed at least 20 times.
Re:How to avoid patching good programs into ****?
- Use a coherent coding style.
- Have some users *test* the introduced "add-on" - especially in terms of usability and accessibility - and check whether that thing is needed at all. Innovation or not, if it's only eye candy to lure attention away from some crucial weakness, honestly, no one needs that.
- Have specifications on how to patch: documentation, planning, doing some portfolios and flowcharts to find out which situations require a new feature or patch.
- Never disturb users with some new and extraordinarily clever function. Users like things to act the way they are used to. They expect things to be readily explorable - to fit into their view of the world.
- Ask the following: is the patch necessary? Is the quality of the entire product the same as before (if not, the market should - and in former times it did - wipe out unneeded/inappropriate things)?
The times when innovation meant doing some really cracked-out stuff - instead of coloured status bars - seem to be gone. At least for the average user. Nowadays they sell each and every goddamn piece of crap which slows down even the best system - as innovation. Damned shall they be for selling such crap.
*Is this creative enough?* Or dost thou wish a longer rant about how to avoid patching sins (it's a sin ... I am a sinner ... oh yeah, it's a sin)?
... the osdever formerly known as beyond infinity ...
BlueillusionOS iso image
Re:How to avoid patching good programs into ****?
Why?
Programmers like to program? Good for the ego? Lack of clues? They get a kick out of making the program look like a Swiss army knife? Perhaps it's not that it's hard to avoid, maybe it's just that it's easy to implement?
In the case of commercial development:
Perhaps the client wants every feature anyone may ever need? Lacks clues? The developers just get paid to develop?
How to avoid?
- If you're thinking about how to avoid it, you're already on the right track.
- Remember the objectives, particularly those of the user. Know how the user uses your program.
- Use common sense and critically evaluate each feature, new and old, in view of those objectives.
- See "The Humane Interface" by Jef Raskin; he discusses various aspects of usability, criticises current practices and explores solutions.
A good example is Winamp and its clones. The user interface is very good.
Considering my objectives (to play some music in the background), you can hardly do better; just look at it.
Winamp in its present shape has been around for many years.
Now compare that to other "media players": they were never really good at that to start with (lacking a simple thing called a playlist, or making it too complicated), and instead grew all sorts of features that don't add to everyday usefulness, but serve to confuse and distract.
Now, the big companies haven't followed that example, and probably won't, because (assuming they are even aware of the problem) they won't admit that they missed the point entirely, or are afraid of being accused of copying, or have their own agenda (selling and promoting stuff), or just don't care. Good luck getting users to use their players.
These aren't exactly cases of a "good thing gone bad", but IMHO worth mentioning as examples of good things and bad things in the same area.
Re:How to avoid patching good programs into ****?
I think the case with commercial software is that:
- Software sells better if it has more features
- Enough promotion will always sell
- Most features take little time. Only complex features take a lot of time
Since the software A. sells, B. can get more features, and C. can get more promotion, they reason that D. it will sell more. Of course, it does work in a way, the same way that everybody keeps asking me why I don't have one top-notch computer instead of 7 measly 4-8 year old computers. This includes the laptops; the quickest one has a 366 MHz processor. They buy the new system, it has more features, they're happy for a while but can't actually put it to use. But by the time they've gotten over the "it has more features!" rage of the old product, there's a new product doing pretty much the same thing that has even more features. So they switch, the original adds 4 features, they switch back... etc etc etc.
Of course, this has no relation to real computer users. The problem with computer users is that their number shrinks very quickly. Actually using a computer, as opposed to merely having one and playing games / using features, is a near-incomprehensible difference. For comparison, I have spent barely 300 euros on my computers this year, all together, where others spend hundreds more because they wanted a top-of-the-bill processor. A mere year later their processor isn't impressive, like mine, but my purse is a lot fatter. I can use my computers for the exact same things they do even though theirs were a lot more expensive.
People don't realise this. If their old computer can do X, and the new computer has loads more features, why not buy the new one? This is also part of the "American reasoning", if I may define that term. It comes down to going on a shopping spree during a sale rather than saving money by not shopping at all. If you weren't going to buy a sweater anyway, you're still wasting money. Same with computers: if a new computer can do lots more, it's instantly "better" than the old one that couldn't do that. The old one could do everything needed, might even work better than the new one, and in a few cases actually is the only one used later on, but since the new one can do more, it's better. And if it's better, you should get it.
Reasoning from this, quitting the American reasoning would be the solution to the bloated-program problem. Programs bloat because people want them to (see the above paragraph for why). If people stop wanting programs to bloat, they will stop bloating and focus on actually improving the function they have. As nice and simple as this might sound, it actually isn't. Don't forget that computer users are still a small minority, and the overwhelming majority of people are also those that buy a BMW or a Mercedes because it's "better" than a Fiat or a Volvo. It isn't better in that it does anything they use it for better; it's better because it can do things they never use. Seeing as the car industry has existed for something like 100 years and still has lots of brands like BMW, Mercedes, Aston Martin, Rover, Ferrari and a bunch of others that I don't even know but are more expensive, this is probably an unreachable goal.
Re:How to avoid patching good programs into ****?
Then how to remedy it? How does one remedy a sickness that spreads across the globe, where half the people are infected with it? Well, same as with environmental pollution. You start small, say, your own home. When that's cleaned up, you show others that it can be done, that it's not more complex or expensive, and you give them reasons in their own language, and they might get the idea. I've tried this on a bunch of people at school, and 90% of them missed the point. Most thought I was a poor guy who couldn't afford more than that laptop, and likewise others will think that you can't afford MSOffice 2005, even though you easily could, but really can't find a reason to.
Still, the answer is to start small. Try making a single program of your own that has a well-defined start and end, and that either fixes a long-standing problem (such as having a single home directory, without admin trouble) or makes your life a lot easier (such as having a fixed address for your electronic mail, something we now call a mail server). If the problem is fixed, or life has become easier, lots of people will jump on the bandwagon shortly after you've released the brakes, and it'll steamroller anything in its path (to put a few metaphors in a row). Avoid adding features that do not contribute to the main goal, and if possible use some form of very simple plugin system. For examples, see Firefox (if I see a link to an extension, I click it, approve it, and I have it installed within the minute) and most alternative IM clients.
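A "very simple plugin system" can be sketched in a few lines of C. This is not Firefox's extension mechanism, just a hypothetical static registry (a real loader would fill it via dlopen()/dlsym()); all names here are invented for illustration:

```c
#include <string.h>

/* Hypothetical minimal plugin system: the host knows only this
   interface, never the plugin internals. */
typedef struct {
    const char *name;
    int (*run)(const char *input);
} plugin_t;

/* two toy plugins, registered statically for the sketch */
static int count_chars(const char *input) { return (int)strlen(input); }

static int count_words(const char *input) {
    int words = 0, in_word = 0;
    for (; *input; input++) {
        if (*input == ' ') in_word = 0;
        else if (!in_word) { in_word = 1; words++; }
    }
    return words;
}

static const plugin_t registry[] = {
    { "chars", count_chars },
    { "words", count_words },
};

/* dispatch by plugin name; unknown plugins report -1 */
int run_plugin(const char *name, const char *input) {
    for (size_t i = 0; i < sizeof registry / sizeof registry[0]; i++)
        if (strcmp(registry[i].name, name) == 0)
            return registry[i].run(input);
    return -1;
}
```

The point is that the core program stays small: new functionality lands in the registry, not in the host's code paths.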
In the operating system world, which most people here are familiar with, it's a lot harder. Avoid adding features you don't need for the first release, but DO add the concepts you need to get your general target audience to consider it early. They are the ones who can tell you what's missing and what should be improved. Even if they ask for impossible things, their advice is still needed, as it's the only thing you can rely on for your target. Still, always triple-check even their comments and guard them against themselves. They can ask for a feature that has lots of drawbacks, without realising it.
One other word is worth mentioning. Incompatibility.
If your program is functionally identical to the previous one, people don't care which one they use. If it isn't functionally identical, they will always prefer the old one. Humans learn behaviour, and resist all change to what they've already learnt. Learning something new takes time, but when you're young it takes less time. Why is that?
A young mind doesn't necessarily have a more open vision or anything. It's only not spoiled by ideas others told it, if only because it hasn't been around long enough to be told that much. Older minds learn less quickly because they have had lots of ideas told to them, and must check the validity of each new argument against all of them, even those they found invalid back then. Being lied to doesn't make you learn quicker afterwards. A young mind has few ideas to compare against, will thus usually accept the new one as truth (there's no reason not to), and therefore "learn" quicker.
Knowing this, if you're targeting most age groups, it's very important to get both the external interface right the first time (that is, the GUI plus what each button does) and the low-level access logistics sealed up quite tight. If the GUI grows to accept more than what it did, there's a chance some people won't know it, but it won't break any expectation. If the GUI shrinks, most people won't agree with the change and will stick with the old one. If it is merely different, some will change while some will stick.
The idea from this is that keeping your application decent requires you to get the GUI and the low-level (file) interface right the first time. People are creatures of habit; they will learn one way to use things and prefer not to know any other ways. They prefer still images over moving images, they prefer simple forms over complex forms. They prefer smooth forms over edgy ones (simply put, the mind compresses images similarly to JPEG; images that compress well with JPEG are better for UIs). They like a one-to-one correlation between things, like things to react the way they always did, and all things that worked to still work. This all boils down to the Principle of Least Surprise: if you don't surprise your users, they can use the application a lot faster.
Considering these things, there will be a next post with advice on how to design.
Re:How to avoid patching good programs into ****?
Designing a good user application thus requires you to get it right the first time, to keep things the way they were, not to make abrupt changes, and preferably to keep your application both backward compatible AND forward compatible. Allow users to make the choice whether or not to change to a new version.
Keep your product one product, and refuse all possible attempts by anybody to morph it into something else. If users want a product, they'll want the same product in a newer version, not a different but similar product under the same name.
When designing a user interface for a program, look at how the OS developers intended certain features to be used, and wherever possible use them the same way. If I press a pushbutton in your program, I'm going to expect it to go down when I press the mouse button, go back up when I release it, and perform an action when I release the button. I also expect it to respond to a spacebar press similarly when it has focus (another general concept), and if it is marked as distinct from the others (not just with a fat border, that'd be Microsoft-only) it should respond to pressing Enter. If it doesn't, the application is broken according to the design spec.
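That pushbutton contract can be written down as a tiny state machine. This is a hypothetical sketch, not any real toolkit's API; the struct and function names are made up:

```c
#include <stdbool.h>

/* Sketch of the pushbutton behaviour described above: the button goes
   down on mouse-press, comes back up on release, and fires its action
   only if the release happens over the button. */
typedef struct {
    bool down;   /* is the button visually pressed? */
    int  fired;  /* how many times the action has run */
} button_t;

void button_mouse_press(button_t *b) {
    b->down = true;               /* visual feedback on press */
}

void button_mouse_release(button_t *b, bool over_button) {
    if (b->down && over_button)
        b->fired++;               /* action happens on release, not press */
    b->down = false;              /* always pop back up */
}

/* spacebar while focused acts like a full press-and-release */
void button_key_space(button_t *b) {
    button_mouse_press(b);
    button_mouse_release(b, true);
}
```

Note the escape hatch the contract gives users: press the button, drag off it, and release, and nothing fires. Breaking that is exactly the kind of surprise the post warns against.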
Make your file formats similarly. Put everything where it belongs, not where it's convenient to put it. The most convenient design is a monolithic source file, with one big @$$ window where you do your work. Everything is at hand, and nothing takes more than two or three clicks. Still, this isn't a good design.
Your source is unreadable by humans, and so is your application window. Humans aren't that good at instantly finding something they want; they like a certain amount on screen at a time. Too little is also OK, but too much is overkill. From practice, and from my own viewing of some sites, I've found that in a new "view", 20-25 items should be the max. More and the user is confused. In a familiar view, 30-50 should be OK, and in an experienced view, up to 60-80 can be displayed at a time.
Examples: web browsers are nearly interchangeable, and they all have 5 buttons. They have a field for the URL, and possibly one for searching. They have a menu bar, counting as 7 items in Firefox. This comes to 14 in total, just under 15. Since I count myself as experienced, this is very usable for me. Compare this with OpenOffice, which I started using a week ago: I see 55 items on screen instantly. Not surprisingly, I'm having quite a hard time finding a bunch of things, such as highlighting the background of the text or making a table. As a test to see whether it's just the count, I created an application with 30 buttons placed at random. I had a hard time remembering what each button displayed (text; I wasn't going to make it useful too), because there was no relation to its relative location (near others saying something similar), nor to its absolute location (the top buttons said something, same as the bottom ones). Grouping icons of a similar type together helps users remember where things are, and as such it reduces complexity.
Reduce the complexity of the screen display, but keep everything accessible. This is very hard to get right. Your initial design will probably go through a few iterations before you actually get to something useful for most.
Attempt to get it working similarly to others in the "trade". This does count as stealing the UI, but for users to understand your UI it must look similar to the other guy's UI.
Before I forget, a last thing on the file formats. Keep everything in its place. Draw out once what needs to be in the file, what needs to be in there for each picture, what for each text block, what for each user reference, what for each drive type log (I'm just guessing what you want to make). Also, decide how big and how small they might be. Make the pointers relative to the start of the file (I've seen a few that didn't), allow all lengths up to the absolute max of the fields (zip files that can't hold more than 4GB, for instance), and keep room for future expansion. Don't fix addresses, don't allocate inside your file. Keep your file content clean, and allow even the substructures to grow (look at ELF for instance; it has an explicit "this structure is this big here" field that you are required to use).
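To make that concrete, here is what a header following those rules might look like. The format and all field names are invented for illustration, but the pattern (offsets relative to file start, a per-structure size field so substructures can grow, reserved room) is the same one ELF uses:

```c
#include <stdint.h>

/* Hypothetical on-disk file header following the advice above. */
typedef struct {
    uint32_t magic;          /* identifies the format */
    uint16_t version;
    uint16_t header_size;    /* "this structure is this big here" */
    uint64_t index_offset;   /* relative to file start, never absolute */
    uint64_t index_count;    /* 64-bit: no artificial 4 GB-style limit */
    uint8_t  reserved[32];   /* room for future expansion */
} file_header_t;

/* one entry in the index the header points at */
typedef struct {
    uint16_t entry_size;     /* lets old readers skip newer, bigger entries */
    uint16_t type;
    uint64_t data_offset;    /* again relative to file start */
    uint64_t data_length;
} index_entry_t;
```

A reader that honours header_size and entry_size keeps working when a later version appends fields, which is exactly the forward compatibility argued for earlier in the thread.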
Gotta stop ranting; three full-length subsequent posts should suffice.
Keep your product one product, and refuse all possible attempts by anybody to morph it into something else. If users want a product, they'll want the same product in a newer version, not a different but similar product under the same name.
When designing a user interface for a program, look at how the OS developers intended you to use some features, and wherever possible use them the same way. If I press a pushbutton in your program, I'm going to expect it to go down when I press the mouse button, go back up when I release it, and perform an action when I release the button. I also expect it to respond to a spacebar-press similarly when it has focus (another general concept), and if it is distinct from others (not just a fat border, that'd be Microsoft-only) it should respond to pressing Enter. If it doesn't, the application is broken according to the design spec.
Make your file formats similarly. Put everything where it belongs, not where it's very convenient to put. The most convenient design is a monolithic source file, with one big @$$ window where you do your work. Everything is at hand, and nothing takes more than two or three clicks. Still, this isn't a good design.
Your source is unreadable by humans, and so is your application window. Humans aren't that good at instantly finding what they're looking for; they can take in only a certain amount on screen at a time. Too little is also OK, but too much is overkill. From practice, and from my own viewing of some sites, I've found that in a new "view", 20-25 items should be the maximum. More, and the user is confused. In a familiar view, 30-50 is OK, and in an experienced view, up to 60-80 can be displayed at a time.
For example, web browsers are nearly interchangeable: they all have 5 buttons, a field for the URL, and possibly one for searching. They have a menu bar, which counts as 7 items in Firefox. This comes to 14 in total, which is just under 15. Since I count myself as experienced, this is very usable for me. Comparing this with OpenOffice, which I started using a week ago, I see 55 items on screen at once. Not surprisingly, I'm having quite a hard time finding a bunch of things, such as highlighting the background of the text or making a table. As a test to see whether it's just the count, I created an application with 30 buttons placed at random. I had a hard time remembering what each button displayed (just text; I wasn't going to make it useful too), because there was no relation either to its relative location (near others saying something similar) or to its absolute location (the top buttons said the same sort of thing as the bottom ones). Grouping icons of a similar type together helps users remember where things are, and as such it reduces complexity.
Reduce the complexity of the screen display, but keep everything accessible. This is very hard to get right. Your design will probably go through a few revisions before you actually get to something useful for most people.
Attempt to get it working similarly to others in the "trade". This does count as stealing the UI, but for users to understand your UI, it must look similar to the other guy's.
Re:How to avoid patching good programs into ****?
I think there are good points above, but I find one more common problem: breaking subtle things can break habits, and turn a great interface into a clumsy one.
Take the Gnome Panel. In version 2.2, by default, you had the "start" menu in the top-left corner and the window list in the top-right corner. In 2.4 they still appeared to be there, but with one difference: someone decided that the menu and the window list should be components like everything else, and that it should be possible to move them... which in principle is nice. But in 2.2 you always knew that the "hotspots" of these two menus extended to the very corner of the screen. That's basic usability: making a small area conceptually an "infinite" one, since you can just thrust your mouse into the corner. The problem with 2.4 was that not only was it almost impossible to get the menus into the very corners of the screen, they would also randomly drift a few pixels away from the corner when the panel was loaded again.
Back when I used Gnome, I had a habit of always minimizing windows with a shortcut, and then searching for the window I wanted in the window-list menu. I found it much easier to pick what I needed from a nice menu than to spend screen space on a task list, which becomes totally useless once you have too many windows open. It was nice. I could throw the mouse into the corner and get a menu, and my workflow was very streamlined.
Then I made the error of updating my Gnome to 2.4: suddenly I could no longer access the menu fast enough for it to be practical every time I wanted to switch windows. Needless to say, I dumped Gnome (after waiting a while to see whether the problem might be patched) and switched to something else: evilwm. Not only is it very usable, its design policy pretty much guarantees that there will not be changes that affect the way I work with it.
So it doesn't take big changes to break working software, and I think THIS is the biggest problem. I mean, how do you make sure you don't break things like this?
Of course, I think the solution is knowing the users, and here is the biggest problem with many of these modern desktops: there are so many different ways to do the same thing that it is almost impossible for the developers to even know all the possible workflows people use.
Now, I really think there is one way to avoid problems like the gnome-panel one. It consists (basically) of two concepts: activity-based design (or task-based, or whatever they call it), and the UI designer (whether that is a separate person or simply a way of thinking) always having the last word. After all, tasks are what software enables us to do, and if the UI designer designed the UI, it's his responsibility domain. (I also support the idea that everyone should "own" the module they work on.)
The funny thing is, I think commercial and volunteer development have the same problem here: in commercial development it depends on the company culture, but similarly, I find that the reason Linux (kernel) development traditionally shines is that there are people who each take responsibility for a few parts.
PS. The funniest thing here is that many of us should remember how long it took Microsoft to get the Start-button to extend to the corner...
PPS. About Winamp: it was a great product quite exactly as long as Justin (and Nullsoft in general) still had the final say. As soon as the AOL suits started messing with the development, Winamp started to suck. I believe this is because responsibility for the product was no longer with the person who designed it to be great.
Re:How to avoid patching good programs into ****?
mystran wrote: I think there are good points above, but I find one more common problem: breaking subtle things can break habits, and turn a great interface into a clumsy one.
<gnome-stuff>
So it doesn't take big changes to break working software, and I think THIS is the biggest problem. I mean, how do you make sure you don't break things like this?

I think you do it by making sure you understand all the concepts of the UI you design: you explicitly write down all its aspects, and you check each and every one of them when you finish a new version. More importantly, don't even attempt to "improve" the interface; you won't succeed.
mystran wrote: Now, I really think there is one way to avoid problems like the gnome-panel one. It consists (basically) of two concepts: activity-based design (or task-based, or whatever they call it), and the UI designer (whether that is a separate person or simply a way of thinking) always having the last word. After all, tasks are what software enables us to do, and if the UI designer designed the UI, it's his responsibility domain. (I also support the idea that everyone should "own" the module they work on.)

Even though that's very true, it's also barely possible in the current situation. I've been working on one project where somebody (a graphical designer) was supposed to use the program and I was supposed to make it. He didn't want to understand the development environment enough to sit down and make the UI. In similar fashion, how are you going to find ANYBODY who will just design the UI?
I've taken a potshot at it anyway, some time ago, and dubbed it separating "system modules, modules and interface". This comes down to an interface, which is buttons and highlighting and graphical jokes like that; plus a bunch of modules that actually implement the functionality behind the interface but are not statically linked to it; plus a bunch of system modules that provide access to system-level devices through a controlled method and are linked to the kernel directly. The interface may only call modules; modules may call other modules and system modules.
This lets UI designers design UIs, people who can write code decently write code, and people who are good at interfacing with the system interface with the system. It should pretty much guarantee that any project is built up from those three components, and more importantly, if you want to install an upgrade to an "application", you can choose to install only the upgrades for the modules, keeping 100% the same interface for yourself.
Combine this with a system that gives you the same desktop & interfaces you have at home anywhere, and you can use any computer to do your work, with the interfaces you like to use for it. No more programmers telling me what I can and cannot do.
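A rough sketch of that interface/module split in C, with every name invented for illustration: the interface only ever calls through a module's function-pointer table, so the module behind it can be upgraded or swapped while the interface code stays byte-for-byte the same.

```c
#include <stdio.h>

/* All the interface is allowed to see of any "player" module. */
struct player_module {
    const char *name;
    int  (*open)(const char *path);
    void (*play)(void);
};

/* One concrete module; a later upgrade replaces only this table
 * and these functions, never the interface code below. */
static int  ogg_open(const char *path) { printf("open %s\n", path); return 0; }
static void ogg_play(void)             { printf("playing\n"); }

static const struct player_module ogg_module = { "ogg", ogg_open, ogg_play };

/* Interface code: knows nothing about Ogg, only the table. */
static int ui_play_file(const struct player_module *m, const char *path)
{
    if (m->open(path) != 0)
        return -1;
    m->play();
    return 0;
}
```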
mystran wrote: PPS. About Winamp: it was a great product quite exactly as long as Justin (and Nullsoft in general) still had the final say. As soon as the AOL suits started messing with the development, Winamp started to suck. I believe this is because responsibility for the product was no longer with the person who designed it to be great.

If it was the switch to AOL that cost it its virtue, why is that? It's not strictly because of the switch to AOL; others could do the same (see XMMS for an example).
Re:How to avoid patching good programs into ****?
Candy wrote: Even though that's very true, it's also barely possible in the current situation. I've been working on one project where somebody (a graphical designer) was supposed to use the program and I was supposed to make it. He didn't want to understand the development environment enough to sit down and make the UI. In similar fashion, how are you going to find ANYBODY who will just design the UI?

I'd be willing to do that... as long as I was also given the authority to say "it's not working right".
Candy wrote: If it was the switch to AOL that cost it its virtue, why is that? It's not strictly because of the switch to AOL; others could do the same (see XMMS for an example).

OK, of course I can't make any hard claims, but there seems to be a connection: IIRC, things with Winamp started to go wrong at about the same time Justin started complaining about AOL in his blog. I'd be willing to withdraw that comment.
That said, I can't see how XMMS has been patched wrong. It's still exactly as usable as it was when I first installed it years ago: almost exactly like Winamp 2, although without the browser and some such, but with a slightly better "select folders" dialog (you can select multiple folders without closing the dialog after each). None of that Winamp 3 **** anyway.
Re:How to avoid patching good programs into ****?
mystran wrote: I'd be willing to do that... as long as I was also given the authority to say "it's not working right".

Well, of course. That's the point of a UI designer. In my IMS setup you'd also be responsible for deciding which class of modules to use for a given function. This way, people who don't even know your project can add a new "Ogg" class and expect your program to work with it -> forward compatibility.
mystran wrote: OK, of course I can't make any hard claims, but there seems to be a connection: IIRC, things with Winamp started to go wrong at about the same time Justin started complaining about AOL in his blog. I'd be willing to withdraw that comment. That said, I can't see how XMMS has been patched wrong. It's still exactly as usable as it was when I first installed it years ago: almost exactly like Winamp 2, although without the browser and some such, but with a slightly better "select folders" dialog (you can select multiple folders without closing the dialog after each). None of that Winamp 3 **** anyway.

Well, the point was that XMMS wasn't patched wrong, yet it is still being maintained by I don't know how many people. The only actual difference between Justin's development and AOL's development is that AOL decided to add a few more features, all of which annoy the hell out of me. Winamp was doing just fine, and you can see that from the changelogs between versions 2.5 and 2.94 or so: they barely changed anything, and what did change was the functional part, for compatibility with others.
AOL needs to make money (see the first post of this series). They add features. People buy it because it's feature-rich, then ignore it once they've seen the features. AOL made money. Winamp has more features, and people who don't want them are left out in the cold. Conclusion: Winamp was wrecked by AOL, but for AOL it was a good thing. So there's no way they're not gonna do that.
Compare HIEW, for a change. I don't know its enhancements from 5.83 on, but since 5.83 does what I want, and the versions after it cost money, I stick with 5.83. I can only assume just about everybody else does the same. If you don't add features but do want money, you aren't gonna get it from the feature-people (the many), and not from your actual users (the few) either, since they can use the previous version just the same.
Anyway, off to school now... maths, QA meeting and a little bit of PHP...
Re:How to avoid patching good programs into ****?
"Avoid adding features that do not contribute to the main goal, and if possible use some form of very simple plugin system. For examples, see Firefox (if I see a link to an extension, I click it, approve it, and I have it installed within the minute) and most alternative IM-clients."
Oy, yar, hear, hear!!! That's exactly the best idea. In fact, the best environment is like Mozilla and Winamp where the developer says, "here's the framework, here's the API, program your own plug-ins".
Less work and $$$ for the developer, better applications/plug-ins for the consumer, and the product develops a "community" instead of just "customers", and so everyone profits by the relationship.
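That host-plus-plugins shape can be sketched in a few lines of C; this is a hypothetical miniature, not the Mozilla or Winamp API. The host publishes one entry-point signature and a registration call, and dispatches to plugins by name:

```c
#include <string.h>

#define MAX_PLUGINS 16

struct plugin {
    const char *name;
    int (*run)(const char *input);   /* the one hook the host calls */
};

static struct plugin registry[MAX_PLUGINS];
static int plugin_count;

/* The whole public API a plugin author needs. */
int plugin_register(struct plugin p)
{
    if (plugin_count >= MAX_PLUGINS)
        return -1;
    registry[plugin_count++] = p;
    return 0;
}

/* The host dispatches by name; plugins never see each other. */
int plugin_invoke(const char *name, const char *input)
{
    for (int i = 0; i < plugin_count; i++)
        if (strcmp(registry[i].name, name) == 0)
            return registry[i].run(input);
    return -1;  /* no such plugin */
}

/* An example plugin a third party might ship. */
static int shout_run(const char *input) { return input[0] == '!'; }
```

Because the host only ever sees the `run` hook, plugin authors need none of the host's source, which is exactly what builds the "community" effect described above.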