Re: A quick unit testing question!
Posted: Wed Jun 18, 2014 2:23 am
Neat answers. Thanks.
As for unit testing being worthwhile, I've found it incredibly so in my own work in recent years, mostly when writing libraries. It does add a lot of time, as Boobies said, because I've taken the "test everything" approach.
I.e. Test the usual ways of utilizing some function or interface. Then feed it tons of bad data. Then ensure that you can trigger allocation failures at will, so you can test how your code reacts to that situation, etc.
I figure a potential middle ground would be to aggressively test components deemed critical, assigning each module or program some kind of importance rating. The higher the rating, the more aggressively it gets tested.
I.e. if something has a very high importance rating, it gets the write-the-tests-first, test-everything treatment.
As for, say, testing a bootloader: I can see how you could achieve that, but the abstraction making it possible could be annoying. I.e. hiding the BIOS invocation behind some procedure, and when testing, linking the loader against something else that provides a test stub for that procedure.
Sorry, rambling. Stream of consciousness, yadda yadda.
~K
Edit:
Beyond providing a more substantial, provable kind of trust in what you create (within reason, of course), I've found the real benefit of aggressive unit testing to be how it's influenced the way I engineer what I create. I'm sure there's a saying somewhere along the lines of: if it's really, really hard to test, it's probably designed badly.
I figure that's more about testing interfaces than the environment a given program runs in. Still, with techniques such as dependency injection, you have a pretty massive amount of flexibility with which to test!