A Coding Story

zeeblebot (silicon valley) | Joined 27 Oct 04 | Moves 101289 | 24 Jul 10

dee-dum-dum-dee-dee .... dum-dum-dee-dee-dum-dum-dee-DEE-dee-dee-dee-deeeeee .... dum-dum-dee-DEE-dee-dee-dee-DEE-dee-dee-dee-dee ... DUM-DUM-dee-DEE ....

http://thedailywtf.com/Articles/That-Wouldve-Been-an-Option-Too.aspx

Fast and Curious (slatington, pa, usa) | Joined 28 Dec 04 | Moves 53321 | 24 Jul 10 | 2 edits

Originally posted by zeeblebot
dee-dum-dum-dee-dee .... dum-dum-dee-dee-dum-dum-dee-DEE-dee-dee-dee-deeeeee .... dum-dum-dee-DEE-dee-dee-dee-DEE-dee-dee-dee-dee ... DUM-DUM-dee-DEE ....

http://thedailywtf.com/Articles/That-Wouldve-Been-an-Option-Too.aspx
You know, it's funny. At a company that shall remain unnamed, all our optics lab results went through one computer with a 140 GB HD. The problem was that every lab fed that one computer, and we were feeding it at least 40 gigs a day. So every few days the IT guys had to go back and pull off old data so we could get our optics lab up and running again.

We have these weekly company meetings with the CEO and such. One day I brought up the point that all our lab data were being funneled through this obsolete HD, and that if that HD took a dump, all that data would be trashed.

I made another point: just replace that piece-of-poop HD with a new TB drive, which I think cost about 150 bucks at the time. The CEO handed that off to one of his managers. So I guess you can figure out whether we got a new HD...

zeeblebot (silicon valley) | Joined 27 Oct 04 | Moves 101289 | 25 Jul 10

I'm guessing no! 🙂

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 25 Jul 10

It's not uncommon that throwing hardware at a problem is the best solution. And it is also not uncommon that management would rather throw labor at a problem that is better solved with hardware.

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 26 Jul 10 | 1 edit

Even with desktop PCs, people frequently spend hours and hours trying to optimize them and get them to go faster, when a bit more RAM would have a greater impact.
I'm just off to buy more RAM for my current PC.

zeeblebot (silicon valley) | Joined 27 Oct 04 | Moves 101289 | 31 Jul 10

And another way: jettison the bloatware! Replace Windows with Linux!

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 31 Jul 10

Originally posted by zeeblebot
and another way: jettison the bloatware! replace Windows with Linux!
That would be a good idea for a work PC, but my home PCs are used for games, and Linux just doesn't do DirectX.

zeeblebot (silicon valley) | Joined 27 Oct 04 | Moves 101289 | 05 Aug 10

Originally posted by twhitehead
That would be a good idea for a work PC, but my home PCs are used for games, and Linux just doesn't do DirectX.
http://en.wikipedia.org/wiki/Wine_%28software%29

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 05 Aug 10

Originally posted by zeeblebot
http://en.wikipedia.org/wiki/Wine_%28software%29
Yes, I looked that up after making the previous comment.
It implies that not all games will run smoothly (and I have a lot of games). Nevertheless I will consider it, but it might be hard explaining to my son that he can't play his favorite game because Wine doesn't yet support it.

It might actually be the best route to DirectX 10 as I am still on XP because I can't afford Windows 7.

joneschr (Some guy) | Joined 22 Jan 07 | Moves 12299 | 05 Aug 10

Hard to say for this application in particular, but there are often gains from de-bloating applications like this beyond just memory usage.

Performance, for example: it takes time to turn all those bits on and off. If you're not using twice the memory, you're not USING twice the memory.

Maintainability, for another. If they actually were able to reduce the memory usage by 50%, then odds are the code was just a mess, and refactoring it made it easier to maintain and will hopefully improve their insti-patch dilemma.

It drives me nuts that people feel so free to write bloatware these days. And then they wonder why it takes three minutes to boot their computer, or a minute to open MS Word.

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 05 Aug 10

Originally posted by joneschr
It drives me nuts that people feel so free to write bloatware these days. And then they wonder why it takes them 3 minutes to boot their computer, or a minute to open MS word.
A programmer must find the correct balance. Enhancing the performance of an application takes time (and money and effort), and may have both benefits and disadvantages. Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
It's just not economically viable in most cases to squeeze every bit of performance out of an application.
That even applies to MS Word. If Microsoft spent another billion dollars and two years optimizing Word, it would take half the memory, run twice as fast, cost you twice as much and not sell because it would be two years behind the competition.

In my line of work, my applications are constantly changing, may not be used a lot, or may be discontinued. It is only worth optimizing when speed is really an issue. And even then, tweaking the Java memory settings may yield greater benefits than days of optimization.
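
As a rough sketch of what "tweaking the Java memory settings" can look like in practice (assuming a standard HotSpot JVM; the class name and heap sizes below are just examples, not anything from the thread):

// Minimal sketch: report the heap the process was actually given.
// Launching with a bigger heap, e.g.
//   java -Xms512m -Xmx2g HeapReport
// is often cheaper than days of code-level optimization.
public class HeapReport {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / mb + " MB");
    }
}

Raising -Xmx like this can relieve garbage-collection pressure without touching a line of the application itself, which is the kind of cheap win being described.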

mtthw | Joined 07 Sep 05 | Moves 35068 | 05 Aug 10 | 1 edit

Originally posted by twhitehead
A programmer must find the correct balance. Enhancing the performance of an application takes time (and money and effort), and may have both benefits and disadvantages. Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
Absolutely. Hence the First Rule of Software Optimisation:

"Don't do it."

(The Second Rule is for use by experts, and says "Don't do it yet.")

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 05 Aug 10

I have always had a theory about large computing projects like SETI@home and other processor-intensive projects.
If the results are not urgent, simply wait a year and you will be able to get the work done in half the time.
Wait 20 years and, if performance keeps doubling yearly, what would have taken a million computers a month can be done on one computer in roughly a month (2^20 is about a million).
With other projects, such as Rosetta@home, they are doing science immediately based on the feedback, and the sooner we cure cancer the better, so it might pay off not to wait. But I am not sure. If they saved up their money for 10 years and then used the skills and computing power available then, they might achieve more overall, as bang for the buck potentially doubles every year.
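
A back-of-the-envelope sketch of that doubling assumption (the workload figure is the one from the post; the yearly doubling is the post's own assumption, not a measured trend):

// Sketch: if single-machine throughput doubles every year, how long does a
// workload of 1,000,000 machine-months take on one machine N years later?
public class WaitAndCompute {
    public static void main(String[] args) {
        double workloadMachineMonths = 1_000_000;
        for (int years : new int[] {1, 10, 20}) {
            double speedup = Math.pow(2, years);             // cumulative doubling
            double months = workloadMachineMonths / speedup;
            System.out.printf("after %2d years: %10.2f machine-months (~%.0f days)%n",
                    years, months, months * 30);
        }
    }
}

Twenty doublings is 2^20, roughly a million, which is why the million-machine month shrinks to about one machine-month rather than a single day.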

joneschr (Some guy) | Joined 22 Jan 07 | Moves 12299 | 05 Aug 10 | 2 edits

Originally posted by twhitehead
Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
I agree with most of what you say; there are certainly cases where additional complexity is needed to improve performance.

But I disagree that it's the common case. I personally find the more common case is that programs perform like dogs because they lack simplicity, not complexity.

Behold the lowly bubble sort, which can outperform fancier algorithms on small data sets.
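
For reference, a minimal bubble sort of the kind being invoked (a sketch; the array contents are just example data):

// Bubble sort: short, in-place, and hard to get wrong. On tiny arrays its
// simplicity and lack of overhead can beat more sophisticated sorts.
import java.util.Arrays;

public class BubbleSortDemo {
    static void bubbleSort(int[] a) {
        for (int pass = a.length - 1; pass > 0; pass--) {
            boolean swapped = false;
            for (int i = 0; i < pass; i++) {
                if (a[i] > a[i + 1]) {            // neighbours out of order: swap
                    int tmp = a[i];
                    a[i] = a[i + 1];
                    a[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;                   // no swaps on this pass: sorted
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}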

Or consider the case of the Linux kernel, which, as Linus himself acknowledges, suffers from performance problems due to bloat:
http://www.theregister.co.uk/2009/09/22/linus_torvalds_linux_bloated_huge/

Mtthw is exactly right. Don't do it yet - because when you bloat your program with attempts to do so, you end up having the opposite effect.

twhitehead (Cape Town) | Joined 14 Apr 05 | Moves 52945 | 06 Aug 10 | 1 edit

Originally posted by joneschr
But I disagree that it's the common case. I personally find the more common case is that programs perform like dogs because they lack simplicity, not complexity.
In my experience, optimizing normally consists of the following steps:
1. Implement caching where possible - extra complexity (see the sketch after this list).
2. Implement compression where possible - extra complexity.
3. Use various coding techniques that reduce the number of instructions executed, such as moving methods inline or hand-optimizing the source code - extra complexity.
4. Customize the code for the specific hardware - extra complexity.
5. Implement improved algorithms, such as better sorting methods - extra complexity.
6. Instead of using the code / HTML produced by the various tools, edit it directly to remove all the bloat the tools produce - extra complexity.
Remember that the extra complexity I am referring to may not always be in the actual code but in the maintainability of the code.
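
As a small sketch of step 1 (caching) and of the extra machinery it drags in - the cache size, the lookup method, and the class name below are all made up for illustration:

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of step 1: the uncached version is a single call to expensiveLookup().
// Adding a cache brings in a data structure, a size limit, an eviction policy,
// and the new question of when a cached answer goes stale.
public class CachedLookup {
    private static final int MAX_ENTRIES = 1024;   // arbitrary example limit

    // LRU cache: access-ordered LinkedHashMap that evicts its eldest entry.
    private final Map<String, String> cache =
            new LinkedHashMap<String, String>(16, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<String, String> eldest) {
                    return size() > MAX_ENTRIES;
                }
            };

    public String lookup(String key) {
        // Serve from the cache if present, otherwise do the slow work once.
        return cache.computeIfAbsent(key, this::expensiveLookup);
    }

    // Stand-in for the slow operation (database, disk, network) being cached.
    private String expensiveLookup(String key) {
        return key.toUpperCase();                  // hypothetical placeholder work
    }
}

None of this is hard, but every line of it is code the uncached version did not need, which is the "extra complexity" being described.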

Behold the lowly bubble sort, which can outperform fancier algorithms on small data sets.
And that's simpler than what? I have never implemented bubble sort in preference to another method. If I ever did, it would probably be adding complexity, because I would be coding it myself instead of using a built-in utility method.
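
For contrast, the built-in route is essentially a one-liner (sketch; the array is just example data):

import java.util.Arrays;

// The "built-in utility method" route: the JDK's Arrays.sort (a tuned
// dual-pivot quicksort for primitives) with nothing to hand-roll or maintain.
public class SortWithLibrary {
    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        Arrays.sort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}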

Or consider the case of the Linux kernel, which, as Linus himself acknowledges, suffers from performance problems due to bloat:
http://www.theregister.co.uk/2009/09/22/linus_torvalds_linux_bloated_huge/

That article doesn't indicate that simplifying the kernel would be the solution. It would be things like assembly code optimization that would have the most impact. There may be code duplication that could be dealt with, but that is likely to affect size more than performance.
