1. silicon valley
    Joined
    27 Oct '04
    Moves
    101289
    24 Jul '10 22:11
    dee-dum-dum-dee-dee .... dum-dum-dee-dee-dum-dum-dee-DEE-dee-dee-dee-deeeeee .... dum-dum-dee-DEE-dee-dee-dee-DEE-dee-dee-dee-dee ... DUM-DUM-dee-DEE ....

    http://thedailywtf.com/Articles/That-Wouldve-Been-an-Option-Too.aspx
  2. Subscriber sonhouse
    Fast and Curious
    slatington, pa, usa
    Joined
    28 Dec '04
    Moves
    53223
    24 Jul '10 23:31, 2 edits
    Originally posted by zeeblebot
    dee-dum-dum-dee-dee .... dum-dum-dee-dee-dum-dum-dee-DEE-dee-dee-dee-deeeeee .... dum-dum-dee-DEE-dee-dee-dee-DEE-dee-dee-dee-dee ... DUM-DUM-dee-DEE ....

    http://thedailywtf.com/Articles/That-Wouldve-Been-an-Option-Too.aspx
    You know, it's funny. At a company that shall remain unnamed, all our optics lab results went through one computer with a 140 GB HD. The problem was that every lab fed that one computer, and we were feeding it at least 40 gigs a day. So every few days the IT guys had to go back and pull old data off so we could get our optics lab up and running again.

    We have these weekly company meetings with the CEO and such. One day I pointed out that all our lab data were being funneled through this one obsolete HD, and that if that HD took a dump, all that data would be trashed.

    I made another point: just replace that piece-of-poop HD with a new TB drive, which I think cost about 150 bucks at the time. The CEO handed that off to one of his managers. So I guess you can figure out whether we got a new HD......
  3. silicon valley
    Joined
    27 Oct '04
    Moves
    101289
    25 Jul '10 04:36
    i'm guessing no! 🙂
  4. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    25 Jul '10 15:45
    It's not uncommon that throwing hardware at a problem is the best solution. And it's also not uncommon that management would rather throw labor at a problem that is better solved with hardware.
  5. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    26 Jul '10 08:26, 1 edit
    Even with desktop PCs, people frequently spend hours and hours trying to optimize them and get them to go faster, when a bit more RAM would have a greater impact.
    I'm just off to buy more RAM for my current PC.
  6. silicon valley
    Joined
    27 Oct '04
    Moves
    101289
    31 Jul '10 05:19
    and another way: jettison the bloatware! replace Windows with Linux!
  7. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    31 Jul '10 15:13
    Originally posted by zeeblebot
    and another way: jettison the bloatware! replace Windows with Linux!
    That would be a good idea for a work PC, but my home PCs are used for games, and Linux just doesn't do DirectX.
  8. silicon valley
    Joined
    27 Oct '04
    Moves
    101289
    05 Aug '10 03:14
    Originally posted by twhitehead
    That would be a good idea for a work PC, but my home PCs are used for games, and Linux just doesn't do DirectX.
    http://en.wikipedia.org/wiki/Wine_%28software%29
  9. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    05 Aug '10 07:07
    Originally posted by zeeblebot
    http://en.wikipedia.org/wiki/Wine_%28software%29
    Yes I looked that up after making the previous comment.
    It implies that not all games will run smoothly (and I have a lot of games). Nevertheless I will consider it, but it might be hard explaining to my son that he cant play his favorite game because wine doesn't yet support it.

    It might actually be the best route to DirectX 10 as I am still on XP because I can't afford Windows 7.
  10. Standard member joneschr
    Some guy
    Joined
    22 Jan '07
    Moves
    12299
    05 Aug '10 14:41
    Hard to say for this application in particular, but there are often gains from de-bloating applications like this beyond just memory usage.

    Performance, for example: it takes time to turn all those bits on and off. If you're not using twice the memory, you're not moving twice the memory around.

    Maintainability, for another. If they actually were able to reduce the memory usage by 50%, then odds are the code was just a mess, and they refactored it, making it easier to maintain, which hopefully will improve their insti-patch dilemma.

    It drives me nuts that people feel so free to write bloatware these days. And then they wonder why it takes them 3 minutes to boot their computer, or a minute to open MS Word.
  11. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    05 Aug '10 16:02
    Originally posted by joneschr
    It drives me nuts that people feel so free to write bloatware these days. And then they wonder why it takes them 3 minutes to boot their computer, or a minute to open MS Word.
    A programmer must find the correct balance. Enhancing the performance of an application takes time (and money and effort) and may have both benefits and disadvantages. Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
    It's just not economically viable in most cases to squeeze every bit of performance out of an application.
    That even applies to MS Word. If Microsoft spent another billion dollars and two years optimizing Word, it would take half the memory, run twice as fast, cost twice as much, and not sell because it would be two years behind the competition.

    In my line of work, my applications are constantly changing and may not be used much, or may be discontinued entirely. It is only worth optimizing when speed is really an issue. And even then, tweaking the Java memory settings may result in greater benefits than days of optimization.
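    For what it's worth, the kind of tweak I mean is just the standard heap flags. A tiny sketch (the class name is invented purely for illustration) that prints the figures those flags control:

    // Prints the JVM heap figures that launch flags such as -Xms (initial heap)
    // and -Xmx (maximum heap) control, e.g.  java -Xms512m -Xmx2g MemoryCheck
    public class MemoryCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024L * 1024L;
            System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
            System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
            System.out.println("Free heap:  " + rt.freeMemory() / mb + " MB");
        }
    }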
  12. Joined
    07 Sep '05
    Moves
    35068
    05 Aug '10 16:58, 1 edit
    Originally posted by twhitehead
    A programmer must find the correct balance. Enhancing the performance of an application takes time (and money and effort) and may have both benefits and disadvantages. Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
    Absolutely. Hence the First Rule of Software Optimisation:

    "Don't do it".

    (The Second Rule is for use by experts, and says "Don't do it yet".)
  13. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    05 Aug '10 17:23
    I have always had a theory about large computing projects like SETI@home and other such processor-intensive projects.
    If the results are not urgent, simply wait a year and you will be able to get it done in half the time.
    Wait 20 years, and what would have taken a million computers a month can be done on one computer in about a month.
    With other projects, such as Rosetta@home, they are doing science immediately based on the feedback, and the sooner we cure cancer the better, so it might pay off not to wait. But I am not sure: if they saved up their money for 10 years and then used the skills and computing power, they might achieve more overall, as bang for the buck potentially doubles every year.
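    A rough back-of-the-envelope check of that 20-year figure, assuming a clean doubling every year (which is of course the big assumption):

    // Rough check of the "just wait" arithmetic: a million computer-months of work,
    // re-run after 20 years of yearly doubling in compute per buck.
    public class WaitAndCompute {
        public static void main(String[] args) {
            double jobComputerDays = 1_000_000.0 * 30; // a million computers for a month
            double speedup = Math.pow(2, 20);          // ~1,048,576x after 20 years
            System.out.printf("One future computer: about %.0f days%n",
                    jobComputerDays / speedup);        // prints roughly 29 days
        }
    }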
  14. Standard member joneschr
    Some guy
    Joined
    22 Jan '07
    Moves
    12299
    05 Aug '10 21:54, 2 edits
    Originally posted by twhitehead
    Quite often the best performance requires making the code more complicated - not necessarily more maintainable as you suggest.
    I agree with most of what you say; there are certainly cases where additional complexity is needed to improve performance.

    But I disagree that it's the common case. I personally find the more common case is that programs perform like dogs because they lack simplicity, not complexity.

    Behold the lowly bubble sort, which outperforms fancier algorithms on small data sets.
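    Something like this - just an illustrative sketch, not lifted from any particular codebase:

    // The classic bubble sort: trivially simple, and on tiny arrays its low
    // overhead can beat fancier O(n log n) sorts despite being O(n^2) overall.
    public class BubbleSortDemo {
        static void bubbleSort(int[] a) {
            for (int pass = 0; pass < a.length - 1; pass++) {
                boolean swapped = false;
                for (int i = 0; i < a.length - 1 - pass; i++) {
                    if (a[i] > a[i + 1]) {   // neighbours out of order? swap them
                        int tmp = a[i];
                        a[i] = a[i + 1];
                        a[i + 1] = tmp;
                        swapped = true;
                    }
                }
                if (!swapped) break;         // nothing moved, so it's already sorted
            }
        }

        public static void main(String[] args) {
            int[] data = {5, 1, 4, 2, 8};
            bubbleSort(data);
            System.out.println(java.util.Arrays.toString(data)); // [1, 2, 4, 5, 8]
        }
    }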

    Or consider also the case of the Linux kernel, which Linus himself acknowledges suffers from performance problems due to bloat:
    http://www.theregister.co.uk/2009/09/22/linus_torvalds_linux_bloated_huge/

    Mtthw is exactly right. Don't do it yet - because when you bloat your program with premature attempts to optimize, you end up having the opposite effect.
  15. Cape Town
    Joined
    14 Apr '05
    Moves
    52945
    06 Aug '10 07:05, 1 edit
    Originally posted by joneschr
    But I disagree that it's the common case. I personally find the more common case is that programs perform like dogs because they lack simplicity, not complexity.
    In my experience, optimizing normally consists of the following steps:
    1. Implement caching where possible - extra complexity (see the sketch after this list).
    2. Implement compression where possible - extra complexity.
    3. Use various coding methods that reduce the number of instructions executed, such as moving methods inline or hand-optimizing the source code - extra complexity.
    4. Customize the code for the specific hardware - extra complexity.
    5. Implement improved algorithms, such as better sorting methods - extra complexity.
    6. Instead of using the code/HTML produced by the various tools, edit it directly to remove all the bloat the tools produce - extra complexity.
    Remember that the extra complexity I am referring to may not always be in the actual code but in the maintainability of the code.
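    To illustrate step 1, a minimal caching sketch (the class and the "expensive" call are made up). Even this toy version raises the questions a real cache forces on you - sizing, invalidation, thread safety - which is exactly the extra complexity I mean:

    import java.util.HashMap;
    import java.util.Map;

    // Minimal memoization: remember answers so the slow call runs once per key.
    public class CachedLookup {
        private final Map<String, String> cache = new HashMap<>();

        // Stand-in for something genuinely slow: a database query, a web call...
        private String expensiveLookup(String key) {
            return key.toUpperCase(); // pretend this took 200 ms
        }

        public String lookup(String key) {
            // Compute on the first request for a key, serve from the map afterwards.
            return cache.computeIfAbsent(key, this::expensiveLookup);
        }
    }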

    Behold the lowly bubble sort, which outperforms fancier algorithms on small data sets.
    And that's simpler than what? I have never implemented bubble sort in preference to another method. If I ever did, it would probably be adding complexity, because I would be coding it myself instead of using a built-in utility method.

    Or consider also the case of the Linux kernel, which Linus himself acknowledges suffers from performance problems due to bloat:
    http://www.theregister.co.uk/2009/09/22/linus_torvalds_linux_bloated_huge/

    That article doesn't indicate that simplifying the kernel would be the solution. It would be things like assembly code optimization that would have the most impact. There may be code duplication that could be dealt with, but that is likely to affect size more than performance.