
Them darn Linux pirates


daywane:


--- Quote from: Justin Z on December 12, 2008, 02:29:08 pm ---If by "using 486 PCs" you mean we might have software that is efficient enough to run fine on 486s so there would be no need for faster hardware, then I guess I see what you're saying, lol

--- End quote ---

Yeah, if only that efficient software didn't have such an issue with upkeep.  I still have a 486 kicking about with NT4 on it, alongside Slackware Linux.  Both are still suitable for today's needs, yet we have to muddy the water with this multimedia crap and poorly written software.  ::)

I still have an Atari 800XL going for a bunch of stuff  :dunno
I like what it can do for me, and I keep it around for fun.

ChadTower:

--- Quote from: boykster on December 21, 2008, 01:04:57 am ---Linux on the desktop will likely never supplant windoze as the primary OS, but as a server OS, it's a powerhouse.  And as for processing power, there are many more driving factors behind the need for increased processing power than M$'s bloated OS code alone.  I run systems at work that are 99.9% CPU-bound 24/7, and these are dual-processor, quad-core, 64-bit Athlon CPUs... on a 486 platform I'd need 100 times the number of servers to run the same operations, and there's no way they could support the 32 gigs of RAM these servers have.

--- End quote ---


Same here.  And these are servers with 8 CPUs each.  It takes a hell of a lot of horsepower to process the ongoing transactions for 6300+ stores, never mind warehousing all of that data and then processing it out to be analyzed 65 ways from Sunday for targeted marketing and customer rewards programs.  And that's not factoring in the financial tracking requirements of Sarbanes-Oxley or the patient confidentiality processing for HIPAA.  It's startling, when you walk around some of these server rooms, how much computing is going on.  Sarah Connor needs to stop looking for chess computers and start looking at retail backplanes   :)

Justin Z:

--- Quote from: lkench on December 18, 2008, 03:05:50 pm ---It's all Prodigy's fault (heck, we could probably go all the way back to CompuServe if we wanted to) - letting all the idiots onto our cool internet, and then some joker came up with this crazy World Wide Web - with all its fancy color movie pictures with synchronized sound.  Edlin.com was so much better than notepad.exe - if you haven't gotten it yet, the functionality we demand (and expect) today requires more processing power.
--- End quote ---
For the record, my rant was about crap code that doesn't get optimized because programmers are lazy and figure "oh what the hell, they're probably running a quad core and have a cable modem anyway, who gives a crap."  Why in the world does some program like Winamp have to be 11 megs compressed?  It is beyond me.

I had a program in DOS for playing MOD files called DMP -- Dual Module Player.  I think it was about 300k zipped.  It played something like eight different kinds of tracker files, and 300k took only about twenty minutes to download on a 2400 baud modem, so everyone was glad the author wrote tight code.
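A quick back-of-the-envelope sketch of that modem arithmetic, assuming standard 8-N-1 serial framing (10 bits on the wire per byte of payload):

```python
# Back-of-the-envelope modem transfer time.
# Assumes 8-N-1 framing: 8 data bits + 1 start bit + 1 stop bit = 10 bits
# per byte, so a 2400 baud line moves roughly 240 bytes per second.

def transfer_seconds(size_bytes: int, baud: int, bits_per_byte: int = 10) -> float:
    """Time to move size_bytes over a line running at `baud` bits/sec."""
    bytes_per_second = baud / bits_per_byte
    return size_bytes / bytes_per_second

size = 300 * 1024  # a ~300k zip file
t = transfer_seconds(size, 2400)
print(f"{t:.0f} seconds (~{t / 60:.0f} minutes)")  # 1280 seconds (~21 minutes)
```

Real-world throughput varied with line noise and compression, but the rough scale holds: a 300k file at 2400 baud was a twenty-minute affair, which is exactly why small downloads were prized.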

ETA: and yes, you guys' examples are perfectly valid, but that's raw number crunching and not what I was talking about at all.  I run SETI@Home on my desktop, and I download the most optimized versions of the software for exactly that reason.  I just think many modern programmers are lazy and don't look for the best ways to do things anymore, because they don't have particularly stringent limitations to deal with.

The physics of Asteroids or the fractal planetary data used in Starflight are two excellent examples of the opposite mindset . . .
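Starflight's planet surfaces are a classic example of fractal generation stretching tiny amounts of data a long way.  As a rough illustration of that general technique (my own sketch of 1-D midpoint displacement, not Starflight's actual algorithm, which isn't shown here):

```python
import random

def midpoint_displacement(iterations: int, roughness: float = 0.5,
                          seed: int = 42) -> list[float]:
    """Generate a fractal height profile by repeatedly splitting each
    segment at its midpoint and nudging it by a shrinking random offset."""
    rng = random.Random(seed)      # seeded, so the same seed regrows the same terrain
    heights = [0.0, 0.0]           # start with a flat line between two endpoints
    spread = 1.0
    for _ in range(iterations):
        new = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-spread, spread)
            new.extend([a, mid])
        new.append(heights[-1])
        heights = new
        spread *= roughness        # each pass adds finer, smaller detail
    return heights

profile = midpoint_displacement(8)
print(len(profile))                # 2**8 + 1 = 257 samples
```

The appeal for a 1980s game is that a whole planet's terrain collapses to a seed and a handful of parameters instead of stored map data -- the same "do more with less" mindset as tight assembly code.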

lkench:
You ought to read Hackers by Steven Levy if you can find it...it's got some pretty good stuff about the "good old days" of programming.  The code bloat, in my opinion, is directly due to these fancy "high-level" languages like C, Java, etc. and all their fancy libraries.  If we could go back and talk to a device like the display directly in machine code, and not have to go through a bunch of layers of abstraction (including all the abstraction built into the processors to make them backwards compatible and still include modes that make them look like 8086s), we'd have a lot tighter code.  There are trade-offs everywhere...if you want something usable, yet still affordable in terms of cost to develop, we're stuck with "big" code.  If you want "tight" code, it's not going to be anywhere near as functional, it'll take a lot longer to develop, and it'll require much more skilled developers, all of which translate to higher cost somewhere...

-lkench

RayB:
Cost of progress. Can you imagine coding an entire modern AAA video game in pure assembler? It would take 10 years.

(OHHH! That's why Duke Nukem Forever isn't out yet!)
