
Shybug

When people say "old" hardware is better than "new" hardware.

Don't get me wrong, old hardware is better than new hardware for the following reasons.

1. It lasts and was built to last; modern hardware is designed to die within a few years to seven.
2. It's good for retro and older software, for nostalgia, or for learning about computers; older hardware taught you more about the machine because you worked in assembler or C (ANSI C?).

But when I hear someone say that modern day hardware is "slower" than older hardware....

Then you're batshit wrong.

The only reason old machines seem "faster" is that the software was written with the hardware in mind, so it had to be fast and use as few resources as possible.

Modern hardware is so fast that people don't even bother writing software to be fast; they just glue together code without understanding any of the internals. Most programmers don't know anything about hardware.

I'm one of the lucky 18-year-olds who grew up with modern hardware and C. I could take hardware for granted, but I write software as fast as it should be. Sadly, other programmers think: what a waste of time.

Languages like Java/C#, PHP, etc. are too high-level and don't allow anyone to understand the hardware. :O

End of rant

Comments

  1. Note
    This reminds me a lot of the console vs PC/Mac/Linux argument in terms of dated hardware and performance. A PC/Mac/Linux OS is designed to work with thousands of different types of peripherals, resulting in more overhead, whereas consoles only have a few configurations, resulting in less overhead and letting the software communicate more efficiently with the hardware. So, yes, software makes a huge difference to how hardware works and performs. You can even make the same comparative argument with smartphones, e.g. Android devices and iPhones.
  2. arcituthis
    This would make a good topic for discussion.

    As far as 1 goes, the hardware wasn't built to last differently than today; the process just changed. When designing the old hardware, to get a 5-year lifespan out of something, you had to design it to last 10 years, because tolerances and the design of individual components made for wide swings in reliability. As tolerances grew tighter and simulation models became smarter, the design could become cheaper and closer to meeting the 5-year lifespan requirement. This directly affected the price of the components, and we see the cost of all computer equipment falling as time goes on (offset by the rise in speed and complexity). If you also look at the processor and the number of transistors over time, along with the speeds they run at, the chance of an individual transistor breaking increases. Basically, as speed increases, robustness decreases. It's worth noting that designers (somewhat artificially) increase the lifespan of electronics by widening the tolerance on inputs and tightening the tolerance on outputs, so that even as parts begin to go bad, the device still works as intended a little longer.

    For number 2, I completely agree. Working on microcontrollers, I've had the pleasure of programming a graphics LCD on a system with 128 bytes of scratchpad RAM. Once I upgraded to a microcontroller with 96 KB of RAM and 4 GB connected serially, I no longer worried about how each individual part affected the total memory, and because it was exponentially faster, I didn't have to worry about whether a "pause" would be noticeable to the user.

    I also agree with Note. When you make a program that will run on multiple platforms, you lose the luxury of fine-tuning it to run as fast as it can on an individual piece of hardware, and that slows things down.
  3. Shybug
    That, and nothing is written in assembler anymore, so software isn't as fast as it can be; and it tends to be badly written.

    Bad code + higher-level language = slower computer + more instructions.
    Good code + higher-level language = faster computer + fewer instructions.

    Good code + assembler = faster + fewer instructions.
    Bad code + assembler = slower + fewer instructions.

    Though it all depends on the compiler.
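    To make the "bad code vs good code" table above concrete, here's a small C sketch (my own illustration, not from the thread): calling strlen() inside the loop condition re-scans the string every iteration, turning a linear pass quadratic, while hoisting it out keeps the pass linear. Same language, same result, very different amount of work executed.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* Bad code: strlen() is re-evaluated on every iteration,
     * so scanning the string becomes O(n^2). */
    static int count_spaces_bad(const char *s) {
        int count = 0;
        for (size_t i = 0; i < strlen(s); i++)  /* strlen walks the whole string each time */
            if (s[i] == ' ')
                count++;
        return count;
    }

    /* Good code: the length is computed once, so the loop is O(n). */
    static int count_spaces_good(const char *s) {
        int count = 0;
        size_t len = strlen(s);                 /* hoisted out of the loop */
        for (size_t i = 0; i < len; i++)
            if (s[i] == ' ')
                count++;
        return count;
    }

    int main(void) {
        const char *text = "old hardware vs new hardware";
        printf("%d %d\n", count_spaces_bad(text), count_spaces_good(text));
        return 0;
    }
    ```

    (In fairness to the "it all depends on the compiler" point: a good optimizing compiler can sometimes hoist the strlen() call itself, which is exactly why the compiler matters.)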
  4. Maxx


    Quote Originally Posted by arcituthis
    If you also look at the processor and the "number of transistors" over time along with the speeds they are running at, the chance of an individual breaking increases.
    It's not just more devices = more opportunities for failure; it's also more devices in the same amount of space generating heat, with less material and space to dissipate it.

    Also, smaller device sizes and thinner conductors on circuit boards and within devices make them more susceptible to breakdown. Not much different than motor vehicles. A little corrosion on a modern subcompact is a lot more significant functionally and structurally than it was on a land yacht from 40 or 50 years ago.

    Except for those of us "bitter clingers" who don't let go of something until it crumbles into dust, advances in software usually obsolete the hardware before failure becomes a big problem.
  5. arcituthis


    Quote Originally Posted by BluePanda
    That, and nothing is written in assembler anymore, so software isn't as fast as it can be; and it tends to be badly written.

    Bad code + higher-level language = slower computer + more instructions.
    Good code + higher-level language = faster computer + fewer instructions.

    Good code + assembler = faster + fewer instructions.
    Bad code + assembler = slower + fewer instructions.

    Though it all depends on the compiler.
    Assembler Assembly (edit: committed act against my own pet peeve) isn't gone completely; it's still used by some game programmers to handle extremely time-sensitive modules. In the project I was referencing, I used it to actually slow the processor to match the maximum speed for SPI bus emulation. You just can't do that with the variable-speed processors they have now. The way things are done today (especially in gaming), we intentionally add code that makes the software run at the same speed no matter how fast the processor is, in order to make a smooth user experience. (http://docs.unity3d.com/ScriptRefere...deltaTime.html)
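    The deltaTime idea linked above can be sketched outside Unity too. A minimal C illustration (mine, not Unity's code): each frame, movement is scaled by the previous frame's duration, so an object covers the same distance per wall-clock second whether the machine renders 25 or 100 frames in that second.

    ```c
    #include <stdio.h>

    /* Frame-rate independent movement: position advances by
     * speed * dt each frame, where dt is the frame duration in
     * seconds. Fast and slow machines cover the same distance
     * over the same wall-clock time. */
    static double simulate(double speed, double dt, int frames) {
        double position = 0.0;
        for (int i = 0; i < frames; i++)
            position += speed * dt;   /* units/second * seconds */
        return position;
    }

    int main(void) {
        /* One second of simulation at two different frame rates. */
        double fast = simulate(10.0, 1.0 / 100.0, 100); /* 100 fps */
        double slow = simulate(10.0, 1.0 / 25.0, 25);   /* 25 fps  */
        printf("fast: %.2f units, slow: %.2f units\n", fast, slow);
        return 0;
    }
    ```

    Both runs end up at (essentially) 10 units; without the dt scaling, the 100 fps machine would move four times as far.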
    Updated 02-Oct-2015 at 21:44 by arcituthis
  6. Shybug


    Quote Originally Posted by arcituthis
    Assembler Assembly (edit: committed act against my own pet peeve) isn't gone completely; it's still used by some game programmers to handle extremely time-sensitive modules. In the project I was referencing, I used it to actually slow the processor to match the maximum speed for SPI bus emulation. You just can't do that with the variable-speed processors they have now. The way things are done today (especially in gaming), we intentionally add code that makes the software run at the same speed no matter how fast the processor is, in order to make a smooth user experience. (http://docs.unity3d.com/ScriptRefere...deltaTime.html)
    I'm fully aware of that; what I mean is, it's rare for it to be used anywhere when languages like C++/C# can do everything the programmer needs (well, the higher-level stuff).

    For example, Chrome uses assembly for speed-critical things (from memory).

    You would rarely see assembly being used for applications like uTorrent, web browsers, etc.

    So it tends not to be as fast as it could be. Then there are compilers.

    From memory, GCC is slower than the Intel C++ compiler, and the same goes for MSVC++ (I could be totally off or wrong).

    Compilers tend to generate more assembly code than hand-written pure assembly would, so the generated code takes longer to execute.

    Programs aren't really designed with speed in mind because of how powerful computers are; most people don't bother writing speed-efficient code, no matter the language. People use things like Boost or the STL instead of writing their own code tied to the hardware. That being said, most software is slow because there are multiple platforms and multiple kinds of hardware involved; the code is slower, but portability makes up for the speed difference.
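    The generic-library vs hand-rolled trade-off can be sketched in a few lines of C (my own example, not from the thread): a generic reduce that takes a function pointer works for any operation, but pays an indirect call per element when the compiler can't inline it, while a specialized sum has the addition baked straight into the loop.

    ```c
    #include <stdio.h>

    /* Generic, portable version: works for any combining
     * function, at the cost of an indirect call per element --
     * the kind of overhead generic library code can carry when
     * the compiler can't see through the function pointer. */
    static long reduce(const long *xs, int n, long init,
                       long (*op)(long, long)) {
        long acc = init;
        for (int i = 0; i < n; i++)
            acc = op(acc, xs[i]);
        return acc;
    }

    static long add(long a, long b) { return a + b; }

    /* Specialized version: the addition is inlined directly
     * into the loop, so there is no call overhead at all. */
    static long sum(const long *xs, int n) {
        long acc = 0;
        for (int i = 0; i < n; i++)
            acc += xs[i];
        return acc;
    }

    int main(void) {
        long xs[] = {1, 2, 3, 4, 5};
        printf("%ld %ld\n", reduce(xs, 5, 0, add), sum(xs, 5));
        return 0;
    }
    ```

    Both return the same answer; the point is only that the generic version trades some speed for reuse, which is the bargain Boost/STL-style code makes.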

    I'm not saying all software should be written in assembly, but if it were, we would see a huge speed difference, comparable to the 80s.

    Anyways
  7. arcituthis


    Quote Originally Posted by BluePanda
    Not saying that all software should be written in assembly, but if it were, we would see a huge speed difference comparable to the 80s.
    I'm now imagining what an entire game or any complicated software would be like to write in assembly. I have enough trouble understanding the basics of it.
ADISC.org - the Adult Baby / Diaper Lover / Incontinence Support Community.