PC Architecture


LittleJess

Obviously, as a geek, I have a deep interest in the lower-level aspects of computing; I'm fascinated by computer hardware and low-level programming (i.e. C and ASM).

I'm just curious: why don't modern computers have RAM built into the motherboard? Is there a specific reason for this? You get modern phones and consoles with RAM soldered onto the motherboard, but not PCs. I'm surprised.

There was a time when GPUs were built into the motherboard; now they're built into the processor, which kind of confused the fuck out of me when I built an Intel PC a couple of years ago with a Celeron (Intel HD Graphics). It would be cool to see computers with inbuilt storage, or computers that took SD cards as boot devices (obviously that will never happen), but you do see this with things like Raspberry Pis.

I'm also surprised that most PC architecture hasn't changed in 30+ years; a lot of the old stuff is still valid today, and the only things that have really changed are that software and hardware have become more advanced.
 
Love my Raspberry Pis!

I've wondered whether PC hardware will start to diversify into architectures that are simpler and more customiseable.
 
The reason that RAM is not built into the motherboard is that different people want different amounts of RAM. Features that come with different options are not built onto the motherboard; they are configured later.
You're not old enough to know that computer architecture has changed a lot in the last 30+ years: from 8-bit processors and a 640K memory ceiling, through separate floating-point processors, to 32-bit and 64-bit processors with built-in cache running at GHz speeds, in addition to changes to the bus architectures (ISA, PCI, etc.).
I was working in the field before PCs; I date back to mainframes.
 
ORBaby said:
The reason that RAM is not built into the motherboard is that different people want different amounts of RAM. Features that come with different options are not built onto the motherboard; they are configured later.
You're not old enough to know that computer architecture has changed a lot in the last 30+ years: from 8-bit processors and a 640K memory ceiling, through separate floating-point processors, to 32-bit and 64-bit processors with built-in cache running at GHz speeds, in addition to changes to the bus architectures (ISA, PCI, etc.).
I was working in the field before PCs; I date back to mainframes.

I mean that fundamentally not much has changed, even the programming languages. I don't mean the architecture didn't change, just that the core of it hasn't: you've still got your BIOS, you can theoretically write programs today and compile them on older hardware, the tools haven't changed much, and a lot of the old concepts are still visible in computers at some level today.

You've still got your processors and your RAM, and you've still got registers. I know I missed out on a shit ton, but I've seen and been around older hardware; I even had a computer from the '90s that I used to play the games I grew up with.

For example, OS X, Linux, and BSD still ship programs dating back to the 2000s or even the '90s, such as bc (the basic calculator), which actually goes back to 1970s Unix.

A better example: my first programming language was C89, and most C/C++ compilers still support it alongside C99. :) That standard dates back some time; well, either '89 or '95, since the books I had at the time were quite dated.
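
As a small illustration of that backwards compatibility, here's a minimal sketch (my own example, not from any particular book): it sticks to C89 conventions, so it should build unchanged with a strict old-standard compile such as gcc -std=c89 -pedantic -Wall, and just as well under C99 and later. The comment notes the C99-only features a strict C89 compile would reject.

```c
/* c89_demo.c - valid C89 that still builds on modern compilers.
 * Try: gcc -std=c89 -pedantic -Wall c89_demo.c */
#include <stdio.h>

int main(void)
{
    int i;          /* C89: declarations must come before statements */
    int total = 0;

    for (i = 1; i <= 10; i++) {
        total += i;
    }

    /* C99-only features that -std=c89 -pedantic-errors would reject:
     *   // line comments
     *   for (int i = ...)  -- declarations inside the for header
     *   declarations mixed in after statements
     */
    printf("sum 1..10 = %d\n", total);
    return 0;
}
```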
 
At a high level, computers have all been the same: input device, output device, memory, storage, and processor.
I remember that before computers had a BIOS, they had a bootstrap routine that would load your program and then run it. Some would read it in from punched cards or paper tape. I even remember having to manually load instructions into memory from a front panel and then execute them to load in a program. Okay, I'm feeling prehistoric now.
 
ORBaby said:
At a high level, computers have all been the same: input device, output device, memory, storage, and processor.
I remember that before computers had a BIOS, they had a bootstrap routine that would load your program and then run it. Some would read it in from punched cards or paper tape. I even remember having to manually load instructions into memory from a front panel and then execute them to load in a program. Okay, I'm feeling prehistoric now.

Yeah, that's what I was talking about. :p Is it bad that I know enough C to actually write my own kernel? I don't know enough ASM to write a bootstrapper, so I just used GRUB.

http://wiki.osdev.org/Bare_Bones
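
In case anyone's curious what the C side of that looks like, here's a minimal sketch in the spirit of the Bare Bones tutorial linked above, assuming GRUB loads the kernel via Multiboot, and omitting the assembly boot stub and linker script the tutorial also requires. The entry point just writes a string into the VGA text buffer.

```c
/* kernel.c - a minimal freestanding kernel entry point, roughly following
 * the OSDev Bare Bones tutorial. Assumes GRUB/Multiboot plus the usual
 * assembly stub that sets up a stack and calls kernel_main.
 * Build sketch: i686-elf-gcc -ffreestanding -c kernel.c */
#include <stdint.h>
#include <stddef.h>

/* VGA text-mode buffer at 0xB8000: each 16-bit cell holds an ASCII byte
 * plus a colour-attribute byte. */
static volatile uint16_t *const VGA = (uint16_t *)0xB8000;

void kernel_main(void)
{
    const char *msg = "Hello from my toy kernel!";
    const uint8_t colour = 0x0F; /* white on black */
    size_t i;

    for (i = 0; msg[i] != '\0'; i++)
        VGA[i] = (uint16_t)((uint8_t)msg[i]) | ((uint16_t)colour << 8);

    for (;;) { /* nothing to return to, so spin forever */ }
}
```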
 
Shybug said:
I'm just curious: why don't modern computers have RAM built into the motherboard? Is there a specific reason for this? You get modern phones and consoles with RAM soldered onto the motherboard, but not PCs. I'm surprised.

You do get modern computers with RAM soldered onto the motherboard. But it's extremely rare because... well... RAM is a separate component.

You might as well ask why you can buy PC components separately instead of just fully-sealed computers with no upgrade options.
 
PC stands for personal computer. Computers in general are made for various things, and they are meant to be customized to whatever the consumer needs them for. Different needs require different amounts of RAM. Socketed RAM also makes computers upgradable, instead of forcing you to buy a whole new computer.

Why would you want it welded in?
 
And plenty of computers boot from SD card, USB stick, network, or SAN device. I have a couple of servers that use RAID 1 SD cards as boot media.

 
Today, it's by and large a relic of the days when memory was expensive and was socketed to provide an upgrade path.
 
Shybug said:
I'm just curious: why don't modern computers have RAM built into the motherboard? Is there a specific reason for this? You get modern phones and consoles with RAM soldered onto the motherboard, but not PCs. I'm surprised.

Soldered-on RAM was common in the 1980s and early '90s. Oftentimes a computer would come with RAM soldered to the motherboard, and a few slots or sockets for upgrading it. I had one like that. It was just cheaper to build them that way. The RAM chips got machine-soldered to the board along with all of the other components. No Dell employees had to manhandle your SIMMs/DIMMs. Memory was expensive, and its density and performance weren't improving as rapidly as they have in recent years, so compatibility between the soldered-on RAM and the RAM you might later add wasn't such a concern. In fact, if you did add RAM later, it was often proprietary, or else an upgrade targeting your specific computer model.

And actually, there are many PCs being sold today that have soldered-on RAM: Tablets! Yes, even the ones that run Windows. There's no expectation of upgradeability, and space inside tablets is limited. DIMM slots and DIMMs, even the laptop sort, take up too much room.

But for desktop computers, times have changed a bit. The constantly improving speed and density of memory dictates the need to (potentially) replace whatever memory comes with the PC. Chipsets often don't like you to mix memories of different sizes and speeds, or if they allow it, it comes with a performance penalty. And even on desktop PC motherboards, things can get a bit cramped. Memory is (relatively) cheap and standardized now. The obvious answer to all of these things is to just put it all in slots.

Shybug said:
or computers that took SD cards as boot devices (obviously that will never happen), but you do see this with things like Raspberry Pis.

Linux, Windows, and several other OSes can boot off of SD cards. Heck, you can even get "hard drive emulators" for vintage PCs, Macs, and Amigas (and probably others) that mate old-school floppy and HD interfaces to SD cards so that you're not stuck running on old, failure-prone disks from the 80's and 90's. They're totally rad! (I have one in my Amiga.)

Shybug said:
I'm also surprised that most PC architecture hasn't changed in 30+ years; a lot of the old stuff is still valid today, and the only things that have really changed are that software and hardware have become more advanced.

Well, that's the nature of the beast. Unlike Apple, Microsoft makes its money selling software--Windows and Office. And PC makers make their money by selling hardware that runs Windows and Office. Neither one can really go anywhere without the other, and so they've stayed largely in the same place. And that's not so bad, really. It's kept costs low, and as we've seen, there was plenty of room for improvement within the framework of the old architecture. No need to go switching CPU types and redesigning your hardware from scratch multiple times--as Apple did with the Mac, only to eventually land on the PC architecture. (I'm typing this on a Mac, so I'm allowed to make fun!)
 
Cottontail said:
(snip)

No need to go switching CPU types and redesigning your hardware from scratch multiple times--as Apple did with the Mac, only to eventually land on the PC architecture. (I'm typing this on a Mac, so I'm allowed to make fun!)

Don't forget that the switch over to Intel processors gave birth to Hackintoshes. :p

There was this virus my dad's computer got when I was younger, I think it was CIH or something, that killed his BIOS. Lucky that doesn't happen anymore. That's fascinating, though; I never knew PCs had built-in RAM at one point. I've only seen small changes since I grew up in the Windows 98 era. I literally can't recall when I actually got into computers, it might have been earlier than 11. I remember my dad buying Windows XP for the first time ever, after pirating Windows the majority of the time, lol.

From the age of 8 to 16 I used Windows XP, then I moved to Windows 7 when that came out. I completely missed out on Windows Vista, but it was garbage from what I heard.
 
Shybug said:
Don't forget that the switch over to Intel processors gave birth to Hackintoshes. :p
It would probably be a straightforward thing to break Hackintosh permanently with some additional cryptographic hardware, but then one has to wonder how many of the prevented Hackintoshes would translate into people buying genuine Apple hardware. The price gap is rather enormous. I'm sure the bean-counters at Apple did the math long ago, and decided that the anti-Hackintosh measures currently in place were the best balance of cost/benefit to Apple's bottom line. At some point, though, if Apple decides that maintaining compatibility with Windows and other OSes isn't easing the decision to purchase Macs, they might swap around the memory addresses of some basic components and then it would be game over for Hackintosh, Windows on the Mac, etc. We shall see! The expandable Mac Pro models are no more, which some might interpret as the start of a slow transition back to a proprietary hardware model.

Shybug said:
From the age of 8 to 16 I used Windows XP, then I moved to Windows 7 when that came out. I completely missed out on Windows Vista, but it was garbage from what I heard.

Vista and 8 were plainly rather transitory, being neither here nor there in terms of Microsoft's visions. They were definitely the ones to miss. I'm still running 7 on my 2010 Mac Pro, but I'm sure I'll put 10 on it at some point. Unfortunately, the Windows upgrade code doesn't like my disk configuration, so upgrading will have to be done at a time when I feel like shuffling some disks around and repartitioning them.
 
Cottontail said:
It would probably be a straightforward thing to break Hackintosh permanently with some additional cryptographic hardware, but then one has to wonder how many of the prevented Hackintoshes would translate into people buying genuine Apple hardware.

(snip)

They tried doing that with the SMC, but a group of clever developers wrote an SMC emulator called FakeSMC. In practice, the SMC is what lets the OS decrypt "Dont Steal Mac OS X.kext" and some core components, but obviously that never stopped them (you can never really stop hackers). I've got a feeling Apple doesn't care, and even benefits from Hackintosh advancements for its own Macs, e.g. graphics card support, extra driver support, and so on. You've seen Macs with AMD GPUs in them, etc.

It seems like they don't really care too much; they only seem to go after people selling it. They could technically take down the major Hackintosh sites, but they just don't seem that bothered about it.

Let's just say I've had discussions with a specific member who works on the bootloader, a very bright guy; the entire Clover EFI project completely blows my mind. I obviously wasn't going to read 2,000 pages of UEFI specifications. xD

I think it's mostly because not many people do it; you have to be tech-savvy to pull it off. I've had perfect Hackintoshes in the past, yet I see people who are too stupid to patch a kext themselves. I technically contributed some things, e.g. patched kexts and so on.

They do some fuckery with the AMD cards, though: they only code specific display connectors into the kexts for Mac-supported GPUs, so it's a real mind fuck, you actually have to patch the connector tables in hex, I'm not shitting you. It's so much hassle that you're better off using NVIDIA; those work out of the box with the drivers.
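
For a sense of what that kind of patch boils down to, here's a generic sketch of find-and-replace byte patching in C. The FIND/REPL patterns are purely hypothetical placeholders, not real connector data; an actual patch depends entirely on the specific kext and GPU.

```c
/* patchbytes.c - generic in-place byte patching of a binary file.
 * FIND/REPL below are hypothetical placeholders, not real connector data. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

static const unsigned char FIND[] = { 0x12, 0x04, 0x03, 0x00 }; /* hypothetical */
static const unsigned char REPL[] = { 0x22, 0x05, 0x02, 0x00 }; /* hypothetical */

int main(int argc, char **argv)
{
    FILE *f;
    long size, i, hits = 0;
    unsigned char *buf;

    if (argc != 2) {
        fprintf(stderr, "usage: %s <file-to-patch>\n", argv[0]);
        return 1;
    }
    f = fopen(argv[1], "r+b");
    if (!f) { perror("fopen"); return 1; }

    fseek(f, 0, SEEK_END);
    size = ftell(f);
    rewind(f);

    buf = malloc((size_t)size);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) {
        fprintf(stderr, "read failed\n");
        return 1;
    }

    /* Replace every occurrence of FIND with REPL (same length, so
     * offsets elsewhere in the file are unaffected). */
    for (i = 0; i + (long)sizeof FIND <= size; i++) {
        if (memcmp(buf + i, FIND, sizeof FIND) == 0) {
            memcpy(buf + i, REPL, sizeof REPL);
            hits++;
        }
    }

    rewind(f);
    fwrite(buf, 1, (size_t)size, f);
    fclose(f);
    free(buf);
    printf("patched %ld occurrence(s)\n", hits);
    return 0;
}
```

(A real kext patch also means rebuilding the kernel/kext cache afterwards; this is just the byte-twiddling core of the idea.)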

Though honestly, I like Ubuntu better than OS X; less hassle.
 
Shybug said:
It seems like they don't really care too much; they only seem to go after people selling it. They could technically take down the major Hackintosh sites, but they just don't seem that bothered about it.

(snip)

I think it's mostly because not many people do it; you have to be tech-savvy to pull it off. I've had perfect Hackintoshes in the past, yet I see people who are too stupid to patch a kext themselves.

Agreed. And because of the technical nature of it, and because you kinda have to keep up with it in order to keep your Hackintosh working, there's kind of no way to be a Hackintosh user and not know that you're doing something unsupported. In other words, no Hackintosh users are going to be hassling Apple tech support, trying to figure out why their home-built PC won't boot Sierra anymore. They know they're on their own.

Shybug said:
Though honestly, I like Ubuntu better than OS X; less hassle.

Yeah. In the end, it's all about apps and other needs. In college, Linux saved me from probably hundreds of hours in a sweltering HP/UX terminal lab. Without it, I couldn't have done my homework at home! (Thank you, Red Hat Linux 3.03, circa 1996!) These days, most of the apps I run would require some sort of VM or emulator to run on Linux, if they could be made to run at all, and the geek-cred I might earn by doing things that way just isn't worth it to me. In most cases, I'd rather pay to do things in a supported fashion, because getting my work done is more important.

...but we'll see where things are in a few years. I'm also a miser when it comes to upgrading my computer hardware. My Mac is a 2010 model that I'm determined not to replace until 2020, and my 20-inch LCD monitors are from 2003. The backlights take a few minutes to reach full brightness these days, but they're hangin' in there! I'm not a big fan of the current Mac Pro models with their utter lack of internal expandability, so a new Mac may not be in my future.
 