Once upon a time, colleges didn't exist. You didn't need a college degree, or worse yet a university degree, just to get a decent or even great job.
So, what created all of this emphasis for *everyone* to go to college? The idea that only people who go to college get good jobs? Because now people are graduating and not finding jobs that pay well enough to cover the debt they racked up by going to college full time.
Not to mention some are stuck with nearly, if not completely, useless degrees, or worse, certificates.
Do employers really value a piece of paper that says "____ passed this course at ____ college/university" that much?