MIT dean to start new university: “No majors, no lectures, no classrooms” (tech.mit.edu)
376 points by ilamont on Jan 28, 2016 | 274 comments


I think what we need are apprenticeships. I.e., a prominent research group hires kids after the remedial first two years of traditional college curriculum (or something like it, plus a couple of years in an entry-level job), and has them work on the actual problems at hand, filling the knowledge gaps as needed. It is no longer possible to know much of anything, so we should specialize and diversify at the same time, meaning that these students should be working at the seams between the fields: EE/CS, CS/Statistics, CS/Genetics, CS/Mechanical Engineering, etc. Further specialization should occur along the lines of whether the student would like to be a researcher, or have a more employable and practical set of skills. I never got much value out of lectures: few people can sustain rapt attention for hours a day. Most of what I know comes from books. What I did get a lot of value from is working with people who know what they're doing, and from doing things myself, perhaps with their guidance and help. If this is heavily emphasized in this new model, I'm all for it.


I could not possibly agree with you more, having basically lived through this. When I look at the current tuition rates for my alma mater, Northeastern University, it makes me cringe. However, the fact that the default experience at Northeastern requires a student to undergo multiple "co-ops", 6 month paid internships, is genuinely a smart thing in my mind.

By the time I had graduated, I had work experience in embedded systems, a criminal justice stats analysis lab, the Red Sox, and a counter terrorism research center in DC. And that's all well and good, but the most important thing I took from it was that it taught me very quickly which sectors I did not want to work in.

You often come out of high school with this idealized version of what it would be like to work in X field. To learn what that field is really like, and to learn if it's something you truly want to pursue, is a huge win in my mind for an undergraduate.


Another thing that could be helpful is not forcing kids to go to school when they're 18. They have no clue whatsoever what they want to do with their lives, and by and large the motivation to truly learn something is very much lacking. Or at least I lacked it. My MSc with honors is a feat of willpower more than anything else; I ended up using maybe 10% of the knowledge I acquired in those years. Now, if my career had started in some kind of an apprenticeship situation, you can bet I'd have had plenty of motivation to master the field. That's more or less what happened after college, except it's completely backwards from what it should be, and it's extremely wasteful: I ended up learning (and then, of course, forgetting) a ton of not even remotely useful stuff in hopes that I'd need it when I actually had a proper job. Some might say this is a well-rounded education, but I'd rather do it by choice, and not arbitrarily.

The meta point is, traditional education is really broken, and has been for decades. It's good that someone has the guts to go against the flow and do even a modicum of rethinking. The academic community is extremely clique-y and conservative, so it takes non-trivial courage to pursue something like this.


At 18 most kids have extremely strong passion about what they want to do. I know I did. I was wrong (maths vs computer science). But persuading them to take a break from that is going to be hard.


> most

My experience says the opposite.


In my experience, there are three groups of kids in high school, at least concerning future studies/employment.

One group maybe wants to do something because it pays well, with no motivation because they aren't passionate.

The next group is very passionate about one or two things, and has lots of motivation. However, they seem to view that as the only option, even though more likely than not it will change and shift.

The third group has motivation to do work that is interesting, but they have no idea what to specialize in and have a few things they don't want to do. They aren't necessarily good at it all, and it's overwhelming to think about plans for the future because of the open-ended nature of that question to them.

This is what I've observed. And of course, people change between these groups all the time.


The first group are mercenaries. They usually end up in finance, management, law, sales or some other (mostly) boring job, then politics.

The second group are crusaders. They begin a quest for fulfillment of their working potential, usually despite their parent's advice, in anything they deem challenging (even in the same management or law as the first group, but for different reasons).

The last group are peasants. These will either follow in their parents' footsteps (or whatever their entourage happens to be selling them, actually), or will end up feeling lost and helpless.

...and I noticed that people change between these groups too.


That's been my observation too. I don't remember one kid out of high school, or college who knew exactly what they wanted to do for the rest of their lives. (Actually one wanted to be a cop, and another wanted to be Maverick.)

As you age, you will find your interests and passions ebb and flow. When I was in college, I was all about learning anything and everything about medicine. Now, I don't even like discussing lipid levels with family members.

In college I hated computing, and pretty much everything related to technology. I just didn't find much invented by man interesting. I felt like programming was a waste of a life. I couldn't begin to comprehend how someone could stare into a screen for more than a few minutes. Now--I can't put this stuff down.

And to be completely honest, I still don't know how I want to spend my remaining days.


In SDLC terms, college is an education delivered via the waterfall method whereas the workplace requires a continuing agile process.


Not really. I would argue that today college is little more than a weeding out process that weeds out far too many. You sit in classes and solve bullshit problems with known answers that will have very little bearing on the realities of your professional life once you graduate. For the most part people go to college to have a piece of paper that says they went to college and had enough stamina to tolerate this bullshit and not drop out.

What we should be cultivating instead is the capacity for self-directed learning and deep motivation.


> You sit in classes and solve bullshit problems with known answers that will have very little bearing on the realities of your professional life once you graduate.

Gonna argue with this one from the standpoint of Electrical Engineering.

I would argue that almost everything up through junior year gets used every day. Basic circuits and basic digital never quits being relevant (I have pulled out a Karnaugh map every now and then to demonstrate to some programming folks that their conditionals were wrong). Basic electronics and op-amps similarly.

Once you hit junior year, the courses get specialized and don't get so much daily use--electromagnetics, digital signal processing, control systems theory, communications system theory.

The problem is that you don't know which one you like until after you take it. And before you take it, you're not useful to a company, either, so intern/apprenticeship doesn't help.


Karnaugh maps are taught to programmers (on college/university courses) too :) at least in my country, I'd be very surprised if they aren't in the U.S.

I'd actually mostly forgotten about them :) .


I only pull K maps out once or twice a year when some programmer absolutely insists that a conditional is correct when I can see that it isn't.
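The same argument can be settled in code: instead of drawing the map by hand, exhaustively compare the disputed conditional against the intended one over every input combination. A minimal sketch, assuming made-up example predicates (with three variables there are only eight rows, exactly what a K-map enumerates):

```python
from itertools import product

def equivalent(f, g, nvars):
    """Exhaustively compare two boolean functions over all input rows,
    the programmatic analogue of filling out a Karnaugh map."""
    return all(f(*row) == g(*row)
               for row in product([False, True], repeat=nvars))

# A programmer insists these conditionals are the same:
claimed    = lambda a, b, c: (a and b) or (a and c)
simplified = lambda a, b, c: a and (b or c)   # genuinely equivalent
wrong      = lambda a, b, c: a and b and c    # not equivalent (fails at a=b=True, c=False)

print(equivalent(claimed, simplified, 3))  # True
print(equivalent(claimed, wrong, 3))       # False
```

For a handful of variables this brute-force check is instant, and unlike a whiteboard argument it leaves no room for insisting.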

When I teach the class to programmers, state machines are the far more important concept I try to drive home. I use state machines all the time when programming.


...and state machines are hard to represent in most languages! That is a real pity. It's part of the one-dimensional nature of text-based programming, while state machines are two- or three-dimensional ideas.
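One common workaround is to keep the machine as an explicit transition table, so the two-dimensional structure survives in the source instead of being flattened into nested if/else. A minimal sketch using the classic coin-operated turnstile (the states and events here are illustrative, not from the thread):

```python
# Transition table: (current state, event) -> next state.
# The table itself is the two-dimensional picture of the machine.
TRANSITIONS = {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}

def run(events, state="locked"):
    """Feed a sequence of events through the machine, returning the final state."""
    for event in events:
        state = TRANSITIONS[(state, event)]
    return state

print(run(["coin"]))          # unlocked
print(run(["coin", "push"]))  # locked
```

The table reads almost like the state diagram, which is the point: adding a state or event means adding rows, not restructuring branch logic.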


Stamina to tolerate bullshit and get things done despite thinking they're totally idiotic is a valuable skill set in many industries. The piece of paper is also proof that you can follow directions well.


Getting shit done is valuable. Tolerating bullshit is absolutely not.


A failure of many youngsters is that they think everything annoying or boring is "bullshit".

For example, I have to practically threaten employees in their 20's to actually fscking pick up the phone and actually TALK to a customer rep. Yes, it can be annoying--they're going to try to sell you something as that's their job. But some of them also know what they're selling better than anybody on the planet. The only way you will figure that out is if you talk to them.


Don't threaten. Get them engaged, get them to set their own team goals (with your guidance) and get them to agree to the goals. Divide pleasant and unpleasant tasks visibly and fairly. Get rid of people with poisonous attitudes who sabotage this white glove treatment.

Threatening only works in the short term, right up until your best people pick up their shit and leave. In the long term it does not yield the kind of workforce you actually want to have.


> Get them engaged, get them to set their own team goals (with your guidance) and get them to agree to the goals.

And how many times am I supposed to let them order the wrong thing (at company expense) because they won't PICK UP THE DAMN PHONE AND CALL SOMEBODY? I've been mentoring interns and co-ops in engineering for almost 25 years and this set of 20-somethings is just like any other--except with respect to their absolute avoidance of using the bloody phone.

Making a genuine mistake is acceptable. Talk to the rep, order some stuff and see if it works. If it doesn't, no big deal. With the rep in the loop, it'll take one or two iterations, and you'll get what you need. After the third time I asked: "Did you call the rep?" "No." "Do it." After the fourth time: "Call the rep.". "Okay." "No, NOW. While I'm standing here. On conference." <rolls eyes> After 10 minutes of talking to the rep, he's got an order for the right stuff, the rep comped it, and sent it overnight.

The worst part--even after examples like this--THEY STILL WON'T PICK UP THE PHONE. It's absolutely infuriating. This isn't just one person getting old and complaining. It also isn't just my observation--I have heard the same complaint from different people from ages 35 to 55.

I've reached a point where I almost don't want to hire anybody in their 20's.


"Tolerating bullshit is absolutely not (valuable)."

I agree with this bottom line, but I sense a lot of politics in there around what is or isn't called bullshit. Getting opinions when you ask for them is valuable; having to fight your employees to get them to do something you are paying them for is absolutely not.


I've had a hand in hiring no less than three people from Northeastern University, and every single one of them has hit the ground running in their role. I can't speak highly enough of their program's co-op requirement. It's one of the smartest curriculum moves I can cite.


I think internships are extremely useful! I went through a lot in high school and college trying to figure out what exactly I wanted (or didn't want) to do. Now, I have a number of interns and some of them have gone on to graduate school in physics. Others have decided that physics isn't for them and have decided to do computer science or math. I try to give them a chance to have a self-contained, meaningful project.

Do any of you have any suggestions for remote interns? I tried it once and though both the student and I were motivated, the time difference was a killer. Also, I rely a lot on random interactions with my interns. I think it would require a lot more structure to pull it off...


Startup idea #4096

---

A YC apprenticeship program that combines internships with YC companies with trade classes from various resources including techshop, crucible, and others.

There are so many creative people in the world that simply need mentors and resources.

The Bay Area can really provide all of this!

There are likely tens of resources I am glossing over, but if the YC community is to prove that we can solve this issue, we have everything we need to do so. We just need a leadership umbrella (YC) to prove it, and the gumption!

Who's willing (other than me, and I will dogfood this!)?


How do you gel the 'move fast and break things' ethos with an internship / mentorship role? The former requires existing domain knowledge to hit the ground running and achieve large things quickly, whereas the latter requires going more slowly, explaining the reasoning behind decisions, allowing mistakes as learning processes, etc.

edit: Unless you mean established former YC companies who are no longer in the startup mindset. I didn't consider that originally but then why limit yourself to YC companies if you can snag some sort of vetted supply to the Facebooks/Twitters/etc. of the world?


Agreed, and that's not even limited to tech. I'm in the commercial real estate world, which practically revolves around apprenticeships, and the system works amazingly well. It also rewards competence over credentials, commitment to improvement over connections.

Trying to make the world better (with initial guidance) and solving real life problems is the best teacher of all.

College training for me, or for all the apprentices I've had over the years, was of minimal importance beyond learning the discipline of a rigorous approach. I could have started after sophomore year, and arguably done better (without having spent 2 more years at school and 2 more years at a tech startup as an employee, and with 1/2 the student debt and bad cafeteria food).


How is this different from existing internships? Or are you suggesting that they be made graduation requirements?


This would be similar to internships in research labs (for the academically inclined) or startups (for the product inclined). This is not at all similar to an internship at a traditional company, where you're given some kind of a bullshit problem that no one else wants to touch. Doing research or building a product (or, preferably, a mix of both) should be a central part of the curriculum, not an afterthought which you might or might not pursue.


Yes, part of it should be that the student gets academic credits that count towards graduation. Even establishing minimum and maximum quotas of which credits come from apprenticeships is healthy with moderation.

But a bigger part of it is that the university must provide oversight so that the actual tasks assigned to the students are relevant to their academic goals, and that the complexity of the tasks increases progressively in order to present the students with renewed challenges according to their growing skill set.

Without this formal oversight, the whole thing could easily degenerate into a shady personal business where the school's officials "rent" their pupils' slave labor to the highest bidder.


I would argue even if they do rent "slave labor", if the tasks are appropriate to the chosen specialization, and the cost to the student is lower overall (ideally zero for tuition), it'd be an improvement over the status quo.


What I am hearing is that, as long as you do "something with computers" and it does not cost any extra money, you'd be happy to exchange 4 years of your time for a BS in CS?

What about getting - gasp, the horror - an actual job as a data entry clerk? You know... something that pays you money instead of making the dean richer with your efforts?


Compared to what, though? To paying tens (up to over a hundred for top schools) of thousands of dollars for the same thing but without practical skills? Ideally, people would be leaving school with something to show for it: open source code, launched projects, co-authorship on research papers, stuff like that. Or better yet, leave as cohesive teams, ready to rip in the public sector. Or even better: leave as a part of a startup with their own product (accelerator model), while school would retain minority interest.


So, now you want to write open source code, launch projects, co-author papers,... not on my watch, kiddo.

Now go put your nose down on those damned spreadsheets. Copy from column A to column Q - no, no macros, thankyouverymuch - but convert everything to UPPERCASE, and fix typos please, unless that was not a typo, in which case don't.

And remember, the Dean already cashed the check I sent him, so you better hurry up. I want no less than 100 rows per hour. Otherwise you flunk the semester, and while you don't pay tuition, your parents are going to get tired of having to support your lame ass if you do not start to bring a wage home one of these years.

Is that "practical" enough for you?


I'm not even going to dignify this with a proper response.


The only point of a modern day university is that they are (a) exclusive, and (b) "guarantee" that students "know" what they are "taught". That's at least what people who hire think.

I remember someone saying "It wouldn't matter if MIT only taught basket weaving as long as they still had such a thorough selection process". This holds true here. People only care about the pedigree of your degree; its substance comes second in my experience.


"guarantee" that students "know" what they are "taught"

That used to be the case, but now corporations require you to pass their own exam before they hire you. Let that sink in a moment. No longer are the universities taken at their word about their product.

For some of you it has never been any different, but this change came about more or less when I was teaching at university/college. Some of us saw this as a quite profound change.

In my view, that was the first signal of the beginning of the inevitable disruption of modern higher ed.


> That used to be the case, but now corporations require you to pass their own exam before they hire you.

You mean an ... interview?!

Ok, that was purposely obtuse, but I don't see how an extension of a company's interview process (ensuring, through their own testing, that candidates meet the proper qualifications without school bias) is a bad thing. Or how it signals an 'inevitable disruption of modern higher ed'. Would you mind expanding on that a bit?

To me, that helps keep schools honest and continuously adapting their curriculum's, while ensuring that the best candidate gets the position, regardless of the institution they attended.


Your conclusion is right. To me, that helps keep schools honest and continuously adapting their curriculum's (except for the misplaced apostrophe).

In other words: Sure, you have a 4.0 in engineering from Stanford. But now let's see if you can actually compute a Fourier transform. Or even if you know when to use a FT.
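That kind of screening question is cheap to pose precisely because the answer fits on a whiteboard. A naive discrete Fourier transform straight from the definition, as a candidate might sketch it (pure Python, illustrative only, not a production FFT):

```python
import cmath

def dft(xs):
    """Naive DFT from the definition: X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N).
    O(N^2); a real FFT library would be used in practice."""
    n = len(xs)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
                for m, x in enumerate(xs))
            for k in range(n)]

# Sanity check: a pure tone at bin 1 puts all its energy in X[1].
signal = [cmath.exp(2j * cmath.pi * m / 8) for m in range(8)]
spectrum = dft(signal)
print(round(abs(spectrum[1])))  # 8
print(round(abs(spectrum[0])))  # 0
```

Knowing *when* to use it (spectral analysis, fast convolution, filtering) is the part the degree is supposed to certify, and is exactly what such exams probe.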

To my generation, that's damning. In other words, (quoting you here) ensuring, through their own testing, that candidates meet the proper qualifications without school bias is the new normal, and it's different in a way that says we don't/can't trust the universities to deliver their product any more. The status of the universities has changed in our society, and that, in my opinion, is an early signal for a coming disruption.


It just means that employers are now hiring for competency in job function, and not just plucking the upper class elite to get access to their rolodex.

IOW, it's not that schools are getting worse at teaching, it's that they are getting better at teaching, and less about filtering elites.


I think we are still quite a long way away from removing the school bias.

In my experience, the large tech companies actively recruit from a select few universities (and prefer applicants from those universities). The resume screeners / sourcers are biased to look for top schools also. They often don't want to risk spending time on a candidate from a school with which they are not familiar.

Most qualified applicants never have the opportunity to prove their skills in a phone screen (e.g. coding interview). The school bias continues to exist at the resume screening stage.

It's not that different among startups (especially YC alumni).


We don't need to remove school bias. We need to create new/better schools that others can trust will put out better results than what comes out of universities.

I can't tell you how much I have urged, at times forced, my friends in CS who hadn't been learning programming since 7th grade (as I had) to do more before they graduated. Those who listened and did an intensive summer apprenticeship with a student-founded web agency I had worked at previously have progressed to the point where you probably couldn't tell they came from my university (which is to say, you've probably never heard of it, and if you got to know it, you wouldn't think very highly of it).


There's a good story in Ben Horowitz's book "The Hard Thing About Hard Things" that angered me about how blatant the good-school-vs-weak-school bias appears to be within the startup world.

But it has good advice on the need to overlook people's "flaws" (if you can call them that): -----

"Hire a worldclass team" is about as helpful as telling someone to "Try their hardest." Anyone building a company likely already is, and if not, you telling them isn't going to suddenly make them try harder.

You know what's hard? Hiring a world-class sales manager when you have a company that is trading at less than cash in the wake of the dot-com bust. Astoundingly, the best sales managers in the world just weren't returning Horowitz's calls. So instead-- in one of my favorite sections of the book-- he describes hiring Mark Cranney.

It was a decision most of his board and his executive team were violently against. (Lesson to Horowitz: "No one else gets a vote.") Cranney actively made people feel uncomfortable-- not what you want in a sales guy. Horowitz describes him as physically looking like a perfect "square." But he was a savant at how to build an effective sales team.

My favorite passage is when Horowitz sat down to explain to his cofounder-- and to many, the face of the company-- Andreessen why he was hiring him:

I let Marc open the conversation...by listing his issues with Cranney: doesn't look or sound like a head of sales, went to a weak school, makes him uncomfortable. I listened very carefully and replied, "I agree with every single one of those issues. However, Mark Cranney is a sales savant. He has mastered sales to a level that far exceeds anybody that I have ever known. If he didn't have the things wrong with him that you enumerated, he wouldn't be willing to join a company that just traded at thirty-five cents per share; he'd be the CEO of IBM." Marc's reply came quickly: "Got it. Let's hire him!" That is the reality of how you hire as a startup CEO going through any degree of shit, which, let's face it, they all are. Unless you are Facebook, you can't call whoever you want and offer them a job. (And truth be told, even Facebook doesn't have a 100% batting average on hiring.) You have to find the person the best at the single unique skill you need and tolerate everything else that comes with them; that everything else is the reason they aren't running IBM despite their skills. That is helpful hiring advice.

(https://pando.com/2014/03/04/the-hard-thing-about-hard-thing...) ------

Anyways, personally I prefer companies using screening tests over degrees because it is more meritocratic.


That's a very awkwardly written story, but "know what league you are in" is solid advice. It's Moneyball in another domain.


i'm from germany and never attended a good university (so it's entirely possible that none of this happens to graduates from mit, i don't know), but i'm pretty sure this is what he meant.

i applied to several different companies for a pretty low position around 3 years ago. one of the companies was MAN and I had to pass a) an online test for preselection, b) a written exam on premises and finally c) an interview.

i fucked up in the interview, but they still mentioned that they would need at least a second interview, probably two, before seriously considering me for this position.


European companies are a whole other thing though. In most big EU corporations you get hired for position, not skill. The position is ranked relative to all other positions and your rank is determined through your school/education.

Case in point: it's very difficult to get hired at a German or Swiss company with a US engineering degree, regardless of the university's standing. I had to go through that myself; it's ridiculous.


It's surprising that it's taking so long. The wealthiest have always found ways to preserve their wealth and Ivy League institutions have played a role in that. Take some smart people, a lot of people that are good at fitting the mold that looks best, mix your rich kids in there and call the paint one color.

Remember, George W. Bush (all politics aside, just looking at him in terms of intellectual prowess) graduated from Yale and got his M.B.A. from Harvard. That's one easily identifiable example but you can find many. And then you ask yourself, if you can take a bus that size through a fortress wall, can you still call it a fortress wall? And then it's clear for what it is -- decoration. Worse: it's signalling, which is the opposite of a meritocracy.


What do you really know about George W. Bush?

http://keithhennessey.com/2013/04/24/smarter/


Love that piece. It's worth pointing out also that Keith was there only for the final year of office, rather than near the beginning, which is one of the best pieces of anecdata against the post-1994 ongoing dementia theory (which a priori I still think is more likely than the "playing dumb" theory). There is of course other anecdata on Bush not being as intelligent as Keith claims, but the big issue that should make most people have low confidence in either extreme is it's hard for people to form an accurate mental model of the President. The 4th image from http://www.ribbonfarm.com/2015/04/08/the-essence-of-peopling... highlights the issue.


I don't much like his policies, but I always had a big question mark in my mind when it came to Bush's raw intelligence. He spoke like a man who didn't know how to speak in public, not like someone who was ignorant or addled. There was a peculiar absent-mindedness to his gaffes ("misunderestimate", "is our children learning") and so forth but it had a more "way out of his element" feel rather than coming from a place of stupidity. I doubt I'd be able to do much better in front of a large and frequently very hostile audience.

Part of my problem with American politics is that so few know how to think clearly about the actors involved. I give Bush considerable benefit of the doubt when it comes to mental capability, but I don't agree with him. And I dislike Obama for his failure to uphold the values he got elected on, not (as many feebleminded urbane Democrat voters suppose) because of his skin color.


> He spoke like a man who didn't know how to speak in public, not like someone who was ignorant or addled.

His problem went beyond not knowing how to speak in public, which, by itself, is not a sign of not being intelligent. He was truly very dull, something he himself admitted to ("I'm not the sharpest tool in the shed").

Aside from that, if you passed through college, then through grad school and always referred to the Internet as "the internets", and to "the google", then public speaking is the least of your worries.


> He was truly very dull, something he himself admitted

You may wish to rethink this.


You'll have to give me good reasons to. No, downvoting is not one.


Was not the linked article sufficient?


It's revisionist history.

Bush is clearly not as dumb as he was caricatured. But he certainly couldn't lead in any way on his own without daddy's entourage telling him what to do.


Could any head of state? The depth of expertise required to hold that kind of position is far more than one person can hope to have.


The linked article was a fluff piece by a friend of his.


What do you mean by this?


Dull people don't usually call themselves dull. "Reasonably intelligent" or "average" are more common.

Think, was there perhaps a strategic reason to paint himself so?


Think, was there perhaps a strategic reason to paint himself so?

Socrates, at his trial before a jury of Athenians, is said to have used a variation of the Unfrozen Caveman Lawyer defense to provoke sympathy: "I'm just a humble country boy, not worldly and sophisticated like you", etc.

Not that Bush is any Socrates, but the stratagem is the same.


It's actually better for GWB's legacy if he was an easily manipulated simpleton. If he's truly a secret (to all observable evidence) genius, that also makes him a heinous war criminal and corrupt crony capitalist.


To add to this: as a student in India, I would often run across people who thought Bill Gates was an idiot. Not sure where that caricature arose. But some people simply hated Bill Gates, and believed he didn't deserve to be the richest person in the world.

I'd have to talk to them and explain he was a workhorse extraordinaire, and the equivalent of a great project manager, programmer, and businessman all rolled into one.

It's very common to hate successful people who don't fit our narrative of what a hero should look like.


Nobody I know hates GW because he's successful. They do because of the mess he got us in. Yes, he was led to taking the actions he took by smarter and more devious people around him, but the ultimate responsibility was with him, or on him. He was the president.


I don't know about US politics. But I have seen people unjustifiably bash successful people. In fact, it is commonplace in today's feel-good culture to deride other people in order to make one's own shortcomings look acceptable.


Bush?

When asked a question, he'd never give an answer that showed any real depth of analysis.

Now, he certainly could have done the analysis all the time, but he never spoke about it, so I did not have any insight into what his thinking process was.

Now, I never thought he was dumb but his persona of well-mannered everyman never slipped even one bit, so if it's a mask, I have no idea what is beneath it. He could be average or a genius or low intelligence with extreme charisma. I don't know.


> When asked a question, he'd never give an answer that illustrated a deep depth of analysis.

You wouldn't either if you were talking to the general public.


I've met a couple politicians I considered dumb, and was very surprised at how smart they were in person. I'd be very careful concluding any President is a dimwit.


that's really fascinating. I'm not a liberal, but very opposed to the Bush presidency; I'm entitled to be, since I voted for him once. And I always believed him to be stupid. And I think these days I do the "No intelligent person could conclude X, therefore President Bush is unintelligent" thing, because if he's dumb, the policies he executed that I disagree with are just a matter of ignorance, and not evilness. I want to give him the benefit of the doubt.


Assuming that that article is correct, all that proves is that smart people often make dumb decisions, which we've known for a long time.


How on earth do such things not surface more widely? W. Bush was an idiot's idiot for decades. His 'speech fumbles' are still very bad in form and "content" to me. I'd never have expected to read anything of that sort about the man.


If Bush is that smart, what does that tell us about his questionable decisions on other things? Why start a war in Iraq based on WMD's that weren't there, and a government that was not in league with Osama bin Laden? Why start dragnet surveillance? Why let the CIA torture people? Why do the ridiculous tax cuts at the same time we're spending big dollar on a war?

If Bush is that smart, he must have good motives for those things, or if not good motive, hidden motives.


There's a lot of people whose intellectual credentials are unassailable that supported all of those programs. The differences in opinion come down to values and ideology, not intelligence.


I've always been a believer that smart people can come to very different conclusions on how to do things.

Just look at the number of people who fall within the Alan Kay camp of mutable state vs those who like Haskell.

Look at those who think Lisp should have ruled the world versus those who believe Prolog just never had a fair shake.

There's a bunch of issues that reasonable people can reasonably disagree on.

Granted, when it comes to politics I believe all motivation is hidden, because true radicals can't win the vote by coming out and telling people they plan a revolution. They have to claim to be only slightly off center, then continually reframe the issue till people begin to see things their way.

Guantanamo Bay was a great example. If you'd asked Congress "do you want to authorize a secret camp to torture dissidents just offshore from the continental USA?" then they would have said no. Yet now that the camp exists, the question has been shifted to "Do you want to close our black-ops torture camp, and move all those people with grievances (legitimate or not) against us to your home state's prison system?"

Now even Congressmen who hate torture feel unwilling to go down in history as the one who let 'terrorists' into gen-pop, even if in fact our prison system is perfectly capable of handling dangerous individuals (like all the homegrown terrorists we have).


I can't see any scenario where deciding against extending surveillance in that manner is the wrong decision for the executive: the potential downside of not doing it is very high (in the event of an actual attack you will be hit by your opponents for endangering the public); the potential downside of doing it is only criticism from the very small number of voters who take communication privacy very seriously.


This isn't a counterpoint to Bush actually being intelligent, but the most convincing theory I've heard about the war in Iraq is that Saddam Hussein was going to sell oil for Euros instead of Dollars (and he had formally announced those plans). The idea is that speculation about future oil not being bought and sold in dollars would devalue the dollar as countries would hold less in their reserves, and this would start a more rapid cycle of devaluation.

EDIT: Looks like people don't even want to deal with this as a theory.


You should really go over the WMD stuff again. The history is as follows: post-Iran-Iraq war, Hussein had a WMD development program, and then the UN inspections regime came and put several of its weapons "under seal". In the runup to the Iraq war, the US repeatedly asked where these sealed WMDs were, and Hussein refused to disclose their location or allow inspectors to see them, and the US spun that as implying that he COULD use them to supply terrorists with WMD (technically true, unlikely to actually happen). Ultimately, after invading, the US found them, still "under the UN seal", unable to be used in any military context.

The US also never argued that there was a direct connection with Bin Laden, just that Iraq was covertly supporting terrorism. Which is similarly flimsy, limitedly true, and definitely manipulative of the US public.

So, the Bush administration was incredibly deft at taking minor truths and spinning them in the public's mind as molehills that justified bigger action. Not to justify their actions, but in the strictest sense, Iraq did have WMDs, and Iraq did support terrorists.


And why did Hussein refuse to disclose where the WMDs were or allow inspectors to see them? Because he wanted to bluff neighboring countries (primarily Iran, I believe) into thinking that he had them and would use them in a pinch. But that bluff ran into the US's desire to control who had WMDs, with disastrous results.


George W Bush plays much dumber than he is. It's hard to get an unbiased opinion on the matter since it can be used to score political points, but he definitely has >100 IQ.


I'll just leave this here: "George W. Bush, Republican, 1946–present, 2001–2009, IQ:138.5" --> https://en.wikipedia.org/wiki/U.S._Presidential_IQ_hoax#IQ_e...


I think GWB is probably pretty intelligent, but this is basically psychic "remote sensing" applied to IQ testing. Is there any evidence this technique of estimating IQs works?


I love it. For everyone who believes a study that gives him an IQ of around genius level (and for every US president -- look at that!), I actually own the Brooklyn Bridge and I'm looking to sell it.

I used the wrong example: nothing creates cognitive dissonance faster than politics or religion. But, that said, it's not a show. He's quite consistent in the types of grammatical mistakes he makes, such as simple subject-verb agreement, and they indicate that he's either aphasic or mildly retarded.


I am not in any way endorsing the man. But I will point out that he flew a fighter jet...

'Upon its completion, Bush was promoted to the officer's rank of second lieutenant required for pilot candidates. He spent the next year in flight school at Moody AFB in Georgia from November 1968 to November 1969. The aircraft Lt. Bush trained aboard were the T-41 Mescalero propeller-driven basic trainer, T-37 Tweet primary jet trainer, and the T-38 Talon advanced jet trainer. Bush ranked 22 out of 53 students in his flight school class with a grade of 88 on total airmanship. His scores included 100 for flying without navigational instruments, 89 in flight planning, and 98 in aviation physiology. Bush also completed two weeks of survival training during this period.

Bush then returned to Ellington in Texas to complete seven months of combat crew training on the F-102 from December 1969 to June 1970. This period included five weeks of training on the T-33 Shooting Star and 16 weeks aboard the TF-102 Delta Dagger two-seat trainer and finally the single-seat F-102A. Bush graduated from the training program in June 1970. When interviewed by the Associated Press in February 2004, flight instructor Maj. Udell recalled that Lt. Bush was one of his best students saying that, "I'd rank him in the top five percent."' --http://www.aerospaceweb.org/question/history/q0185.shtml


Yeah, the Air Force tries hard to wash out dimwits before they get behind the controls of an advanced jet fighter.


Grammar (in English) is a tiny subset of human knowledge. Being bad at just one thing hardly makes you dumb. However, consistent grammatical errors during a speech will definitely project the appearance of stupidity...


...especially if it's the only language one speaks fluently.


Well 100 is average so you're pretty much riding the top of the bell curve with this comment.

But then, there are house plants that decided not to invade Iraq and Afghanistan...


Why are you lumping Iraq and Afghanistan together? Those were vastly different situations.

Note to down voters: do some research. First of all, there was widespread support for invading Afghanistan in both the US (over 90%) and in Afghanistan (also in the 90% range). This was not the case for Iraq.

Second, Iraq was based on questionable intelligence that there might be things going on there that could potentially lead to attacks on the US at some future time. Afghanistan was in response to an attack that had already occurred by groups that were operating out of Afghanistan.

Third, Congressional support for the Afghanistan invasion was 420 yes, 1 no, 10 not voting in the House, and 98 yes, 2 not voting in the Senate. For Iraq, the House was 297 yes, 133 no, 3 not voting, and the Senate was 77 yes, 23 no. FFS, even Ron Paul voted for invading Afghanistan.

There is no way you can make a plausible case that Bush supporting the Afghanistan invasion indicates low intelligence, whereas you can kind of make such a case for Iraq.


Well, you can make a low-intelligence claim for Afghanistan too, given how many others have tried that in the past and failed miserably. But history and constitutions are just pieces of paper; he wanted to do what he wanted to do because he wanted to do it, even though neither country attacked the U.S., which made attacking both of them a violation of the U.N. Charter, to which we are of course signatories. So I prefer suggesting he had a low brain cell count rather than suggesting he's malevolent, but those are the binary options in my world view when it comes to George, and to anyone else who voted for either of those invasions.


How is invading Afghanistan against the constitution?

>even though neither country attacked the U.S

Afghanistan attacked the US by allowing a terrorist group that had declared war on America to plot an attack on the United States from its territory.

>and therefore attacking both of them was a U.N. Charter violation, which we are of course signatories to.

The UN Charter allows for self defense AND the UN authorized the attacks. In fact the ISAF is a UN operation.


Your argument is: 1) George W. Bush went to Yale and Harvard Business School (not the other way around, btw) 2) George W. Bush is not intelligent

Therefore, one need not be intelligent to graduate from those schools

You may be right about your conclusion, but you don't support premise #2 -- rather, you just assume we all agree.

Why do you think he's dumb?


The word "meritocracy" was originally invented as a satirical word describing "signalling by people who pretended they weren't signalling".


While companies do sometimes give tests to candidates coming out of college, it is rarely the case that this is more than a crude in-house diagnostics tool that attempts to gauge very narrow job skill-sets or aptitudes.

There is no "Exam".


> corporations require you to pass their own exam before they hire you

If you're talking about software interviews, know that you need to have pedigree even to get an interview. See Twitter or Palantir recruiting.


> > corporations require you to pass their own exam before they hire you

> If you're talking about software interviews, know that you need to have pedigree even to get an interview. See Twitter or Palantir recruiting.

Counterexample: I currently work part-time (alongside my undergraduate studies) at a very well-known Linux software company. When I was interviewed, there was no mention of any "pedigree". It was simply that I had contributed to the free software project they were looking for engineers for. Simple as that. You'll find that competence is far more important than pedigree. I've always hated the concept of pedigree; it disadvantages people who have skills but are poorer than others. We ought to push for a meritocracy. As a separate matter, higher education should be much cheaper (or just made free).


Not just software: it started in engineering before software. My point is just that: pedigree is no longer trusted.


On higher Ed:

There is far more to a degree than the classes and the piece of paper with a GPA. It is also an environment where young people grow as themselves for the first time in their lives in many ways. There is also more that can be done to ensure strong job placement, as my alma mater, RIT, does by requiring a full year of paid co-ops before granting a degree.

On hiring:

There are infinite metrics that assign value to a candidate regardless of the school they attended; culture fit, for instance.

However, hiring is an expensive and labor-intensive process, and every moment an engineer is looking over resumes or conducting interviews is time that could be spent fixing bugs or creating valuable features. Pedigree, for better or worse, simplifies the decision. When you have a stack of resumes to deal with, half from small colleges you've never heard of, and the other half from, say, Cornell, whom do you prioritize?


> No longer are the universities taken at their word about their product.

Exactly. Education is a thing where a student voluntarily decides how much effort he/she wants to put in learning, not some software that the professor can install inside their brains. All they can do is check out their past scores to determine how dedicated a student is and only admit such high-grade students. But again, time is fickle and people do change. Lazy and lethargic students suddenly have some life-altering experience and they turn into rank-holders, the opposite can happen too!


Graduating from a top 20 school increases the probability that you'll even get to take that "corporate exam". Recruiters use them as filters.


> That used to be the case, but now corporations require you to pass their own exam before they hire you

I'm seeing this as well and am developing a learning / training platform based on this thesis for professional dev.


It's good to know that the teachers/professors who still care are aware of how bad things have gotten.

Do you, as a former/current educator, see things changing in the future?


I see two things. One is what I've already discussed. The other is something perhaps just as interesting.

Colleges and universities go up for accreditation every ten years (perhaps it varies, I'm not the world expert on this but I've been part of the process). A large part of this process is a self-study conducted by the faculty and staff of the university. In the recent (one-to-two accreditation rounds) past there have been high profile universities that have failed that process. The thing that I find interesting is that the faculty at these universities (to the best of my knowledge) don't particularly give a shit about this process. This accreditation means very little, and is not worth them putting time into. Now, accreditation by their respective professional society is quite important- the Chemistry department cares deeply about what the American Chemical Society (or whomever it is) says, but not about what the North Central Association of Colleges and Schools says. This is exactly the way that certificate programs work: Learning to be a Honda mechanic, for example, requires a program accredited by Honda, not specifically by the NCACS.

So it could be that the idea of going to a university and spending $100K on a "broad education" becomes a thing of the past for all but the true scholars, and most people start turning to specialized education for jobs like engineering and computer science. Just a hunch.


By this reasoning, you should put a list of the universities you were admitted to on your resume, not the university that you actually attended. Under these terms, what's the difference between going to Harvard for a couple of semesters and dropping out, and getting admitted to Harvard but deciding to attend Dartmouth instead?


For me, being at MIT has been much more valuable than my time at RPI (my alma mater). And that's because lots of cool, innovative, and intelligent people flock to MIT. When hosting an event, MIT is able to get the best speakers in the world to show up, because the speakers are excited to be at MIT. I've had conversations with people who would never have come all the way out to RPI because RPI just isn't as interesting.

And so it reinforces itself. Intelligent people flock there because intelligent people flock there. But it also means that the time spent there is more valuable.


To be fair, RPI is in Troy, NY, and MIT is in Cambridge/Boston, MA.

Nonetheless, I agree with you. Location is probably the deciding factor for only a minority of the people who would rather go speak at MIT than at RPI.


And what do you think is the advantage of Cambridge over Troy? At least in part, it's the fact that it's where people who go to MIT live. And once they graduate from MIT, there are plenty of companies just across the street waiting for them, having located their offices in part for the convenient access to MIT. It's not clear that Cambridge would be at all the same without MIT and Harvard.


If you have not joined the university you are not part of the "in" group. There is a perception among people that if you attend a "prestigious" university that you are smarter than someone who attends a community college.

I find that crazy. Both teach basically the same thing. If they are both accredited, then it is almost indistinguishable.

Who would you think is smarter off the cuff, a Harvard graduate, or someone who attended an accredited community college and followed it with a masters from an accredited online school?

I'm sad to say that I've been programmed to assume that the Harvard graduate is smarter. It is just a bias that we are all taught as we are young.


> I find that crazy. Both teach basically the same thing. If they are both accredited, then it is almost indistinguishable.

Speaking just on computer science here, but the depth of CS courses varies radically across universities. Sure they all might follow the ACM model curriculum, but how much of the book you're actually reading (and which book you're using in the first place), the number and difficulty of practice problems you're assigned, etc. is definitely not basically the same.

I went to an average engineering school, and also took a handful of grad classes. Many of the assignments in Master's classes were a similar difficulty level to the assignments in undergrad MIT classes on OCW. Not all of the courses were this way, but many were.


It's not just the depth, but the breadth as well. Better universities do an amazing job of teaching their students how to learn. The curriculum is thoughtfully designed to produce well rounded students who are capable of achieving expertise in their chosen field (they don't leave college as experts by any stretch.. and THAT is the disconnect surrounding education these days).

The big difference between someone coming out of a higher-end school and a community college isn't (necessarily) intelligence or even knowledge, it's that the university educated ones have been prepared to reason about the world in a more complete way. Which is comforting for me to know when I'm hiring.

I should note: I didn't finish college (I made it through three years). My wife on the other hand went on to a PhD. At 24 I was convinced that I had missed nothing. It wasn't until about a decade later that I began to understand that she was capable of seeing so much more nuance and detail than I was. Partly because she's smarter, but partly because she had so much more substance to her education. It gives her a basis to reason from that I don't have. It's a gap I'm feverishly trying to close through completing my own education... but it sure would have been easier to have done it in my 20's...


> it sure would have been easier to have done it in my 20's

Not necessarily. It might have required less determination to go along with what others around you were doing, but a lot of people who try learning subjects like calculus as adults report that it seems easier. They have better focus, and most important, they understand why they are doing it.

In short: it's never too late! Study on!


For someone who can't see nuance, this is a remarkably perceptive comment.


An interesting thing about OCW and some of the moocs is that they allow us to see what different schools are like, to some extent. Even with the semi-watered-down versions of classes from Stanford and MIT, I definitely feel those schools in particular are more challenging than most others.

Other Ivy League schools, not so much, at least judging from the watered-down versions, and with some exceptions from notables (Financial Markets taught by Robert Shiller, for example).

Most schools, at least in the mooc versions, tend to be about the same level, regardless of prestige.


That's really cool. It would be interesting to see a list that highlights those courses that are the notable exceptions.


> Both teach basically the same thing.

They should, but they don't. Teachers make a difference. I'm at a mediocre university and watching MIT lectures has made me realize I'm missing out on literally mountains of knowledge. Never mind the qualification process, spending several years with intelligent people (both students and faculty) makes a huge difference.


Keep in mind that any videos you're seeing are out there because they're mostly very good examples of lectures. Any large research university has lots of classes with very poor lectures. Not to say that the material isn't in depth and peer/TA interactions aren't valuable but don't assume YouTube and OCW are representative of lecture quality.


I would assume the Harvard person was smarter (unless their father also went there AND they have a building named after them there), simply because anyone can get into the Community College but not Harvard. That, in and of itself, is a filter. So it's not biased to think that.


Intro courses are the same (calculus and basic chemistry are calculus and basic chemistry) but once you get past anything more difficult than 101 you see a wild disparity in difficulty across different universities (some universities will have material as junior-level undergrad that others have as graduate). Community colleges don't even try to teach the vast majority of courses taught at the smallest least prestigious 4-year university.


I went to two different state schools. 4000 level courses at one were 2000 level classes at the other, and vice versa. I lost 30+ hours transferring, because courses didn't meet the rigor, or they weren't even courses at the better university. For instance, a 4000 level linear algebra course counted as half of a calc 2 credit.


That in and of itself is a weird equivalence to draw. Calc 2 and Linear Algebra are radically different courses...


>Intro courses are the same (calculus and basic chemistry are calculus and basic chemistry)

I went to a tech school. I also went to community college over my first summer for some basic classes. The community college classes were laughably easy and most certainly weren't anywhere near college level, and this was considered a "good" community college.

I hear that in state schools they pack 100+ people in a lecture hall for biology 101 or whatever. That seems like it would be difficult to learn that way. I went to a private school and there were no giant classes.


Someone attending MIT has the opportunity to expose themselves to far, far more than someone who went to community college. MIT has far more resources, and cutting edge research is taking place there.


Don't forget access to the faculty and your peers.


When I was at Caltech there was an informal weekly thing where students in the freshman physics class who wanted some help or wanted to discuss some things from class in more depth could go as a group and get help from or discuss things with Richard Feynman. (One of my regrets is that I did reasonably well in freshman physics and was not smart enough to realize that spending time chatting about physics with Feynman would have been worthwhile even if I did not need it to pass the class).


I'm not U.S. based, so I don't have a total understanding of the whole "prestigious" university vs. community college debate. However, I find it hard to believe that they teach basically the same thing.

I'd believe that their curricula are essentially the same, but that's what's "on paper", so to speak. Even if they're both accredited, wouldn't better faculty lead to better teaching? Or better research programs and partnerships with industry? Even stricter requirements and evaluation criteria. If at least some of these aspects are true when talking about "prestigious" universities, then the bias you describe isn't that bad of a heuristic.


> Even if they're both accredited, wouldn't better faculty lead to better teaching?

No. Generally, at research institutions, faculty are not chosen for their teaching ability. They are chosen for their research background. They are often horrifically bad teachers, if they even teach the class at all and don't just assign it to a TA.

It also doesn't take a world-renowned genius to teach an undergraduate or even a Master's program. It takes someone with a very solid grounding in the subject, which should be any graduate, who also happens to be a great (preferably enthusiastic) communicator.


I can see your point, but I believe this roughly depends on the education level we're talking about. I did my degree quite a few years ago, but finished my PhD last year. At least in the higher-level, when "learning" isn't exactly the classroom-like experience, I'd rather have a brilliant researcher but lousy teacher as a professor (instead of the opposite).


Absolutely, at the highest levels, where you are doing research, working with researchers, and working in labs, then you definitely want brilliant researchers hired for their research background. I just don't think that's true for the majority of undergraduate or even graduate students. And (I think), universities are very good at hiring for research ability.

Hiring for teaching ability they are not good at, even when they intend to.


> No. Generally, at research institutions, faculty are not chosen for their teaching ability. They are chosen for their research background. They are often horrifically bad teachers, if they even teach the class at all and don't just assign it to a TA.

I will have to say that I have had a few, a select few, of those bad teachers.

As a TA, I can also say that I am thankful that my professor doesn't stick the load of our lecture work on me, but I have heard from many other TAs that this does happen. From what I understand, I'm VERY lucky that I am working with an amazing teacher.


I know this is more anecdotal, but I don't see that at least where I attend.

My school allows transfer credit from all of the neighboring community colleges.

I also have friends who attended my same high school who are attending community college. We have traded notes on classes and it seems like we do learn the same material.

The only difference is that my school packs every class with 20+ people while their school has at max 15 people per class.

This is also very anecdotal, but the teachers I have met from their school are also more focused on helping students than on getting all of their hours of teaching done for the week so they can continue their research.

Again, this is only my experience.


I've found that, for some classes, you may get a better experience at a community college than at a "real" university, just due to class size and personal attention from the instructors. CC's tend to have smaller classes, and teachers who aren't so swamped with students (and research) and can therefore spend more time helping the students.

This may not be universally true, but it's something I've observed.


I agree. I have been considering doing this with my local community college since we have 1:1 transfer with my university. The classes also cost like 1/4 of the cost.


The classes also cost like 1/4 of the cost

Good point. That was a big reason I started my education at a community college. Classes were cheap, small, and there was an agreement for automatic admission with full credit transfer between that CC and the university I wanted to attend.

In addition, the CC was also 10 miles from my home instead of the 35 miles to the university campus, which was one more reason. For me, it was a no-brainer to go to the CC for two years and then transfer.


>Who would you think is smarter off the cuff, a Harvard graduate, or someone who attended an accredited community college and followed it with a masters from an accredited online school?

"Smarter off the cuff"? After reading about Harvard's interview and admissions process, combined with their 98% 6-year graduation rate, I wouldn't think much about a Harvard graduate without speaking to them.

>I'm sad to say that I've been programmed to assume that the Harvard graduate is smarter.

Programmed by whom?

>It is just a bias that we are all taught as we are young.

Who are "we"?


Getting admitted isn't as valuable a signal as getting a degree. Both are a signal or proxy for intelligence (IQ), but getting a degree is also a signal of persistence and conformity to get through the grind of college courses.

Bryan Caplan has written much about 'signaling theory' and education in posts like this one: http://econlog.econlib.org/archives/2006/02/mixed_signals.ht...


>Getting admitted isn't as valuable a signal as getting a degree.

It depends. In the limited world of software startup hiring, a compsci student that got admitted to MIT/Stanford but drops out after his freshman year conveys a more valuable signal than graduating from the University of Alabama (a much lower ranked school) after 4 years of study.

If it's an older institution doing the hiring such as IBM or Goldman Sachs, they'll want that official piece of paper showing you graduated regardless of the school's low ranking.


What signal does that show? Other than not being able to follow through with things (although dropping out of college can be due to a multitude of factors that has nothing to do with the candidate's work ethic or dedication)


What exactly do you mean? Perhaps if that student founds a startup or joins as an early employee immediately after dropping out. I doubt a student who just washes out or lacks work ethic gets a particularly valuable signal if they decide to apply to a startup some time after, although it's probably easy enough to present the departure in a more favorable light, in which case the dropout would regain some prestige.


Not a bad idea. Bright person gets accepted to Harvard, Yale, etc, but chooses a state college. Kinda like the Spock2 maneuver declining the Vulcan Science Academy. If you're young, and don't have a particularly long resume, sure why not include it?


This advice exactly was given to me by my HS guidance counselor. He told me to apply everywhere I think I might be accepted and then use the list on my resume.

Edit: Excepted to accepted. Read what your damn phone types for you...


accepted = past tense of accept (to receive or admit formally)

excepted = not included in the category or group specified


Some people do that, for exactly that reason, although it comes off sometimes as a bit cringey.


tbh I had more than one (young) applicant who mentioned the universities that "were their options" in job interviews


This EconTalk episode provides some nuance for your point:

http://www.econtalk.org/archives/2014/04/bryan_caplan_on.htm...

There are many reasons to believe that college is more about signaling than knowledge. While admission is an important part of the signal (certainly at the high-end), completing college is probably an important signal in the mid-range and below. Completing college is probably associated with work-ethic, the ability to follow rules successfully, etc.


I've not seen this before, I'll give it a look.

Also, I do admit, my point is made bluntly.


This is not entirely true. Institutes such as MIT also have much, much harder class work. One of the premises of being so selective is to make sure students are capable of getting through the grueling coursework that will be required of them. So MIT graduates are very well "stress tested" in addition to having been very selectively chosen.


I wouldn't say that the substance of e.g. a M.D. degree comes "second". Maybe you're focusing too much on IT?


There's no doubt the substance is actually more important, but the question is what folks end up caring about.

As an interesting anecdote, I found that when I'm around my Chinese friends or family, I'd hear the question "where did you go to school?", but when I had conversations with White or Black folks we'd talk about "what did you study?". I think that's a pretty interesting cultural nuance. For me, the second question is more wholesome.


My ex girlfriend was Chinese. She thought I was making a mistake by deciding to go somewhere other than Stanford for graduate school, that I was settling for something beneath me.

When I explained to her that as a professor, she or I would probably teach at places other than the school she or I studied at, she conceded that maybe she too would be willing to study at a place like Harvard.

It seems in China, there's only a couple of good schools. In America, the quality of the 100 or 200 more prominent schools is probably rather indistinguishable. I never even applied to the Ivy League schools in undergrad; I just applied to the state schools and went to the one I liked best.

Also, in China, people seem to go to great lengths to prove that they're not one of the peasants. In America, people don't seem to need to make that distinction as often, if ever.


In part it's that in China your network matters a lot in life. There's somewhat of a subdivision into friends, family and strangers, and university is viewed as THE place to broaden the first group, preferably with influential people. Tsinghua, Beida, Renda are the places where future party functionaries, intellectuals and businessmen go, which is why there's such an obsession with getting into these few universities. If the only thing students cared about was an education, any of the 101 重点大学s would do just fine.

Then there's this cultural obsession with investing into your offspring that stems back to the Imperial Exams. Hell, the words that are used for the student rankings still harken back to those practices.

Further pressure is put on the children of academics, where admission to one of the top 10 Chinese universities or an Ivy League university is used as a measurement of parenting prowess.

I've experienced the insanity from a bystander perspective, and it's not without cause. I would also like to differ on the point that Chinese people go to great lengths to prove that they're not one of the peasants. They may go to great lengths to show that they're wealthy, or, for the old cultural families, to show heritage, but there are plenty of things very well-to-do Chinese will do that wouldn't really pass for class abroad.


If there were two medical schools, one a state-funded community medical school and the other a big name like Johns Hopkins, which school would you want your doctor to have attended? Even if you were assured that they both teach the same material, you would want the Coca-Cola over the Pepsi.


Yes, of course. My intuition would be the same. But that doesn't mean that the only point of a university is to guarantee exclusiveness/selection and accreditation.


I know what you mean but the soda analogy doesn't really make sense to me. How is Coca-Cola more "prestigious" or something than Pepsi?


I use it to describe the "name brand" that everyone wants although the two are essentially the same.


Is Coca Cola really more popular though? I never got that impression.


Yeah, he could have at least used RC Cola as an example. Coca Cola vs. Pepsi is more comparable to Harvard vs. Stanford -- I'm sure some people have strong opinions favoring one or the other, but they occupy the same stratum, in Silicon Valley at least.


>>It wouldn't matter if MIT only taught basket weaving as long as they still had such a thorough selection process

The whole point of the Ivy League has been reduced to being a hub for assembling smart/rich people in one place. This has all sorts of unintended consequences, like, for example, an alumni network that exists solely for career benefits.

Though great work still comes out of these universities, they are not exactly making people smarter or helping ordinary people directly in any way.

They just exist to gather rich/smart people in one place and see if they can go further.


Paul Graham said something similar in his essay How to Start a Startup[0], though he's referring to how driven these students are to succeed, and being surrounded by like-minded people.

> It's no coincidence that startups start around universities, because that's where smart people meet. It's not what people learn in classes at MIT and Stanford that has made technology companies spring up around them. They could sing campfire songs in the classes so long as admissions worked the same.

[0] http://www.paulgraham.com/start.html


Are you familiar with accreditation? ABET is the one that comes to mind. Systems whose whole purpose is ensuring students really do know what they are expected to have learned.


Nobody ever got fired hiring a college graduate.


Totally agree. He's basically saying: we have high IQ students. Hire them.


The current system was created for a reason: because for most people, it works.

I ran a language learning group for a couple of years and we tried many different styles of learning.

With no structure, most people flounder. Unless you already learned the material and just need it for review, it really doesn't work well.

The best was complete structure: tests, lesson plans, and homework. Most people are passive. They want a teacher to lead them and guide them through the learning process.

No structure will work with people that have the same discipline required to start a company, because they don't need someone telling them what to do.


> Most people are passive. They want a teacher to lead them and guide them through the learning process.

Bull. Most people have any interest in learning beaten out of them by our crappy, factory/prison-like educational system. Because you know what's not conducive to learning? Having no choice of subjects; no choice of how long you spend on them; no choice of who to learn from; no choice of who you learn with.

PG has a classic essay on the maker's schedule vs. the manager's schedule, and he places emphasis on how, for work like programming, getting into the material takes time. I see no reason this wouldn't apply to learning. How much work would most developers get done if they had their day broken into 50-minute segments, five-minute transition times, and six different projects they were required to work on every day with equal consistency?

My mother runs a Montessori preschool. Consistently, the kids that go off to first grade come back with stories of attending traditional school and wondering when they get to start doing something instead of sitting at their desk. Or when they can do something new instead of going through material they already know with the class. This is not to say Montessori is a perfect system - but it was actually designed with the education of children in mind, rather than efficiency from a bureaucratic perspective.

It bothers me greatly when someone blames the nature of people for problems that are systemic in nature - it feels no different from the ridiculous number of ADHD diagnoses we have these days, which seems to largely stem from not giving kids space to be kids.


>>Most people have any interest in learning beaten out of them by our crappy factory/prison like educational system. Because you know what's not conducive to learning? Having no choice as far as subjects; having no choice how long you spend on them; having no choice on who to learn from; having no choice who you learn with.

Yes, the interest gets beaten out of many children by bad schooling, but nowadays excessive use of TV/internet/Facebook/WhatsApp/sports/entertainment is equally or more responsible for that. That's where some kind of structure and discipline is required.

About choice - it would be great if you got some of those choices, but life/Nature/society are not so kind. One can try homeschooling, though. Homeschooling gives you as parents a great deal of freedom to offer your kids, as students, a great deal of freedom in this respect. But beware of unschooling, as it is not very helpful from an education point of view: some amount of structure and discipline is a must to learn any non-trivial technical subject to a significant level of detail.


I disagree. I think most people need structure. It may be from personal issues or life issues, but most people need that guidance.


Structure and motivation are needed to learn about subjects you are not passionate about. For those things that make your heart race faster, structure is a hindrance.

However, to have real mastery of a subject (say, Computer Science), will require you to suffer through parts of the curriculum that don't seem interesting.


I think people need the structure. We need the guidance. We don't know what we don't know.

I grumbled when I had to take a class that had nothing do with my major. But it changed the way I think about things. It was a great class.


"Bull. Most people have any interest in learning beaten out of them by our crappy factory/prison like educational system. Because you know what's not conducive to learning? Having no choice as far as subjects; having no choice how long you spend on them; having no choice on who to learn from; having no choice who you learn with."

Most people don't really want to choose and if given too many choices, will be overwhelmed and not get anything accomplished.

New learners of language, for example, have no idea where to start, which teacher to choose, or which subject they need to learn.

I work for myself and I've been working on my current project for over a year. There were so many times I wanted to give up and so many boring parts, but I slogged through it all, because I know the end result will be a great accomplishment.

I've met so many developers that get bored when they have to work on something that has the slightest hint of monotony. I feel like that's one of my greatest assets and I

We should encourage discipline when learning, and the method you suggested with 50-minute segments, etc., does the exact opposite.

Experienced teachers know exactly what someone needs to learn to get to the point where they can learn for themselves.

This is what k-12 is all about. Once you get into college, you most definitely have the choice.

There may be students that get to this point faster, which I think is what AP classes are all about.

"How much work would most developers get done if they had their day broken into 50 minute segments, five minute transition times, and six different projects they were required to work on every day with equal consistency?"

I have no idea about other developers, but for me, not much. This sounds like a terrible way to develop any project.

"PG has a classic essay on makers vs managers on scheduling, and he places emphasis on how for work like programming, getting into the material takes time. I see no reason this wouldn't apply to learning."

The mistake here is assuming everyone has the mind of an engineer/software developer.

"My mother runs a Montessori preschool. Consistently, the kids that go off to first grade come back with stories of attending traditional school and wondering when they get to start doing something instead of sitting at their desk. Or when they can do something new instead of going through material they already know with the class"

Children shouldn't be making these decisions. When I was in school, I hated math. If given the choice, I would have played video games all day. Looking back, I'm glad I was forced to sit and learn all of those math skills that I pretty much use every day. Part of learning is memorization, which can't be skipped. Another part is having the discipline to do something you don't really enjoy, but need to do it to achieve a goal.

"It bothers me greatly when someone blames the nature of people for problems that are systemic in nature"

Systems are created around human behavior, not vice versa. My wife learned in an environment like you envision: lots of choice and very little structured learning. Since no teacher told her what to study, she now has huge gaps in knowledge that she really has no time to make up as an adult. I would never want to put a child through that, as it deprives them of their future success.

"it feels no different from the ridiculous number of ADHD diagnoses we have these days, which seems to largely stem from not giving kids space to be kids."

ADHD is over-diagnosed, but removing the structure from our current education system has nothing to do with it.

I also think that we need to teach kids to put down the ipad and phones when they are learning. Having a thousand distractions doesn't help.


The trouble with learning things like math is it requires effort, work, and persistence. By default, the vast majority of people simply will not learn it (including me).

But without knowing math, you'll never get very far in science and engineering. You'll never get a chance at the fun stuff.


> With no structure, most people flounder.

I find the opposite -- it is impossible for me to get any kind of complex learning done while I am on a structured course. It's telling that I learned more in three weeks of being ill and choosing to passively read books on mathematics (while coughing my guts up, of course) than I did actively participating in a course. (The only reason I am participating in the course is because I need a bit of paper that says I can do foo. A few people working in foo field have said I am at their level; it's just mindless, inane hoop-jumping.)

> Most people are passive. They want a teacher to lead them and guide them through the learning process.

Exactly, but IMO that is the fault of the schooling system itself. They've never had to learn anything themselves, because everything they know has been spoon-fed to them by their teachers. If we had a better (Differently structured) education system, then this would not be necessary.

Although I do not see much hope for this happening, both Russell and Whitehead made attempts to improve the learning systems in the UK, neither prevailed.

Apologies if this came across as a rant.


So they give their experiences with many, many students that they've taught over the years, and you say it's not true because you're different?


I have also had experience informally teaching (read: with a bias towards "mentoring") programming to people, along with the second-hand experience of knowing numerous teachers, including my father. On top of this, I was unschooled from the age of eleven, and I have seen the effects of this upon children -- most notably a child (eleven years of age) teaching themselves to read at an adult level in roughly six months to a year, simply because they were interested in a subject.

My example was anecdotal; however, when combined with my (also anecdotal) experiences concerning the education system, the second-hand experiences I noted, and the writings of Russell, Whitehead, Lockhart, and numerous others who have expounded on the problems with overly structured teaching and similar methods, it provides me with enough evidence to convince me until equal evidence is provided to the contrary.

That is why I say it is not true -- or more accurately, that in my experience, it is not true. I accept that a structured environment is a requirement for some people to learn, but I dispute that it is a hard requirement, and I also dispute that it is natural for the general population to require structure to learn.


My anecdotal experience brought me to similar conclusions. Structure is required in schools because of the way children are taught in schools. And I'm not sure I'd do any different given the same requirements. Even though it doesn't have the best results for every child it scales well enough that most people at least do okay.

Take your average child away from that after several years of being part of it and any short term observations are going to make you think structure is critical. Before they do anything else they're going to have to learn that they don't need structure to learn and after that they're going to have to learn how to teach and push themselves.


> My anecdotal experience brought me to similar conclusions. Structure is required in schools because of the way children are taught in schools.

First off, that's a circular argument. But also note that research into education is riddled with issues (such as the researchers having moral issues with blind control groups, which results in basically useless data). Our education system is based on the gut instinct of certain administrators. People who clearly would prefer a structured system where one-size-fits-all. Not that I'm saying everyone learns differently (learning styles are also mostly bullshit), I'm saying that the current system could be much better and people who do well in the current system are very motivated to do well and will gladly push with all of their force through the bullshit so they can get on with their lives.


It's circular because it's a circular pattern that reinforces itself. If schools had a large amount of resources or didn't need to scale efficiently they wouldn't need the structure and if they didn't need the structure children (and staff) wouldn't become dependent on it.

That said there certainly are people that need structure to learn (particularly what they don't want to learn). There are several disorders or disabilities that would make unstructured learning incredibly difficult or impossible.


The example used is flawed to begin with, in my opinion. Some kind of structure is almost required to learn a language in a reasonable amount of time, assuming there is no immediate access to full immersion 24/7.


I too learn more deeply outside of the confines of a structured class. However, I find the structure of an OCW course nearly ideal for many subjects.

For me it seems to be at least partially a function of my ability in an area. The stronger I am in a subject, the less structure I need.

I suspect the optimal amount of structure varies from person to person and within a person from subject to subject. People learn differently.


>most people are passive

They are not, naturally. But a society designed by people like you turned them passive.

Imagine people starting their education in a society like this university envisions: everyone would be excited, because they would be doing what they like.

My problem with people like you is this: you are scaling your small experience up to one of the fundamental things about humankind.

Maybe the passiveness in your language learning group came from elsewhere? Maybe people were forced to be there? Because of a raise? Because of earning respect? Or because of something deep in the unconscious part of their minds. But I am 100% sure that if they truly wanted to be there, they would have been excited.

Things are simple if you compare them with sex: almost everybody (healthy) is excited about sex. Education should be like this: free, and whenever, whatever you like.

Every person is/will be excited if he/she truly becomes free to do what they love.


>>Imagine people start education from start in society like this university, everyone will be excited, because they are going to do what they like.

Anybody and their grandmom may be excited to do what they like; it takes education to learn to do something even if you don't necessarily like it. What he/she seems to be saying by "structuredness" may have to do with this. Nature forces you to hunt for food; otherwise I'd be happy to just keep on pondering and doing philosophy because I like it.


What happens when you find out that what most people think they "like" is to hang up with friends, drink beer, play games and have as much sex as you possibly can?

What happens when said group of people finds out there are such horrible things as bankruptcy, debt collection, or evictions?

What happens when they figure out that the things everybody expects them to do to pull themselves out of the mess is complex stuff that cannot be learned on the fly, and that requires years of practice before results can be obtained by investing a reasonable amount of effort?

It is good and proper to enjoy ourselves in life, but sometimes you need to force yourself to do stuff that either you must do right now in order to avoid extremely unpleasant outcomes, or that you should do in order to position yourself for more good things to come your way in the future.


And how do you pay for it? How do you turn that into being able to support yourself and your family?


I agree with the necessity of structure at a younger age. The current public education system in America has strict schedules and classes geared towards getting the lowest common denominator a high school diploma. However, in a college setting, which is supposed to prepare you for the real world, it's more of a sink-or-swim scenario that separates those who are disciplined and willing to work from those who don't know how to thrive without someone telling them what to do at all times. I'm really interested to see how this new university works out.


I think you're mistaking socialization for some broader natural truth. My partner was unschooled, and her peers are on average doing much better than mine, having attended the best charter/magnet high school in my state.


"Most people are passive" - that's probably true, and why we have lessons in all subjects for the majority.


This kind of hippiesque, free-spirit approach sounds fine for MIT-level students, but I have my doubts that this model can be effective beyond those highly driven individuals.

Also, while I agree that you don't need a college degree to be a successful programmer or business person, most other science professions require some sort of structured learning environment and equipment to effectively teach students.

Maybe I'm dumb and this professor is onto something. She has a degree in material sciences, so perhaps she knows first-hand that all the knowledge you need to learn advanced chemistry can be gained by someone with a garage and aspirations.


> I have my doubts that this model can be effective beyond those highly driven individuals.

Cf. Punished by Rewards.


I like the idea of project based learning. I think it will end up being a more lasting method of instruction.

I am interested in how they will still teach the breadth of the subject matter.

For example, you can get a lot of work done with linear algebra without really understanding vector spaces.

I assume this method of instruction would require more teacher-student interaction to be effective. The teacher could review the student progress on a project and make a note of where the application of a new technique would be helpful. Guiding the student towards a more complete understanding of a subject.
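The linear algebra point above is easy to make concrete: you can solve a small system of equations purely mechanically, with no mention of vector spaces at all. A minimal sketch in Python (the `solve2x2` helper name is made up for illustration):

```python
def solve2x2(a, b, c, d, e, f):
    """Solve the system  a*x + b*y = e,  c*x + d*y = f
    by Cramer's rule -- pure mechanics, no vector-space theory needed."""
    det = a * d - b * c          # determinant of the coefficient matrix
    if det == 0:
        raise ValueError("singular system: no unique solution")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# 2x + y = 5  and  x + 3y = 10  ->  x = 1, y = 3
x, y = solve2x2(2, 1, 1, 3, 5, 10)
```

A student could use recipes like this productively for years; understanding *why* the determinant test works (linear independence, i.e. vector-space structure) is exactly the deeper layer a teacher would guide them toward.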


Yes you are right. This sort of teaching is very effective, but very expensive. The reason we have the current system we do is because it is the cheapest solution that somewhat works.


Except our current solution has become very expensive. Perhaps we need to move out of our local maximum of education if people are willing to pay so much.


You can basically do this now if you really want. There are heaps of adjunct lecturers who will provide one-on-one or small group tutorials for cash. Wave a few thousand dollars a term in front of some starving adjunct and you will get exactly the education you desire.


Maybe I'm wrong - and I kind of hope I am, as there's a noble motive behind this movement - but I always feel that people are focusing too much on IT and the stories of a guy who became a full-blown IT professional learning by himself, Google and open courseware.

I think this kind of approach to learning works for IT, because a) the logistics to learn it are easier to handle; but mostly because b) most IT jobs actually don't require a degree - or "degree-deserving" knowledge; most programming is "trade-school"-like knowledge.


So all of those people with psychology and business degrees are going into jobs that require "degree deserving knowledge"? If you're going to say something special about a particular field, at least compare it to something else otherwise the statement is meaningless.


I didn't mean to talk about the field in general. Computer science is, beyond doubt, "degree deserving knowledge". I just meant that most IT jobs don't require such knowledge and, in my opinion, would benefit from a more flexible approach to teaching. Something more trade school-like.


We don't need a "new" university system. We need to update our expectation about higher education and the work force. Let's start but dropping our mantra that everyone should go to college.

Here's another: Maybe not everyone should even go to high school.


But everyone should go to high school. Hell, if you aren't going to college, I do hope there is an extended high school. I don't care if it's taxpayer-sponsored, but even high school is not enough to foster a critical thinking process.

I picked that up somewhere between high school and end of college. Some people may be faster than me but that skill is really important.

Either that, or you don't get to vote.

Edit: The last statement was more about the ability to think critically and less about if you finished high school/college.


In my opinion, the problem with high school is due to the large amount of wasted time. So many classes had dozens of days if not the large majority of the time spent on nothing or busy work. Busy work is a large part of bureaucratic organizations where most of the cogs will end up, but it shouldn't be the goal.

I think many high school courses could be pushed down into earlier education and a great deal of knowledge a university provides could be offered there.


I think some schools are adopting new models, but when I graduated from high school <10 years ago, all but a handful of classes were two semesters long (i.e., autumn and spring would be the same class) and all classes met 5 days a week for approximately an hour, with 7 classes a day. With 180 days of school, with most classes meeting every day of that (save exam time), that is over a thousand hours spent per class.


>Either that, or you don't get to vote.

What? Graduating from school (highschool or university) is about learning how to follow directions and play the game. To say that a degree somehow guarantees that a person knows how to think critically is ludicrous.

But you're not even making such a modest claim, you're actually claiming that it is the only way to assure people can think critically (even though it's not even a valid way to begin with). Some of us can think critically as children. Do you honestly believe just because you couldn't think critically until you were an adult, that means that hundreds of millions of people that you've never met should be forced into the same fruitless, and arbitrary indoctrination you had to undergo to become adequate in your own eyes?

Are you aware of how absolutely stupid school seems to people who are really, really smart? http://www.lambdassociates.org/blog/bipolar.htm


I have a place for you that your might really like, https://www.reddit.com/r/topmindsofreddit

Go there and be happy.

The important thing in high school and college wasn't what I learned in the classroom, although some of it does help me in life. The important thing was what I learned in that environment with my peers.

I don't know how bad your college was, but I did go to a pretty good college in my country. It might not compare to Ivy League colleges, but it taught me important life lessons nonetheless. My opinions are biased by my college experience, but it goes to show that college can be great and not completely useless, as a lot of people like to claim.


You've addressed none of the points that I made. You claimed that school is the only way to learn how to think critically and should be used to take away the suffrage of millions of people. In response to me calling you out on this very dangerous and ignorant opinion, you've chosen to derail by linking to reddit, talk about "life lessons" you learned as a teenager, and brag about the college you went to. Nice.


Well, if you are an american, the college I went to is nothing to brag about. But we do with what we have.

> Graduating from school (highschool or university) is about learning how to follow directions and play the game.

Really? Is that what college is about, or is that what some colleges have become? The progression from middle school to high school to college is less about learning what's on the board and more about being able to stand on your own among your peers. It gradually moves you from a smaller echo box to a larger and larger one, until the time you start working and living independently, where the echo box becomes your workplace and the city.

> But you're not even making such a modest claim, you're actually claiming that it is the

I never said that was the only way, but it has been one of the better ways historically. If you know of a better way to teach a good portion of the population critical thinking abilities, I am all for it. I am not supporting going to college and being $200K in debt when you graduate while still not knowing what you want to do in life. But I am opposed to the fact that the r/hn hivemind thinks that all college is a waste of time.

> Are you aware of how absolutely stupid school seems to peop...

Well good for them, maybe they can broaden their horizons a bit earlier, but what kind of smart people are you talking about? 3 standard deviation from the mean? So the 0.1%? Or 2 sigma? So the 2.3%?


I can't say I've ever been convinced that those who rant against school are "really, really smart". Mostly they just sound like people who are insecure in their choices.


I'm not talking about people who "rant against school". I also hope that you realize that my post was not a "rant against school", but a rant against taking away millions of people's right to vote just because they didn't learn how to shut up and do what they're told. The people I'm referring to would likely not rant about school, because they likely wouldn't have graduated or even attended in the first place. When I see people ranting it is usually people who have taken on debt via student loans.


I know that I'm slightly bitter. I played the game and got the job but what an ultimate waste of time. The best thing about college for me was networking.

I remember many classmates in high school that hated it and considered it a chore. They didn't play the game.


The only constitutional voting test is, are you a citizen of voting age, who is not a felon?

I would rather see that test applied as a prerequisite to run for public office.


I didn't finish high school nor did I go to college or university. I'm now a systems engineer. Should I not be able to vote?


Did you develop an ability to think critically? Can you rationalize your position? Congratulations, you can do well on your own and you are not a liability. But you are also in the minority; I envy the intelligence and will to learn you were born with, but not everyone has that.

Most discussion about college not being important is about the fact that we don't use anything we learned in our majors after college, but your major is hardly the important thing to learn in college.


Thanks for the comment. I understand what you were trying to say; however, the last part about not being able to vote rubbed me the wrong way. I've been through the pain of trying to find a job with no "education" as the system currently stands. Making things more difficult doesn't help. Companies don't know how to judge critical thinking; that's why they rely on degrees. It's a cultural problem. I got lucky because someone was willing to take a gamble on me and it worked out for both of us. Not everyone with my "intelligence and will to learn" gets as lucky as me, no matter how capable they are.

Tech seems to be a strange exception to the rule. It's easier to get a job being self educated in tech than anywhere else it seems.


Why do you think the only environment people can learn these skills is sitting at a desk for extended periods of time?


I don't know what you did at your college, but I wasn't sitting at my desk for extended periods of time.


Well, in most high schools and universities, children and adults sit for hours at a time behind desks while someone talks.

Did you stand for your lectures? Did you not have lectures? Is 6 hours a day not an extended period of time?


Honestly, 6 hours a day in high school is nothing. For most of us, that is the best use of the time we have in a day when we are in high school. And it does provide the required base that a majority of the population needs in college. Are you saying we should stop the whole system for the benefit of the 0.1% (3 sigma)?

In college, lectures ran from a minimum of 3 hours to a maximum of 5 hours a day, depending on how many credits you took that semester. But the actual learning happened outside of those hours, and once again lectures were important only to create the base, maybe even only to provide an introduction to the base.


Critical thinking is a must, and we need some STEM knowledge too, because you cannot do much meaningful critical thinking without it, especially in today's world. So high school is definitely a must, and college if you can.


It's exciting to see some thought and movement at this level.

The existing system is broken when it comes to actual student knowledge. A lot of systems are at fault, but as the modern university has bowed to its corporate overlords in filling the pipeline with modern corporate key punchers, college has essentially become tech school for PowerPoint and Excel in a lot of cases.

So many of our current college students would be so much better served by tech school.

Heck, so much of this community often ends up eschewing college at large to go build something - often because the value proposition of the university model right now simply doesn't make sense.

We're always going to need people who can think - maybe the answer is a bifurcated path, where a lot of our university experience returns to a true liberal arts education.

This experience-based model wouldn't have been out of place historically. That it appears so revolutionary to us today shows how broken the current model is.

Something has to change. I'm excited to see some movement around the borders.


The "42" school seems very promising; it's a private French computer programming school created and funded by Xavier Niel.

To apply, there is no degree or diploma requirement, and it's completely free.

The training focuses on project-based learning via a peer-to-peer pedagogy and allows students to set their own pace for learning.

IMO the 42 school is not research-oriented, but rather emphasizes practical training suited to the needs of startups.

www.42.fr


Sounds quite a bit like a more radical version of Olin College of Engineering not too far from MIT.

From what I've seen, that sort of education is more accepted by industry than academia, so it's a good solution for me. Having an expansive portfolio to show is a very powerful thing.


It would be interesting to compose your own 'major' program out of smaller-granularity subject modules. So e.g. if one were interested in majoring in robotics, you could assemble a study program of something like basic CS, basic MechE, basic EE, then stack some intermediate-level subject modules on top... controls, visual processing, etc...


Interesting, but maybe not all that useful. The purpose of a major is to give others an implicit idea of what you studied. If I say I have a major in Robotics, then everyone can infer what that consisted of, even if it varies across universities (in which case that knowledge usually disseminates). What happens when I say I majored in robotics, but I failed to actually include anything on the hardware side of it and only really studied AI and control algorithms?


A company hiring someone out of this new kind of university would look at the projects the student worked on as well as their subject modules. Large projects can get fuzzy as to what the specific contributions were, and the subject modules then give you an idea of that candidate's background skill spread. So if the company were looking for someone in robotics for more of the software side, it would be apparent from the studied modules whether they might be a good fit.

But more than that, it would give universities a model for continuing to create a range of advanced or specialized subject modules. This would allow students or employees to come back and pick up new subject modules in a way more integrated with how the whole university operates.


Unless it becomes the norm, what's more likely is that they'd just ignore people from that university unless they had a good reason not to.


I wonder how she plans to teach them mathematics. I've yet to see anyone learn math on the job. The difference between a technician and an engineer is the latter knows the math.


Something that concerns me, unless I missed it, is that there is no mention of the liberal arts. What is described here is closer to my view of what a university should be. (I'm in year 14 as a professor.) I think it would be a huge mistake, though, to not have a liberal arts core.


Sounds like a hacker house. Why call it a university if it doesn't grant degrees?


Honestly it probably makes it more acceptable to the "everyone needs to go to college" crowd. Trade schools have been around for a while, but for some reason people think you need a "university degree" to get anywhere. I assume they are trying to bridge that mental gap by using similar language to that of traditional colleges / universities.

Also "trade schools" and / or specialty schools rarely have the same social aspects of a traditional school, which (even if you personally don't value that) is a big selling point for traditional schools. Maybe they are trying to capture some of that and mix it with the specialization of a trade school.

All speculation of course, but seems reasonable I think.


Honestly, I wish there were major respected trade schools that taught CS/Software Engineering/IT/IS. The university environment, in my opinion, is adverse to the development of good CS knowledge.

Most of us on here, I'd say, have learned through trying and experimentation and that is why we are fairly good at what we proclaim to do.

University doesn't foster that kind of understanding.

I also hate that currently I need to take 5 physics classes, an accounting class, a social sciences class, and 3 "study" classes to get my major.


"Good CS knowledge" isn't 'learning git'. It's mathematics, theory of computation, recursion theory and type theory and PL, compilers, complexity theory and algorithms, advanced data structures, concurrency and distributed systems etc. If you know these, you should be able to pick up more applied subjects on the job quickly.

You very much need a university to teach you this.

Social sciences will make you a more well-rounded person able to see past just technical concerns.

Physics and social sciences will help you understand the world around you.

And it sounds like you want university to help you write business logic for accounting software, so an accounting class will do you well.

If all you want to do is write simple code for, say, CRUD applications, then you're right: university isn't the place to learn that, and a trade school would be better. In tech you already have the best trade school in the universe, the internet, and the best resume in the universe, a GitHub profile.

We learned a long time ago that letting students study exactly what they want when they are 18 gives you lopsided, uneducated graduates who know a tiny bit about the things they were already good at; being well-rounded is EXTREMELY valuable, even more valuable than being an excellent programmer, even in tech. The hard parts of software in practice are not writing the code; they are understanding what the user/client wants and how to translate that from vague, fuzzy human communication into something specific and technical.


> theory of computation, recursion theory and type theory and PL, compilers, complexity theory and algorithms, advanced data structures, concurrency and distributed systems etc

All of those are fairly basic topics that most people can pick up in an afternoon or two. These are all things that anyone who has been programming, reading, and interacting in the field of computer science understands. I could understand someone requesting that knowledge be provided to them in a college environment.

What I object to is universities charging for the "well-rounded"-ness that you speak of. Can you please define for me what it means to be well rounded?

From what I have seen, no one can define what it means to be well rounded.

What do you have to do to be well rounded? No, no one can tell me that either.

Somehow, having gone through a university, you are ascribed this trait of being in the "know". You are automatically transformed into a scholar, a person who is now magically "well rounded".

Don't get me wrong, I think that there needs to be a structured curriculum, but only of the things you need to understand to make it in your respective specialization.

For me, I'd love to get into building compilers, operating systems, digital signal processing, and general software development.

For me, I'd say this is the perfect regimen:

  * Full understanding of the computer booting process, understanding of modern processor features.

    - Write a self-hosting operating system, bootloader, and learn to write a BIOS for an example computer

  * Software maintenance and revision control

    - Following the current industry best-standard practices

  * High-level math with a focus on signal processing

  * Familiarization with the X top programming languages at the time of graduation

    - This should include standard formatting for each language, understanding implementation details, and full familiarization with the X top used libraries and the standard library

  * Understand logic and problem solving

You may notice that I have annotated some of the things I feel I should learn, and others I have not. If I have annotated it, it is because I have learned to do it on my own time, I want to expand that knowledge, and I know it will help me professionally. The things I have not annotated are things that I have a basic understanding of, but I need someone to teach me, and I am not getting that at a university.

I don't know, maybe I'm wrong about this. I think these are fairly rational claims that I am making. Maybe I'm completely wrong, and being able to quote Shakespeare will help me when I need to read the documentation for int 0x13 to load a file on a 30-year-old IBM PC while fixing a kernel-level bug at some company, or when I need to write software to process data coming from sensors at 100GHz.

Edit: Updated formatting.


> theory of computation, recursion theory and type theory and PL, compilers, complexity theory and algorithms, advanced data structures, concurrency and distributed systems etc

> All of those are fairly basic topics that most people can pick up in an afternoon or two.

...

> I think these are fairly rational claims that I am making

Um... no. Not fairly rational claims at all.


When I get into my classes at my university, I get the textbook, read into the book a little, and say "ok, I remember learning about this from X, Y, and Z projects I've worked on".

I then go on to goof off in every class, as I am doing right this moment.

Nothing I have learned is of substance in my algorithms, computer science, or programming classes that I did not already know before I enrolled. I'm bored, and I want to learn something new.

I'm holding out for some 300- and 400-level courses, but from what I hear, the course I am looking forward to on operating systems is nothing more than "This is how you open a file in C", and the data structures classes I look forward to are basically "this is how you define a tree and how you traverse a tree".


Been there, done that. I remember sitting in a database class and being excited that we were covering a topic I didn't know - normalization. Then I realized that I had already learned it through experience at work; I just didn't have a name for it. I just thought I was taking a crappy schema and turning it into something I'd like to use. I knew that there are times denormalization can be good in the right places, but I wouldn't have known what to call it.
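For anyone who hasn't run into it, here's a toy sketch of what that schema cleanup amounts to (Python with made-up data, not any particular database):

```python
# Hypothetical denormalized "orders" table: the customer's email is
# repeated on every row, so updating it means touching every order.
orders = [
    {"order_id": 1, "customer": "Ada", "email": "ada@example.com", "item": "keyboard"},
    {"order_id": 2, "customer": "Ada", "email": "ada@example.com", "item": "mouse"},
    {"order_id": 3, "customer": "Bob", "email": "bob@example.com", "item": "monitor"},
]

# Normalize: split customer data into its own table keyed by name;
# orders keep only a reference to that key.
customers = {}
normalized_orders = []
for row in orders:
    customers[row["customer"]] = {"email": row["email"]}
    normalized_orders.append({
        "order_id": row["order_id"],
        "customer": row["customer"],
        "item": row["item"],
    })

# Ada's email now lives in exactly one place; updating it is one change.
customers["Ada"]["email"] = "ada@newdomain.com"
```

Same idea as moving a schema toward third normal form: each fact is stored once, and the duplication (and the update anomalies that come with it) disappears.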

I learned stuff in school, just not often, and it wasn't via HN - I didn't have that option back then. I had a few great classes - some with only a dozen people and great professors. A shout out to the other 12 people who might remember - "Horses are pretty".


I've had those experiences too. I'd say that the teacher I am TAing with, who was my teacher for the course, is actually one of the best educators I've ever had the pleasure of meeting.

I also have met a few teachers in other classes that have been very inspiring.

I just feel that there is something lacking. I want something a little more.


Understanding the material from the lower level courses in whatever university you attend in an afternoon is not even remotely comparable to picking up any of those topics fully in an afternoon. Are you under the impression that after you've been through your 200-level algorithms course you've now exhausted the topic and have nothing left to learn?

If you haven't even taken 300 and 400 level courses, how would you know whether "the university environment... is adverse to the development of good CS knowledge"? If that's the case, you should probably drop out of school and save some money. Just start submitting some research papers now! Your lack of a PhD won't matter once you've shown everyone how fully you know each topic.


If you read my reply to "brodawg" I agree, I say "I don't mean you can understand the full depth of each topic in an afternoon or two, I mean at a working level you could get an understanding of the topic".

I don't mean to imply that someone should be teaching courses after an afternoon of flipping through a textbook.

That being said, do you need to have a PhD to be a good computer scientist? Most of the skills needed are basic analytical abilities. Sure, someone might not know all the terms that we use and might even think some of the ways we talk about problems are flat out wrong, but it is those individuals who truly leave a mark.

I'd argue that the best in our field have just been random people who have stumbled in and said "This looks cool, let me see what I can do with this".

Here is a list of people who, in my opinion, fit this category.

* Donald Knuth - Do I even need to describe him? (His PhD is in mathematics, not CS)

* Brian Wilson Kernighan - The Practice of Programming, The C Programming language, The Unix Programming Environment (All books), and first Hello World

* John Carmack - Or even him (Dropped out after the 2nd year)

* Douglas Crockford - JSON

* Carl Sassenrath - REBOL

* Bram Cohen - BitTorrent

* Michael Widenius - MySQL AB and MariaDB

* Larry Wall - Perl

I'd love to see you sarcastically mock these people for not having the correct pedigree to even approach understanding our field.

I know for certain that since he dropped out of college and does not have a degree in CS John Carmack has no valuable input to give when discussing optimization or algorithms.

I don't know honestly. All I can say is that someone can definitely succeed as a computer scientist if they do not have a degree in computer science, and I don't think publishing papers has anything to do with professing your knowledge (mainly my gripe is with how corrupt and obtuse that entire system is). What I do know is that it is extremely unlikely that I will get a job if I don't get a piece of paper from a place that promises it will teach me something.


Well, maybe if you listened in class you would realize the benefit of what they're trying to teach you.

If you really think you can understand complexity theory, algorithms, and advanced data structures in "an afternoon or two", I think about 90% of it went right over your head.


I don't mean you can understand the full depth of each topic in an afternoon or two, I mean at a working level you could get an understanding of the topic.


I still disagree entirely. You can understand one or two basic algorithms in an afternoon, at best.
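For a sense of scale: something like binary search is about what fits in an afternoon (a minimal Python sketch, not tied to anything in this thread):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent.

    Repeatedly halves the search interval, so it runs in O(log n) time.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1  # target, if present, is in the upper half
        else:
            hi = mid - 1  # target, if present, is in the lower half
    return -1
```

Even this small an algorithm famously took the field years to get right in print (off-by-one and overflow bugs), which is part of the case that "an afternoon" covers the idea, not mastery.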


Sounds like you need to transfer to a better university.


I think you're missing the point here.

Those physics classes will teach you to think about problems abstractly. In my experience, my knowledge of physics has helped me think about algorithmic problems significantly. Even if you don't use the physics itself, it's good to learn.

The accounting might not help you so much for CS, but it could be useful. And a few liberal arts classes are always good for breadth of education, and making you realize that you can always pick up a good book and read it, and that not everything you do has to be so math and science oriented.

Also, for the record, what you would learn at a trade school would be "programming". What they teach at university is "computer science". There is a pretty serious distinction, and I don't really believe you can learn computer science properly without a formal education (even if that comes from self-studying formal books - but teaching yourself javascript is not computer science).


>Those physics classes will teach you to think about problems abstractly

So do classes on computer science and algorithms. The reason universities choose to teach those things using physics classes instead of a logic class or an algorithms class is just a historical accident.

>And a few liberal arts classes are always good for breadth of education, and making you realize that you can always pick up a good book and read it

You can get that for almost free at a bookstore or a library.


Yes, the traditional courses do teach you that as well - to some extent - but the point is it's important to see how different fields do it. I think my classes in algorithms are as useful to my research in physics as my classes in physics are to my research in algorithms.


I disagree. One of the most enjoyable classes I took was Weather and Climatology. Completely irrelevant to my current work and I paid a significant sum of money to take that required science credit.


You're disagreeing with the guy below me, right? It sounds like you're agreeing with me.


I guess that depends on what you call "good CS knowledge". It's true in my experience that universities are not particularly good at teaching the actual engineering part of software engineering. But I think that is because the major is called "Computer Science", not Software Engineering. In my opinion, universities' main focus should be educating potential scientists, not churning out workers for industry. That implies a heavy focus on theory, with the practical parts being mostly self-taught when necessary.


Standardized tests can get you out of most "breadth requirements". That's how I picked up minors in both chemistry and cognitive science.

Had I known then what I know now, I would have skipped even more of the fluff and spent my last two years splitting my time between classes and internships instead of learning additional facts and skills that ultimately had no marketable value.

I can count on zero hands the number of times I have profited by taking those university classes not directly relevant to abstract mathematics, computer software, or electronics hardware.

The supposed value of "well rounded graduates and alumni" seems very much like self-serving propaganda now. How about instead of spending three hours a week writing literary analyses of famous books, the university sponsors and organizes three mandatory hours of weekly networking events with real workers from relevant industries? That would have been worth the tuition. To this day, I still can't understand why anyone would want to read _Walden Two_ instead of doing something more useful, like separating and categorizing all the marshmallows from a box of Lucky Charms, or removing potential airway blockages in the nasal passages with the fingers.


The accounting and other non-concentration classes are meant to give you a broader base of knowledge and better prepare you for adult life and a career. Understanding the basic principles of accounting is pretty important, if less fun.

While this is not realistic, if people were better educated about finance and understood that there really is no free lunch, the 2008 financial crisis wouldn't have been as bad. Unfortunately, human nature is to say that this time it's different...


Five physics classes? Really? So... what subjects are those?

Because you should definitely be taking at least a little physics and quite a lot of math.


I can opt out of the physics, but I'd have to take something like chemistry or biology. I'm taking the physics route because I feel that will benefit me specifically, based on the careers I'd like.

I just rechecked; I think I need to take biophysics, astronomy, and nuclear physics, with one bridge class between the three. Why? Because.

I'd love to take the nuclear physics because that sounds fun, but biophysics will make me claw my eyes out.

Also, at my university I for some reason need to take 3 random 300-level classes and three electives, as well as 3 PE classes and 8 social science classes. I can't for the life of me justify these in my situation. To be honest, I'd trade the time I would waste on this for CS classes branching into different fields. Maybe it's just me, and I know exactly what I would like to do with my career, but this seems useless to me.


I'd pass on the nuclear physics; biophysics is where it's at these days. (Now if you meant particle physics you might have a point.) But nuclear physics itself is kind of considered a field whose heyday is past.


Or I can take thermodynamics and hate my life. I have a few options; I just don't like many of them.


Spoilers: if you learn statistical thermodynamics, machine learning is easy.


Absolutely true. I basically slept through a grad level machine learning class and picked up an A.


This isn't even a trade school. Trade schools definitely have classes and lectures.


Because "universities" are primarily about research -- which is why that's the main criterion in hiring faculty -- and secondarily about credentialing students by granting degrees. This new experiment is jettisoning the secondary function, leaving project completion -- including, e.g., research publications -- as, apparently, the main concrete résumé-applicable output.


I love projects, I hate school. Lots of details to work out (and questions of scale), but still sounds very promising and exciting. Not sure I understand the "no majors" bit though.


A one year sabbatical to start a new university, where her long-term commitment depends on its progress after just one year.

Doesn't inspire much trust in prospective students...


Sounds like a cool idea, and I wholeheartedly support attempts to distribute education so that all can benefit and contribute to the sum of human knowledge.

However, one of the primary reasons for higher education (for better or for worse), is to get a job. Currently, most places are going to ask for your degree, or experience. What's someone coming out of this program going to have? And how long until most employers accept it?


I personally think that's a highly interesting project. Is there any way to get regular updates on the progress?


Sounds similar to Hampshire College in western Massachusetts.

https://en.wikipedia.org/wiki/Hampshire_College

The College is widely known for its alternative curriculum, socially liberal politics, focus on portfolios rather than distribution requirements, and reliance on narrative evaluations instead of grades and GPAs. In some fields, it is among the top undergraduate institutions in percentage of graduates who enroll in graduate school. Fifty-six percent of its alumni have at least one graduate degree and it is ranked 30th among all US colleges in the percentage of its graduates who go on to attain a doctorate degree (notably first among history doctorates).


Personally I love this idea. If I could send my high school junior there, I would in a heartbeat. I learn by doing. The pattern for me is do, learn, do, learn, repeat until complete. Never cared for the learn, learn, learn, forget 90%, now do.


"I’m looking at a new model, where the whole sort of vocabularity is different"

Impressive! Creating a new word in the same sentence you declare you'll be using new words. Assuming I understand her cromulent new word.


I have had the idea lately that liberal arts universities should be replaced with a year or two of critical thinking, logic, and emotional intelligence training.

These are the skills necessary to enter the "modern" economy outside of STEM and trades, where the most valuable skills are thinking and emotional intelligence (relationship development and management).

Link up with progressive employers who recognize the value of this training and are willing to take on young graduates and unleash them on their own internal domain knowledge for a couple of years.


This sounds overly dreamy. At the very least, their new "university" should bother to award degrees when specific breadth-and-depth levels of knowledge are adequately demonstrated.


Do you have an email address I could use to get ahold of you. I have a question about a post you made a few years back.


I do think my HN profile lists my email address, but if it doesn't, it's "$FIRSTNAME$LASTNAME@gmail.com".


...So long as she wasn't the one that killed SICP.


Everything about this sounds like how a university used to work. I for one will back this idea. Nothing will stop a person from getting a 'real world' job after completing the program. But it is clear that the program is not just specialist training; it is geared toward moving up the abstraction and generalization layers.


Founding a new university can't be cheap. Who's going to pay for this? Does she have funding lined up?


Now, project-based learning is fine, but CS, for example, has lots of wonderful theory; where does that fit in?


That probably depends on which parts of CS theory you're thinking of... the way my university taught CS theory for topics like data structures, algorithms, state machines, applied machine learning, etc. was fairly interwoven with programming projects.


How much of that theory do you have to understand in order to be a competent professional programmer? Some, sure. The more the better, sure. But much of it can be learned independent from much of the rest of it - you don't need one mammoth 4-year curriculum to "master" CS.


I learn a lot faster from the people who teach it well at university, and from focusing on the subject. But yes, if your goal is to become a competent programmer, it is not strictly necessary.


I think degrees and accreditation are fine, but the classroom should be inverted. Universities should be a place for labs, socializing, and individual tutoring and exploration. Not lecturing. Unless it is just to give a talk that will later be put online.


They are talking about this, right? http://4pt0.org/programs/startup-weekend-education/

EDIT: No, probably not. Different Christine Ortiz...


The only point of a modern day university is socializing


So... Y Combinator?


These days you can learn from online courses, books, and forums if you're a self-driven learner. However, the best universities do still provide a place for like-minded people to get to know each other, which is important and hard to replace.


... "no grades, no teachers, no students, no books, no learning!"


"You are a product of your environment." --Clement Stone



