Dave Bush is a .NET programmer and Certified ScrumMaster who is passionate about managing risk as it relates to developing software. When he is not writing or speaking about topics related to Application Lifecycle Risk Management (ALRM), he is an example to his peers as he develops web sites in the ASP.NET environment using industry best practices. Specific topics Dave can address include:

• Project management, with an emphasis on Scrum
• Test Driven Development (TDD)
• Behavior Driven Development (BDD)
• Unit testing and integration testing using NUnit, Jasmine, and SpecFlow
• Web application testing using Selenium
• Continuous integration
• Extreme programming (XP)
• Coding best practices
• Architecture
• Code reviews

Dave has "an insatiable curiosity and is always learning." He has been called "the miracle worker" and "hard to replace" by clients he has worked for recently. Contact Dave via LinkedIn (http://www.linkedin.com/in/davembush/) to find out more about how he can help your organization reduce software development risk.

Do programmers even NEED a degree?

09.12.2013

Well, this post yesterday really got things going in the blog-o-sphere.  Shoot!  Even Joel Spolsky got involved.  I saw another post, but I’m sorry to say, I can’t find it right now.

The original article put forth the idea that we are teaching the right stuff in our undergraduate computer science degree programs.

Joel’s article suggested that what we really need is a Bachelor of Fine Arts program for programmers, because programming and computer science are really two entirely different disciplines.

I think Joel’s on to something, and his solution is probably the closest to what our current educational system can handle.  But I have another solution to the problem, one that goes further.

Why get a degree at all?  Most of you reading this post know as well as I do that a good 80 percent of what programmers know, they didn’t learn in college.  Let’s face reality here.  First, any degree program is, at best, 20 years behind.  It’s just a fact, and ours is not the only industry facing this reality.  Second, the really good programmers are already doing what these schools are trying to teach.

Joel suggests that we have people from industry come in and teach this BFA coursework he’s proposing.  There are several major problems with this.  First, most (not all) of the really good programmers can’t teach.  Some could with some training, but the schools aren’t going to train them.  And those who can teach probably don’t know the recent material.  There ARE exceptions.  My point here is that the exceptions won’t fill the need.

But what would happen if we went back to a really old way of doing things?  It worked before.  We are almost doing some form of this already, mostly after graduation.  What if we just skipped the programming degree completely?

I’ve been in this field long enough (20+ years) to know that most of the really good programmers got into it through some back door.  I was a camp and recreation major who converted to programming via DePaul’s career change program, a hoop I jumped through simply so I could get that first job.  At least 80 percent of what I was taught there I was already doing, having experimented with BASIC, Pascal, and C.

I know another guy who dropped out of school his sophomore year because he already had the job he was going to school to get.

And don’t even get me started on certifications.  Let’s say we stop testing knowledge and start testing aptitude.  The aptitude test I took nearly knocked me out of the DePaul program.  At the end of the program, the main professor told me I was the best natural talent he’d seen come through it.  I wonder if he knows he almost never saw me?

Let’s face it, what we as programmers need to know to do our jobs effectively can’t be taught.  And most of what can be taught could be taught on the job.

So, here’s my recommendation: why don’t we go back to using the apprentice system?  It would allow a good senior programmer to get a feel for what kind of programmer the new guy is going to be.  It would help the new programmer find out very quickly whether this is what he really wants to do with his life.  And the senior programmer can give the new programmer the formal training he needs.

Unless you’re just starting out, you know that most of what you learn, you learn from experience.  Why not just admit that and stop trying to fit programming into a degree program that largely doesn’t work for our industry?




Published at DZone with permission of Dave Bush, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Andreas Schilling replied on Thu, 2013/09/12 - 1:27am

For the technology and programming part you're right: usually there's no need for a degree. Still, having one (and especially the process of getting one) is good for several reasons. Many things that seem like "raw programming" at first are based on loads of theory that it's better to have at least heard of (e.g. big-O notation). But maybe more important: on the way to your degree you learn how to tackle problems, how to gather information, how to write documents, how to present things, how (awful) politics work, and so on. I think this is really worth something.

I see more of a problem in the fact that higher education is rated as some kind of "job experience". It isn't. Not at all. Though I probably believed that myself when I dropped out of university.

Kevin Sapper replied on Thu, 2013/09/12 - 2:47am in response to: Andreas Schilling

I agree with you. For learning programming you don't need to go to university, but for the computer science part it's totally worth it. When I decided to go to university, my initial thought was just to get the degree because it would get me a much higher salary. Looking back now, I realize how wrong I was. I would probably never have come into contact with hardcore embedded-systems C programming and lots of other really interesting stuff.

Brett Child replied on Thu, 2013/09/12 - 7:32am

I asked the recruiter who hired me a couple of years ago whether my degrees made any difference in being hired.  He said they didn't; it was my experience.  However, I am positive that what I learned at the university opened the door for me to get that experience.

I should probably note that I didn't feel my undergraduate degree was 100% applicable to the 'real world', but it was in liberal arts, so that's probably not too surprising.  On the other hand, my master's in MIS was in a subject I was really interested in, so I felt it was valuable, and I still have the student loans to prove it.



David Lee replied on Thu, 2013/09/12 - 4:18pm

A degree matters.  It's not necessary, but you're better off for it.  I meet too many developers who don't have a fundamental understanding of very basic things like resource management, basic OOP, database normalization, etc.  Sure, you can learn these things without a degree, and people with degrees often don't know them.  But if you attend a university with a decent CS program and you're motivated, you can learn these things early and correctly rather than on the job.

You often hear developers say certifications don't matter either.  Well, if you actually studied for one and passed, you probably learned some things you didn't know before and are better off for it.  Some certs are also actually worth getting.

Don't be the guy with no credentials if you can avoid it.  Because when all things are equal the credentialed candidate will likely get the interview over the non-credentialed candidate.  


Charles Doty replied on Wed, 2013/09/18 - 9:18am in response to: David Lee

I think that's a misconception about a degree: it teaches you an agreed-upon lingo. The Navy taught me an agreed-upon lingo too. Referring to left as port or a wall as a bulkhead didn't demonstrate a deeper understanding; it just confused people 'on the outside'.

If you can't talk about big-O notation (for example), you don't understand the fundamentals. But does n log(n^2) really mean anything to you? How much faster is that than n log(n)? What about after a cache miss? In my case, if you asked me about big-O notation, I would stare at you blankly, but if you asked me to pick the fastest container for random access, I would, of course, pick std::vector. I understand, deep down in the code, that the data is guaranteed to be contiguous in memory, and that I can access each element with an address increment (++address), rather than a multiply based on the index or a seek through a singly or doubly linked list. I also understand that speed isn't always the most important factor, and that execution speed typically isn't the biggest bottleneck. I pick the container that best fits the solution, or std::vector if there's no reason to choose anything else.
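(As it happens, n log(n^2) is just 2 * n log(n), only a constant factor apart, which underlines the point: the notation hides constants and says nothing about cache behavior. Here is a minimal C++ sketch of the contiguity argument above; it is illustrative only, and the containers and values are arbitrary:

    #include <iostream>
    #include <iterator>
    #include <list>
    #include <vector>

    int main() {
        std::vector<int> vec = {10, 20, 30, 40, 50};
        std::list<int>   lst = {10, 20, 30, 40, 50};

        // Vector elements are contiguous, so random access is O(1):
        // vec[3] is just the base address plus 3 * sizeof(int).
        std::cout << vec[3] << '\n';

        // A list has no operator[]; reaching the fourth element means
        // following node pointers one hop at a time -- O(n), and each
        // hop may land on a different cache line.
        auto it = lst.begin();
        std::advance(it, 3);
        std::cout << *it << '\n';

        // Sequential traversal of a vector can compile down to a plain
        // address increment (++p), with no per-element multiply or seek.
        for (const int* p = vec.data(); p != vec.data() + vec.size(); ++p) {
            std::cout << *p << ' ';
        }
        std::cout << '\n';
        return 0;
    })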

I've also read through the software design patterns only to realize how many of them I've 'used' to solve a problem. Of course, I didn't run around referring to my solution by the same name, or know about them when I started working on a solution.

Granted, there are things I might have a better understanding of if I had gone to college; but the reverse is also true.

I'm reminded of the statement that was made to me after finishing my final qualification board in the naval nuclear power program... "The purpose of this training wasn't to learn a specific nuclear power plant, it was to learn how to learn a nuclear power plant."
