When theory meets practice

Rawr REGISTERED, Tester Posts: 503 Seed
Going through the usual motions of university/college study. Assignments, tests, exams, HOLIDAYS! Assignments, tests, exams, HOLIDAYS! Repeat, repeat, repeat.

I'd regard myself as one of the more engaged students within lectures and tutorials. Regardless, there are times of absolutely mind-numbingly boring material. During such times I'll most probably get stuck on the question of theory vs practice.

If I phrase the question as "Do businesses see value in the material I'm learning?", then unfortunately, I believe the answer is no. This is for one of two reasons: either it's taken as a given that I understand it, or it has no relevance to the sphere in which the business operates.

As a student amassing a sizable debt, I need to be able to rationalize the situation. This is becoming harder and harder for me to do. For example, when someone asks me a tech question ("Should I do this?", "Can you fix this?", "Why does this happen?"), the knowledge and experience that I draw on is not what I gained from studying.
That is a problem. That is a huge red flag.

If nothing else, I'm looking for some affirmation from within the industry. Affirmation that the things I'm learning are useful/valuable AND not just things I should know. The fundamental theorem of calculus is something a mathematician knows, but if nobody needed to know the total area below a function, then was it worth 12 weeks of study and 3-4 thousand dollars? Google could have told you that in 0.27 seconds for free.
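To make the area example concrete, here's what the theorem is actually buying you: it replaces a brute-force area sum with a two-endpoint evaluation. A minimal Python sketch (the function and its antiderivative are just illustrative choices for this post):

```python
# The "total area below a function" two ways: a brute-force Riemann sum
# versus the fundamental theorem of calculus, which reduces the whole
# computation to evaluating an antiderivative at the two endpoints.

def riemann_area(f, a, b, n=100_000):
    """Approximate the area under f on [a, b] with n thin rectangles."""
    width = (b - a) / n
    return sum(f(a + i * width) * width for i in range(n))

f = lambda x: x ** 2        # illustrative function
F = lambda x: x ** 3 / 3    # an antiderivative of f

numeric = riemann_area(f, 0.0, 1.0)
exact = F(1.0) - F(0.0)     # fundamental theorem: area = F(b) - F(a)

print(numeric, exact)       # both approximately 1/3
```

Both routes give the same answer; the point of the theorem is that the second one is instant.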

I swear, if someone had handed me a reference sheet of all the ideas taught throughout an undergraduate Computer Science degree: 1) I wouldn't have gone to a university to study it. 2) I would've learnt it all by now. 3) I would've saved everyone time and money.

This has been my small rant about how the goal posts are moving and I'm losing confidence in the traditional approach to learning/education.

Insanity is doing the same thing over and over again and expecting different results.
Programmer, designer, artist.


  • Zake REGISTERED Posts: 216 Seed
    I haven't been in industry that much but I have had a peek into the academics (I'm a CS minor). It's my impression that for CS, the important part of the formal education vs self-teaching is that if you have a good teacher following a decent curriculum, then it can be assumed you know a certain set of skills. If you take a course on C programming, it's assumed you know everything about programming fundamentals, pointers, etc., which you can learn yourself, but also, they teach you potential pitfalls and ways to be a better programmer. So if you learn from StackOverflow, you might write code that technically works but could be buggy, inefficient, or prone to exploits. At least with a degree, they know that you've at least heard the important things.

    As far as calculus, etc.: I've found that we're taught far more in school than we'll ever possibly need, because for everything they teach you, there's a chance that you might need it, and in such a case you'll need to have the tools.

    As for the time commitment and Googling: without knowing the basics, you don't know what to Google. And the more experience you have, the better programmer you'll be. Because if you've been practicing for 4 years, then most times you won't have to open Google, but if you don't have experience and you're trying to learn from reference sheets, then you'll spend 3 minutes on Google for every 1 minute programming. This last thing I know, because I'm still a rookie at programming, and almost every practical application hasn't yet been covered in my study. So I actually do spend lots of time on Google when I would be programming TUG or Minecraft or whatever.
    I think more than I say and say more than I do, but I do more than I used to and plan to continue.
    TUG modder (at least in the ounce of free time that occasionally flits by)
  • Rawr REGISTERED, Tester Posts: 503 Seed
    And then there's reality. I'd like to bring up a recent example of mine. The first year of CS study was all about coming to grips with Python and learning a couple of different interesting aspects (admittedly ones that I wouldn't have easily discovered by myself). The part that has gone terribly wrong, however, is that this year the lecturers have said "You guys know Python, so by extension you know Java".
    We're being dragged through a curriculum change that hasn't been addressed across all the different levels.
    No. Just no. I would call us slightly above novice in Python, and even that, in my view, would be pushing it for some of the students. In short: I was just getting to know parts of the standard library and no longer having to think about my syntax, and then I was thrown into another language that forced me to set things up differently, without even knowing enough to print strings nicely.

    I mean, I agree with what you're saying. Good teacher + good curriculum = good outcomes. It looks great to academics, and in some cases it actually is a true equation. I'm of the opinion, however, that learning is a bit messier than that nicely balanced equation. Flexibility within the realm of teaching/learning is what I'm after.

    To draw out one of my earlier points: being given pointers, i.e. "have a look at this", "find out what's going on here", would be all the structure I need for my learning. Note that I haven't said no to teachers/mentors, but this kind of strategy could be implemented for a lot less (in both time and money) than sitting in a classroom for a couple of hours each week. I mean, does learning only take place within the classroom?
    Programmer, designer, artist.
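On the point above about self-taught code that "technically works" but could be buggy: a classic concrete case in Python (the language this thread keeps coming back to) is the mutable default argument. A minimal sketch, not taken from any particular course:

```python
# A classic "works until it doesn't" pitfall: a mutable default argument
# is created once, at function definition time, and then shared across
# every call to the function.

def append_item_buggy(item, items=[]):      # BUG: one list shared by all calls
    items.append(item)
    return items

def append_item_fixed(item, items=None):    # idiomatic fix: default to None
    if items is None:
        items = []
    items.append(item)
    return items

print(append_item_buggy("a"))   # ['a']
print(append_item_buggy("b"))   # ['a', 'b']  <- state leaked between calls
print(append_item_fixed("a"))   # ['a']
print(append_item_fixed("b"))   # ['b']
```

The buggy version passes a quick manual test on the first call and only misbehaves later, which is exactly the kind of pitfall a decent curriculum flags early and a StackOverflow snippet might not.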