Simple-Talk columnist

Learn Many Languages

Published 20 June 2013 9:07 am

Around twenty-five years ago, I was trying to solve the problem of recruiting suitable developers for a large business. I visited the local University (it was a Technical College then). My mission was to remind them that we were a large, local employer of technical people and to suggest that, as they were in the business of educating young people for a career in IT, we should work together. I anticipated a harmonious chat where we could suggest to them the idea of mentioning our name to some of their graduates. It didn’t go well.

The academic staff displayed a degree of revulsion towards the whole topic of IT in the world of commerce that surprised me; tweed met charcoal-grey, trainers met black shoes. However, their antipathy to commerce was something we could have worked around, since few of their graduates were destined for a career as university lecturers.

They asked me what sort of language skills we needed. I tried ducking the invidious task of naming computer languages, since I wanted recruits who were quick to adapt and learn, with a broad understanding of IT, including development methodologies, technologies, and data. However, they pressed the point and I ended up saying that we needed good working knowledge of C and BASIC, though FORTRAN and COBOL were, at the time, still useful. There was a ghastly silence. It was as if I’d recommended the beliefs and practices of the Bogomils of Bulgaria to a gathering of Cardinals. They stared at me severely, like owls, until the head of department broke the silence, informing me in clipped tones that they taught only Modula 2.

Now, I wouldn’t blame you if at this point you hurriedly had to look up ‘Modula 2’ on Wikipedia. Based largely on Pascal, it was a specialist language for embedded systems, but I’ve never come across it in a commercial business application. Nevertheless, it was an excellent teaching language, since it taught modules, scope control, multiprogramming and the advantages of encapsulating a set of related subprograms and data structures. As long as the course also taught how to transfer these skills to other, more useful languages, it was not necessarily a problem. I said as much, but they gleefully retorted that the biggest local employer, a defense contractor specializing in radar and military technology, used nothing but Modula 2. “Why teach any other programming language when they will be using Modula 2 for all their working lives?” said a complacent lecturer. On hearing this, I made my excuses and left. There could be no meeting of minds. They were providing training in a specific computer language, not an education in IT.
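Those concepts do transfer. As a rough illustration (a sketch of mine, not anything from that course), the habit Modula 2 instilled of bundling related data structures and subprograms behind a narrow interface looks much the same in a modern language such as Python:

```python
# A Modula 2 "module" kept its data structures private and exported only
# chosen procedures. A Python class encapsulates the same way: here, a
# bounded stack whose storage is hidden behind push/pop/peek.

class BoundedStack:
    """Related data structure and subprograms behind one interface."""

    def __init__(self, capacity):
        self._items = []          # private by convention; not "exported"
        self._capacity = capacity

    def push(self, value):
        if len(self._items) >= self._capacity:
            raise OverflowError("stack is full")
        self._items.append(value)

    def pop(self):
        if not self._items:
            raise IndexError("stack is empty")
        return self._items.pop()

    def peek(self):
        return self._items[-1]

s = BoundedStack(capacity=2)
s.push(1)
s.push(2)
print(s.pop())  # -> 2
```

The point of the exercise, then as now, is that callers depend only on the exported operations, so the internal representation can change without breaking them.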

Twenty years later, I once more worked nearby and regularly passed the long-deserted ‘brownfield’ site of the erstwhile largest local employer; the end of the Cold War had led to lean times for defense contractors. A digger was about to clear the rubble of the long-demolished factory, along with the accompanying growth of buddleia and thistles, in order to lay the infrastructure for ‘affordable housing’. Modula 2 was a distant memory. Either those employees had short working lives or they’d retrained in other languages.

The University, by contrast, was thriving, but I wondered if their erstwhile graduates had ever cursed the narrow specialization of their training in IT, as they struggled with the unexpected variety of their subsequent careers.

25 Responses to “Learn Many Languages”

  1. Robert Young says:

    – I tried ducking the invidious task of naming computer languages, since I wanted recruits who were quick to adapt and learn, with a broad understanding of IT, including development methodologies, technologies, and data.

    In many years of doing this, I’ve *never* been interviewed by anyone who didn’t insist on a particular language (even down to a particular compiler), OS, database, framework, and the like. Talk of generalized ratiocination had no place in such discussions. Perhaps I should aim at some higher ranking, such as programming manager?

    Nor have I seen postings requiring: “smart critical thinker; computer language and platform not important.”

    Oddly, or not, I nearly tossed out a special section of the NY Times (about which I’m considering scribbling) this morning. Which section included an interview with a Googler (Laszlo Bock), who had this to say:
    “On the hiring side, we found that brainteasers are a complete waste of time. How many golf balls can you fit into an airplane? How many gas stations in Manhattan? A complete waste of time. They don’t predict anything. They serve primarily to make the interviewer feel smart.”

    No mention in the interview that Google interrogates for language background (although I expect they do).

    Phil == Laszlo?????????

  2. Phil Factor says:

    Maybe Google now just want code-jockeys rather than software engineers, but I suspect it is more likely that they asked the wrong questions. Selecting the right questions is a difficult task because they have to be validated. You have to find the sort of programmer that you’d like more of in your company, find out what they have in common, and try out your fancy questions on them. If they just look baffled, or give you the wrong answer, it is unlikely that your set of questions will do any good. We had done the exercise of studying our best people in IT and identified their common identifying characteristics. It had nothing to do with their current skills. The biggest problem with IT recruitment is that the people doing the recruitment selection process aren’t IT people; they are HR people and professional recruitment agents. They slide into the mindset of recruiting purely for skills.

  3. Robert Young says:

    – the sort of programmer that you’d like more of in your company
    – they are HR people and professional recruitment agents

    I agree, from experience, on both points.

    LinkedIn, and such, are touted as ways around the second point: engage the hirer before HR. I’ve no evidence that it works, or not. The HR folks on this side of The Pond have for some years used keyword matching alone, never bothering with actual skills. Which they don’t know anything about, anyway. Bureaucratic baleen.

    On the first point, that also begs the question: is one just really looking for the right School Tie? Smart, engaged folks tend to be prickly, but bureaucracies tend to weed such people out for not being “team players”. While the days of Edison and Einstein are past us (although both did have help, as it happens), team playing inevitably leads to lemming cliffing. (Jimmy Carter, if you’re old enough, pointed out that the fish rots from the head.)

    There’s a report in today’s news that US home prices are going up by leaps and bounds. More lemmings are being led to the cliff. OTOH, the stories I’ve seen have quoted some in the industry as warning that the arithmetic doesn’t work. Not team players, those.

  4. Timothy A Wiseman says:

    Mr. Young, you do have a point that most jobs, especially those that are above entry level, look for specific skills rather than asking for a broad base.

    With that said, I still consider the broad base with theoretical knowledge more useful over the career of a working programmer, just as Phil says. An aggressive programmer will seek out and learn new useful languages without needing to be prompted, but it is that broad theoretical knowledge that lets them do that.

    Also, there are some employers out there that are willing to train in the particular language they use if the programmer can display the right aptitudes and proper general background. They do seem to be a minority but they certainly exist and they tend to be some of the better employers in the long term.

  5. Andrew Watson says:

    At school the only programming language we used was Logo (little turtle moving around the screen)… if you got into it you could teach most programming fundamentals…. unless you count Rocky’s Boots, which was lots of fun, but just computer circuitry. At home I used BASIC on my MS-DOS PC.

    At university I used (in no particular order) C, C++, Pascal, Lisp, Modula 2, Turing (Mac), x86 machine code (punched in on a numeric keypad into specific memory addresses, with an 8-segment LED display for output), and Watfor77 (a FORTRAN variant).

    In my first job I used Visual Basic (4 to 6) and SQL (somehow missed databases at uni)…

    The next job added Java, JavaScript, Flash ActionScript, and XSLT (not a ‘real’ language?).

    Now I program mostly in C#, T-SQL and JavaScript.

    Most of these are of the C heritage, but it goes to show that: yes, over the course of your career you will need to adapt with the technology; yes, university learnings don’t always match what you will use in the workplace; and yes, being exposed to more than one language is a good thing.

    One day I might even try Python, D, or F#.

  6. DrNetwork says:

    I recently took the US State Department Foreign Service Officer Test (FSOT). Interestingly, it asked a wide array of questions across a broad swath of subject matter. Everything from government inner workings to geography to sociology to technical questions to statistics to literature. You also had a pretty thorough English-usage exam and a 30-minute essay. Now, the flip side is that the first part was multiple-guess and more like Trivial Pursuit than critical thinking. But it was an interesting first gate to enter the process. Then again, I suppose the 30-minute essay balanced out the process, in that you can’t be a futz and still write a cogent argument on a subject in 30 minutes.

  7. DrNetwork says:

    Goodness gracious it is early in the morning. Article usage as well. Hope I did better on the English usage portion than my original posting would lead one to believe.

  8. Alan G. Stewart says:

    Just like spoken languages, computer languages have a built-in point of view or bias – a particular way of looking at how to solve problems.

    The more languages you have used, the more different ways you have of looking at problems – and the better you are at solving them.

    I stopped counting the number of languages I have worked in since I first learned BASIC in 1974. After the first 30 or so, it’s largely a question of looking up the syntax in the one I’m using today.

    However, most recruiters just don’t get it. I recently had a short phone interview with Electronic Arts (their job posting didn’t even mention what languages they wanted and specifically said that it wasn’t important).

    Wow, somebody who understands that writing great software isn’t about what language you know. WRONG – the interview ended with “Well, I’ll pass on your resume to the team, but we’re really looking for Java.” Haven’t heard a thing since.

  9. Bill says:

    When speaking with an HR person prior to the technical interview, they only know what’s on the job spec. If it says “3 yrs of C# (or Java or APL)” then that’s what they’re going to ask for.

    When my company posts for a new hire, we try to emphasize the types of skills we’re looking for rather than a specific language. We know that if you have the skills, you can pick up any of the languages. If HR insists on specifying a language, we’ll pick similar languages to get a broader sampling (if we need C#, we ask for Java, VB.NET, C, C++, etc).

    I was fortunate in school to have a comparative languages class that covered about eight different languages. Different programs had to be written in each one so that we could get an idea of how each language approached a given task.

    I have to agree with Phil. To focus on a single language, and do so only because the local large company used it exclusively, was a disservice to their students. It seems rather short-sighted to me to assume that every graduate would work for that company and that they would never a) change languages or b) work anywhere else.

  10. DrTechnical says:

    There seems to be no limit on the number of language zealots out there who claim their chosen language is the sole solution to the world’s problems. The trouble with that notion is the same as with any dogmatic single-factor solution: (to paraphrase a wise observation) “When all you have is a hammer, everything looks like a nail.”

    I believe that learning new tools that can improve your productivity is essential for any developer, so it only stands to reason that new languages and language techniques will emerge that make better productivity possible. But those languages and techniques are not universally beneficial to all development scenarios and problems, so it is up to the developer to select the toolset that allows them to craft the best, most reliable solution in the shortest time.

    Obviously, if you know only a few languages, your choices are limited, and your decision to use only your current toolset will be a crucial factor in the success or failure of your effort.

    The time and resources available to a developer to upgrade their language knowledge and to master new skills is limited. So it may be a natural tendency to try to get by on what you already know, rather than invest time to learn a skill that may have a limited lifespan. But I believe that continual learning of new languages and techniques is both useful and enlightening, and I’m willing to invest the time to gain that new knowledge.

  11. Ross Presser says:

    Unlike spoken languages, where an American can easily go his whole life (except for secondary-school requirements) without needing more than a few words of any other language at all, fluency in multiple languages is a must in a long IT career. I don’t think there is any single language that has lasted 30-40 years at the top of popularity, except perhaps COBOL, and again perhaps BASIC if you count all the variants as a single language. But even with Visual Basic, new language features are introduced regularly. LINQ made it almost a whole new language.

    It’s impossible to stand still on the language front. Just as no auto mechanic could stand still, using the same manual tools for 40 years.

  12. paschott says:

    Reminds me of my days at college. We started with MacPascal for our intro to programming. The MechE/EE majors learned FORTRAN. It wasn’t until my last year or so that they started offering courses in C++. I’ll admit that being exposed to C, C++, Pascal, FORTRAN, and Lisp all helped in my time there, but the idea that someone might write an actual app to run on Windows was a foreign concept. Databases were never mentioned. When I left, I picked up some basics of other languages, but concentrated on T-SQL and SQL Server-related tech.

    The idea that any local university would be so narrow-focused as to only offer courses in the language used by a large local employer doesn’t surprise me, but it does a disservice to their students. I’m glad I work for a company that specifies we want certain languages, but when we interview we’ll consider people who can quickly come up to speed on the language because they have a solid foundation of programming knowledge.

    Sadly, I’ve seen otherwise decent programmers have trouble when their guaranteed livelihood went away. They got stuck with RPG and DB2/400 systems in a world where a lot of people aren’t looking for those solutions. There will be a place for mainframe programming, but I can only imagine the difficulty in selling that solution around the time of the tech bust.

  13. noogrub says:

    Phil, a nice post that exemplifies a life-long attitude I’ve had toward programming. I too began my career learning bizarre “teaching” languages (Scheme, anyone?) but discovered quickly that there is a difference between engineers and bricklayers. I have worked with many programmers who are happy to be bricklayers – that is, they specialize in one language or another, one environment or another, and they move from one company to the next selling their highly-honed .NET / Java / Flex skills. I have no problem with them. It’s simply that I do not wish to be that kind of builder.

    Instead, moving from FORTRAN M77 into Pascal, I began to realize that I wanted to be more of an engineer. My efforts in C, and C++, and Java, and Perl and PHP and Flex and Ruby on Rails became more and more abstracted until I was able to quickly figure out the overall structure and business requirements, no matter what the environment. (OOP? MVC model? legacy COBOL copybooks interpreted via scripts to Oracle? Flex 4 front ends migrated to AngularJS? It’s all good!) Along with this, though, I found that interviewing for tech-specific positions is a waste of time. Instead, I interviewed for (and found quickly) a company that needs my skill set.

    So far as the goofy interview questions, they must just be there for entertainment. The last guy who asked me why manhole covers are round got a good laugh, and handshake, and was advised to hire someone else. I was definitely NOT his candidate. I urge other candidates to do the same. The PHP position I interviewed for that gave a pencil-and-paper code test got similar treatment.

    In my experience, I end up being the person helping the more specialized teams accomplish their goals by picking up the projects that require polyglot agility. It’s been a great career, and I have never had difficulty finding excellent, challenging positions. Right now I am working on a project that involves no less than 7 different technologies, not counting the database imports/exports between MS SQL and Oracle and the supporting technologies such as Spring, Hibernate, Roo, AdobeCQ, and so forth. I find it astoundingly satisfactory to develop agility between all these. The Java bricklayers on my team shake their heads and happily pass me those tasks. We have had solid, consistent success.

    In fact, many companies which are moving to in-house software development are doing so in response to the need to support a number of projects which were built by consulting firms over a period of several years – which often means a mélange of technologies built for rapid deployment, not long-term maintainability. So although I have seen many potential employers who thought they just needed a specific programmer (Java, for example, or Ruby on Rails), they end up needing staff who have taken a broader view of IT over a long period of time.

    When I encounter the individuals who espouse one religion or the other (Perl vs Python, Mac vs Windows, you name it) I just smile and remember that there are many kinds of bricklayers, but really only one kind of engineer. These truly are completely different roles.

    So far as the academics are concerned, they are in their own world. At my alma mater, the narrow-minded, dusty old geezers got pushed out when the computer science school was merged along with other departments into informatics. The free thinkers and quick adapters were the ones who survived that transition and moved on into exciting new areas of well-funded research. The Modula2-ers, Schemers and other lofty sorts took their retirements and harrumphed home.

    Some universities may indeed be doing a “disservice” to their students, but those of us who continue to pay attention, look around at what is happening, and develop good professional discernment have done quite well, whether we became fabulously talented in one technology, or whether we strive to become engineers who can solve problems apart from any particular flavor of code.

    Cheers from the US.

  14. Keith Rowley says:

    I have little patience for academic snobbery. People in the “real world” of commerce pay for the salaries of these academics who think they are above teaching practical skills. (Albeit indirectly I admit.) Hopefully the university eventually learned better than to snub the business community the way they did in your experience.

  15. Robert Young says:

    @Keith:
    Phil — I said as much, but they gleefully retorted that the biggest local employer, a defense contractor specializing in Radar and military technology, used nothing but Modula 2. “Why teach any other programming language when they will be using Modula 2 for all their working lives?”

    What it boils down to: the other company got there first, and so got the sort of “practical skills” which suited them. Phil just lost, nothing more. One might argue that the school should have segued to some other teaching language, and perhaps they did. As it is, they were MS/DOS languages, and not *nix, which was/is the core of university computing.

    Given the last decade or two, one might argue that sticking with Bill’s Company rather than teaching *nix is a larger sin.

  16. aneedham says:

    I went to a university that also taught Modula 2, BUT just for the first semester or two, for teaching the basics. I think that professor had come from the ‘military industrial complex’ world too. That industry has all but left our area now too.

    Most courses after that used C++ but with a course using C here, one teaching Windows/GUI programming with VB there, and even assembly (with some exercises in ML) in a course too. So we got a mix of languages and exposure to others, both OOP and procedural, and hopefully learned to transfer what we learned with our previous languages to the next one.

    And then my first job out of school for the next >10 years was with a consulting firm where we did:
    1) legacy support of systems: one language I had learned in school (Pascal) and one I had barely been exposed to (COBOL), so I learned/was taught it
    and
    2) custom programming in the language(s) our company was strongest in and recommended (which was another that I hadn’t used: so I learned/was taught it)
    and
    3) work for shops where the language of choice was already decided (and they wouldn’t/couldn’t change), and if we didn’t already know that language, we learned it to get the job done for the client

    During my stay there I was exposed to several new languages and strengthened ones I knew.
    If I had used the same language (especially one of those ‘legacy’ ones) for the whole time I was there, I wouldn’t have stayed >10 years (and/or probably would have a drinking problem by the time I did leave).

    Fast forward to my current job: using a language I was exposed to a little in school and a little at that first job but didn’t use much. Getting back up to speed in it was pretty easy, thanks to the many jumps between languages I’ve made over the years.

    I came to expect language changes over my career and didn’t expect to be a C++ programmer my whole life (or any other single language).

    And, as a student, I would have loved to have someone like Phil, or other employers in the area, just let us know they were there and had needs now or potential needs down the road.
    There were 3 main large employers in our area (one being in the ‘military industrial complex’) and if those 3 didn’t appeal to you, we mostly assumed you had to leave town to use an IT degree.
    Although I never worked for those 3 and am still in the area and know better now.
    Of course I also would have wanted to know what language(s) they used but I wouldn’t have put blinders on.

    Andy

  17. paschott says:

    Robert and Andy – good points, but if the schools aren’t teaching the fundamentals of programming, they’ll turn out students who are ill-prepared to work. MS programming isn’t necessarily bad, as long as it involves some .NET and UX work. I think a solid grounding in an object-oriented language, as well as an understanding of what makes a program work, is important too, though. If you get that and can be a little flexible, you can likely handle whatever language is required by the clients. Phil’s example showed a pretty short-sighted college specializing in one language to suit a large local employer. That might help some students, but if they didn’t offer any other skills, the school was doing a disservice to its students.

    For my school, I learned some basics with Mac Pascal, but would have been better served with the mainframe C++ they started using 2 years later. Even better, it was just ANSI C++ so you could get whatever compiler you wanted as long as it could compile standard C++ apps and it would compile/run when uploaded to the mainframe. I could build on that knowledge because it wasn’t just about the language, but why various ways of doing things worked.

  18. Doug Baker says:

    It appears from previous replies that one of the main points about academic education is whether or not you went to a good school. As Phil said, the “university” was then a “technical college”, which may have been a factor in how they taught.

    Having worked in IT/development for 15 years without the benefit of a formal computer science education, I’ve always felt I missed out on some very important learning in fundamental concepts. However, I often find myself in roles requiring critical analysis and the ability to assess any situation and apply or determine the best approach. In fact, I often credit my critical analysis skills to my philosophy courses more than to the relatively few CompSci courses I completed.

    I recently took a new job as a DBA in an organisation that hires the majority of its IT staff from the local university. While many of our developers and interns are extremely bright, adaptable individuals, many of them also bemoan the fact that their college professors taught them little that is of use in the real (i.e., business) world. On one hand I think they are right, but on the other hand the school is my alma mater, I know several of the faculty, and I know they are teaching a range of languages and concepts which, if one is open to the idea that any learning is beneficial, can provide a solid basis for comprehensive insight into the IT world. I’ve always been a fan of learning for the sake of learning and broadening horizons, and would happily continue my education in the academic environment.

  19. paschott says:

    @Doug, I can agree with that somewhat. I went to a pretty good school, but definitely got little in the way of “how does this relate to life outside school” education. I did learn a lot of good concepts and critical thinking. I took far fewer comp sci courses than I would have liked. However, while there we concentrated so heavily on the intangibles that it was hard to see how those concepts would apply in a real-world scenario. Sometimes we’d get there and sometimes we would have a whole course that just never seemed to link the two. I think that most schools could do a little better painting a picture of where some of the skills/lessons might apply once you leave the academic environment. I also think that using tools that really are found in the real world makes for a better education. Learning programming concepts using COBOL or FORTRAN isn’t necessarily bad, but if you get out and people are using Java, C++, or C# you have a bit of adapting to do. The deeper skills are great to have, but if you have trouble finding a job in which to use them it’s a bit moot. :) In our source story, the focus was too tight on the local big employer – great for people who would stick around that area at the time, but not so great if planning to leave that area.

  20. I find it sad / funny / interesting that the situation Phil describes could have just as easily been the university I attended, Illinois State University. Just down the street we had not just one, but two national insurance companies: State Farm and Country. State Farm is one of the largest employers in the area (http://en.wikipedia.org/wiki/Bloomington,_Illinois#Notable_companies — not sure how many are local vs other cities / states) and happened to be a large donor to the Applied Computer Science department at ISU. Hence, we did things the State Farm way: IBM Token Ring network, OS/2 (a few labs at least), COBOL and PL/I classes, etc. In fact, the first two required classes for all ACS majors and minors — fundamentals of programming — were based on PL/I: an entirely useless language outside of State Farm and a few other places, at least in the mid-1990s. We had to submit work via 3270 terminals that were scattered all across campus (and register for classes on them, etc.) using JCL (if you don’t know what JCL is, consider yourself blessed). I think they had a single C class, or maybe a second using C++. I took mainframe Assembler, and thankfully they gave us a PC-based simulator so I could do my homework at my apartment. We had one or two RS/6000s (IBM all the way!) that we used for email and eventually web pages (again, mid-90s) but no classes in Unix. A group of students requested some Unix classes and more classes dealing with C, C++, and other stuff we felt would be more indicative of what employers were looking for. Initially this was turned down because the administration felt that they were mostly preparing people for work at State Farm or maybe Country, or that the concepts we learned via PL/I and COBOL would suffice for anywhere we wanted to go outside of the local area.

    To be fair, the department was APPLIED Computer Science: they were training people to be good employees who programmed and not computer scientists who were great programmers. This is partially why we had a required class in the Communications department for working in small groups and another required ACS class was essentially putting together a group (about 6 people) project: learn very basic data structures and what an ERD is and how to use the horrible OS/2 software to create them, find a local company with no computer system and design a system to manage their business (in the end giving them the report for free), and present to the class at the end of the semester as a group. And fortunately both of those were really good skills to have.

    My last year there they had FINALLY come around and retooled the department. They were using C for the 101 and 102 classes and even giving more challenging assignments.

    All that said, I completely agree with the ideas expressed here, in both the main article and the comments, about the value of learning (and using!) multiple languages: more job opportunities AND seeing different ways of solving problems, as no one language can suffice for all situations.

    I also will 2nd the others who commented about:
    1) Brain teasers are useless in interviews. Joel Spolsky mentioned this as well in his “The Guerrilla Guide to Interviewing” ( http://www.joelonsoftware.com/articles/GuerrillaInterviewing3.html — towards the bottom, paragraph starting with “Finally”).

    2) My philosophy classes helped out enough to be seriously considered as required classes for all Computer majors and minors (and as a philosophy major, actually, I would argue that a couple of intro philosophy classes should be required for all students, because there is no downside to getting more practice at thinking). As programmers and/or administrators, we are problem solvers. And while computing is a science, and any theory can be tested and proven or disproven, when problems arise you need to be able to think quickly and abstractly about how things WILL work once you try them, because you can’t just tinker with Production until you get it right. Even more so as a programmer who is, alone or as part of a group, tasked with creating a new feature. And one of the skills you pick up taking more philosophy classes is communication! You have to get your ideas across verbally and in writing. How can we work in a group, or for clients, if we can’t get our ideas across, identify questions to ask and ask them in such a way as to get a meaningful answer, and write up our ideas or our findings on a bug and how to fix the issue? Just like learning multiple programming languages, philosophy classes expose you to other ways of thinking, and have the added benefit of building the soft skills of problem solving and communication.

    I would lastly like to offer a counter-point to the idea expressed in a few of the comments about places hiring for specific languages rather than overall skill set. Like many of you, I have been involved in the hiring side of things, giving interviews and not just looking for a job and taking interviews. When I am looking for a job, I feel like my ability to learn computer languages and problem-solve should be more highly regarded than my particular expertise in T-SQL as opposed to PL/SQL. But on the hiring side I understand the value of experience. Yes, a good company / interviewer will spot an exceptional programmer and be more than willing to pay for the training to come up to speed on a language the candidate is not yet fluent in. But since most of the people you interview are, by definition, not exceptional, experience is highly valuable. Yes, Java and C# are VERY similar, but not knowing the differences and nuances can have a very good Java programmer writing very bad code in C#. Likewise, someone very good at SQL Server will most likely make some very poor choices when designing tables / queries in PostgreSQL or even Oracle. Each language and system has its best practices, which can be things never to do in another language or system. Knowing what I know now, after working with SQL Server for 12 or so years, and how I came to learn all of these things not just by reading or going to presentations but by trial and error, I wouldn’t presume to become an Oracle expert in the first month of a job; it takes years. And when I interview people, if we have projects with tight deadlines then I certainly place a large weight on experience, and have no problem tossing out resumes with little to no SQL Server experience unless they show some top-notch accomplishments to indicate that the person will learn quickly and thoroughly.
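    The point that fluency in one language does not transfer automatically to a similar-looking one is easy to demonstrate. As an illustrative sketch (my example, not the commenter’s), Python’s mutable default arguments routinely surprise programmers arriving from Java or C#:

```python
# A programmer from Java or C# reasonably expects a default argument to be
# re-created on every call. In Python it is evaluated once, at definition
# time, so a mutable default is shared across calls.

def append_bad(value, bucket=[]):      # one shared list for every call!
    bucket.append(value)
    return bucket

def append_good(value, bucket=None):   # the idiomatic fix
    if bucket is None:
        bucket = []                    # fresh list on each call
    bucket.append(value)
    return bucket

print(append_bad(1))   # [1]
print(append_bad(2))   # [1, 2]  <- surprise: state leaked between calls
print(append_good(1))  # [1]
print(append_good(2))  # [1]     <- as a Java/C# programmer would expect
```

    The syntax is perfectly legal in both styles; only experience with the language’s evaluation rules tells you which one is a bug, which is exactly the nuance-knowledge being argued for above.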

  21. Celko says:

    I wish more people had a week or so of COBOL! It is still the way that most data is modeled and stored in the commercial world. With that knowledge, you can read and understand your data.

  22. grenac says:

    This is probably the number one reason that 50% or more of perfectly capable IT graduates walk away from the industry every year. I see it that 95% of the time it’s the recruiters’ lack of vision or understanding of what IT graduates have been equipped with that is the problem.
    As for Modula-2 (and -3), I may be biased, having learnt it 25 years ago. Anyone use TRY/CATCH/FINALLY error handling, abstract datatypes (a precursor to OO languages), or the newer parallel programming frameworks on .NET 4? So, in summary, bits of the language are all over our modern languages, and virtually all of the principles it taught are today’s best practices (25 years later, industry is still catching up).
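    For readers who have never met it, the TRY/FINALLY idea mentioned above survives almost unchanged in today’s mainstream languages. A minimal Python sketch (an illustration of mine, not the commenter’s code):

```python
# Modula-3-style TRY ... EXCEPT ... FINALLY maps directly onto Python's
# try/except/finally: the finally block runs whether or not an error
# occurs, which is what makes it safe for cleanup work such as closing
# files or releasing locks.

def parse_ratio(a, b, log):
    log.append("start")
    try:
        return a / b
    except ZeroDivisionError:
        log.append("handled divide-by-zero")
        return 0.0
    finally:
        log.append("cleanup")   # always runs, error or not

log = []
print(parse_ratio(6, 3, log))  # 2.0
print(parse_ratio(1, 0, log))  # 0.0
print(log)  # ['start', 'cleanup', 'start', 'handled divide-by-zero', 'cleanup']
```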
    Fine if you want a wall made of brick – hire a bricklayer. If you want 1,000 walls constructed in a week (probably pre-fab concrete), or just one “big wall” to hold back 50 cubic kilometres of water – you might want to talk to an Engineer first (preferably a “Civil” one)!
    Finally, no particular language or series of lectures is going to help graduates overcome the shock of finding how widespread and persistent the damage has been from mis-management and a lack of applying best practice. It’s like a new MD finding that half of doctors still use leeches as their treatment of choice (yes – I know, they have actually come back into vogue recently, for treating putrefied wounds).
    – A DBA / developer.
