Question

I'm a relatively new developer, fresh from college. While in college and during the subsequent job search, I realized that there were a lot of "modern" software development methodologies my education never touched: unit testing, logging, database normalization, agile development (vs. generic agile concepts), coding style guides, refactoring, code reviews, standardized documentation methods (or even requirements gathering), etc.

Overall, I didn't see this as a problem. I expected my first job to embrace all of these ideas and to teach them to me on the job. Then I got my first job (full-stack web development) at a big corporation, and I realized that we do none of these things. In fact I, the least experienced person on the team, am the one spearheading attempts to bring my team up to speed with "modern" programming techniques - as I worry that not doing so is professional suicide down the road.

First I began with logging software (log4j), but then I quickly moved on to writing my own style guide, then abandoned it for the Google style guide - and then I realized that our Java web development used hand-written front controllers, so I pushed for our adoption of Spring - but then I realized we had no unit tests either, and I was still learning Spring... and as you can see, it becomes overwhelming all too quickly, especially when paired with normal development work. Furthermore, it is difficult for me to become "expert" enough in these methodologies to teach them to anyone else without devoting too much time to a single one of them, let alone all of them.

Of all these techniques, which I see as "expected" in today's software development world, how do I integrate them into a team as a new player without overwhelming both myself and the team?

How can I influence my team to become more agile? is related, but I'm not an Agile developer like the asker here, and I'm looking at a much broader set of methodologies than Agile.

Solution

It sounds to me like you are putting the cart before the horse.

What is the major problem your team is facing and which technologies would help fix it?

For example, if there are lots of bugs, particularly regression-type bugs, then unit testing may be a good starting point. If your team is short on time, a framework may help (in the medium to long term). If people have difficulty reading each other's code, a shared coding style may be useful.

Remember that the purpose of the business you work for is to make money, not to make code.

OTHER TIPS

Java? Modern?! You've failed at the first hurdle. If you want to be truly modern and avoid "professional suicide" then you must be writing Rust code. Of course, next week, it'll all change and you'll have to learn something even newer to keep up!

Or, you could accept that no amount of buzzword technologies or methodologies or frameworks du jour will change the fact that what you want is to build quality products that work. It doesn't matter if you don't use agile if you are successfully producing the right output. Of course, if you're not, then you might want to change things, but chances are no particular practice will fix the problems. They will remain human problems that can be fixed in any number of different ways.

As for professional suicide, if you know what you're doing and are flexible, then you don't need any of the things you mentioned. People will continue to come up with new ways of doing the old work, and you'll always be catching up. Or you can simply use whatever techniques your current company uses. When you change companies, you learn and use the techniques the new one uses.

Too many kids get hung up on all the new tools they could use, forgetting that these tools are worthless in the hands of a novice. Learn the practice first, once you're an experienced developer you can start to "fix" the development practices with "cool new things", but by then you will hopefully have realised that they are not nearly as important as you currently think, and only to be used when they are really needed.

Many companies are stuck like this; you might even be surprised to find that some of your developer colleagues are self-taught and became developers with no formal background whatsoever. These developers are often better at their jobs, since they will be the ones that are driven to learn new skills and succeed instead of simply doing the job. Unfortunately this can also mean that, while they're excellent at programming, they might not be aware of the benefits of these practices. The fact is these are best practices, not common practices. The best use them, but they are not at all requirements to succeed, they are simply tools to help make success easier.

You're absolutely right, it's going to be overwhelming to try to implement everything all at once. You'll likely burn yourself (and maybe your team) out trying to do it, which will demotivate future pushes to adopt new methodologies/technologies. The best thing to do in a situation like this is pick one thing (logging is a good start, as you've got a tough road ahead finding bugs without it, and there are sure to be bugs) and talk to the team about it. You don't have to singlehandedly implement this; you'll do much better discussing the pros and cons with the team (and your boss, who absolutely MUST be on board with something like this) and coming up with a plan to implement it. It's going to have to be as painless as possible (remember, you're telling people that they now have to write extra code in addition to what they already do).
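For concreteness, here is what that first step might look like. This is only a sketch using the Log4j 2 API (the question mentions log4j); the class and method names are hypothetical stand-ins, not anything from the original post:

```java
// A minimal first step: one class adopts structured logging via Log4j 2.
// OrderService and placeOrder are hypothetical examples.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class OrderService {
    private static final Logger log = LogManager.getLogger(OrderService.class);

    public void placeOrder(String orderId) {
        log.info("Placing order {}", orderId); // parameterized message, no string concatenation
        try {
            // ... existing business logic stays untouched ...
        } catch (RuntimeException e) {
            log.error("Order {} failed", orderId, e); // trailing Throwable keeps the stack trace
            throw e;
        }
    }
}
```

Paired with a basic log4j2.xml appender configuration, something like this can be rolled out one class at a time, which keeps the "extra work" objection small.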

And let me say again, make sure your boss buys in. This is crucial; you will probably find that the speed of fixes/releases slows down as you implement new things. The point is that you're paying upfront to save down the line; they MUST understand this and be on your side. If you don't get them on board, you're fighting a losing battle at best, and at worst they may decide you're actively sabotaging the team (ask me how I know).

Once you bring these items to the table, discuss them with the team, plan how to implement them, and follow through, the second, third, eighth, etc. will be easier. Not only that, there's a potential for the team and your boss to gain respect for you as your suggestions are implemented and recognized as adding value. Great! Just make sure that you stay flexible: you're pushing against inertia here, and change isn't easy. Be prepared to slowly make small changes, and make sure that you can track progress and value earned. If you implement logging in a new process and it helps you save hours finding a bug three weeks later, make a big deal about it! Make sure everyone knows that the company just saved $XXX by doing the right thing ahead of time. On the other hand, if you get pushback, or have a tight deadline, don't try forcing the issue. Let the new change slide for the moment, and circle back. You won't ever win by trying to force the team to do something they don't want to do, and you can be sure the first thing they'll suggest dropping is the new 'extra' work (like writing logging, or following a style guide instead of just 'getting it working').

I hope you have not presented the issues to your coworkers as you did to us in your post. THAT would be professional suicide.

The first issue is that you are trying to teach technologies and methods that even you do not have experience with to a group of programmers who may be a little outdated, but who get the job "done". The ways that could backfire are endless, and it will probably bring a lot of joy to your coworkers. It is interesting and admirable that you want to improve yourself and your department, but avoid using terms like "spearheading". Sincerely, don't use that word.

As a side issue to the above, check that you are actually doing your work. I have worked as a lone, self-taught programmer for a long time, and I know how easy it is to set aside the actual work in order to explore promising frameworks, technologies and the like. Make sure your performance stays within expected parameters (no, nobody cares that you spent 20 hours researching Spring if that report they asked for is not done).

Following from all of the above, avoid being the teacher (unless it relates to a field or technology in which you actually have enough experience). A more neutral approach would be pointing out the advantages of, say, automated testing, and letting management choose who they want to research the pros and cons of those practices.

Now, for presenting those "best practices", there are two ways of explaining them to your team:

  • Because I say that they are the best practices, and that is enough.
  • Because they are useful and help solve problems.

Using the first argument, unless you are the boss or a very senior member of the team, it is unlikely that they will pay you any attention. And "I read a book by Knuth that says so" or "the folks on Stack Exchange say so" will not make any impression either ("those guys do not work here, so they do not really know what is good for this IT shop"). They have their methods, routines, and procedures, and things "more or less" work, so why take on the effort and risk of changing?

For the second approach to work, there must be the realization that a problem exists. So:

  • don't go pushing day and night for automated testing. Wait until an update breaks some features and the team has to work overtime to fix it, then propose building an automated test system.
  • don't ask for code reviews. Wait until Joe is on extended leave and something needs changing in that module only Joe knows about, then point out to your boss how much time was lost just trying to understand Joe's code.

Of course, change will be slow and incremental (more so in a big corporation). If you manage to introduce code review and automated testing within five years, you should congratulate yourself on a job well done. But, unless there is a complete rewrite due to external causes, forget any fantasy that they will switch the core IS to, say, Spring (Joel explained that far better than I ever could, even before you were born1); in that time you could, at most, get Spring onto the list of supported platforms for writing non-critical systems.

Welcome to the world of enterprise IT, boy! :-p

1: Ok, maybe I am exaggerating a little, but not too much.

You should start with the book Working Effectively with Legacy Code by Michael Feathers. From the book's introduction, "It's about taking a tangled, opaque, convoluted system and slowly, gradually, piece by piece, step by step, turning it into a simple, nicely structured, well-designed system." He mostly starts with automated testing, so that you can refactor safely (you'll know if you break anything), and he includes lots of strategies for bringing difficult code under automated testing. This is useful for every project that's still under development. Once you've got some basic order in place, you can see what other modern technologies your project could really benefit from--but don't assume you need all of them.
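To make one of the book's staple techniques concrete: a characterization test pins down what the code does today, not what it should do, so you can refactor underneath it. Here is a minimal sketch, assuming JUnit 5; the LegacyPricing class, its method, and the expected value are all hypothetical:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// Stand-in for the real tangled legacy code (hypothetical).
class LegacyPricing {
    static double discount(int quantity, String tier) {
        return "BULK".equals(tier) && quantity >= 100 ? 42.50 : 0.0;
    }
}

class LegacyPricingCharacterizationTest {
    @Test
    void bulkDiscountMatchesCurrentBehaviour() {
        // Workflow: run the tangled code once, observe its output, then
        // hard-code that observation as the expected value. The current
        // behaviour -- right or wrong -- is now pinned against regressions.
        assertEquals(42.50, LegacyPricing.discount(100, "BULK"), 0.001);
    }
}
```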

If you want to learn a new framework (or something) for professional reasons rather than because your current project actually needs it, then you should use it in some personal project (on your own time).

Source control.

You did not mention it, hopefully because it's already in place, but, in case it's not, start there.

Source control has the biggest bang-for-buck, except in rare circumstances pathologically in need of something else.

And you can start alone if no one initially buys in.
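As a sketch of how small "starting alone" can be, assuming Git is the tool of choice, a purely local repository needs no one else's buy-in:

```
git init
git add -A
git commit -m "Baseline: the project exactly as it builds today"
```

From there, every change you make is recorded and reversible, and you have something concrete to demonstrate when you do propose a shared repository.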

A Direct Answer

Other answers make good 'meta-points' about adopting better practices but, just to give you some directly relevant guidance, here's a rough ordering of the best practices I'd suggest your team (or any team) adopt (first):

  1. Source control
  2. Issue tracking (project and task management)
  3. Automated builds1
  4. Automated deployments

1 A closely related practice is to automate, or at least document, setting up the build and development environment of each app or software project you're developing or maintaining. It's much less useful, though, because you're (hopefully) doing this infrequently.

Everything Else

You mention several other good practices – "unit testing, logging, database normalization, ... refactoring, ... documentation" – but these are all practices that can and should be adopted gradually and incrementally. None of them needs to be adopted all at once, and you're more likely to adopt them at all if you do so carefully and mindfully.

The four practices I listed above will also make adopting, and experimenting with, new practices as easy as possible. For example, unit testing can be incorporated into your automated builds and documentation can be published as part of your automated deploys.

Some of the other practices you mention – "agile development, ... coding style guides, ... code reviews, ... standardized documentation methods" and frameworks (e.g. Spring) – are really optional or of dubious value. And that's true of a lot (most?) possible practices you'll discover or encounter.

Agile development isn't obviously superior to any other methodology in use. And a lot of people (including myself) have had horrible experiences with it. But lots of people really like it (or love it) too. Try it!

Coding style guides can be helpful, especially for large projects or in large teams, but it's also a lot of work to enforce those guidelines, and that may not be the best use of the time of whoever is doing the enforcing.

Code reviews can be very helpful too – have you asked your coworkers to review your code? Keep in mind that you don't need a formal process to adopt good practices!

Documentation is great – do you have any at all? If so, good for you! Are you facing lots of extra work that could be prevented by having (more) "standardized" documentation? If so, then that's probably something worth doing. If, say, your software is used by only a small group of people, you may not need much documentation at all. (Or you could incorporate the documentation directly into your software; that's always my preference.)

Frameworks are ... a (very sharp) double-edged sword. A nicely encapsulated and well-maintained solution for a non-core feature of your software is great ... until it isn't. I'm not sure what "hand-written front controllers" are exactly but there's no obvious explanation as to why they're inferior to code that leverages Spring. Have you considered encapsulating the common logic in all of these controllers into your own in-house framework? Adopting Spring, and rewriting all your existing code, could be an immense refactoring (or, more likely, rewriting) project and that might not be the best change you could make to your code. Of course you shouldn't write all of the software you use – frameworks (and libraries) are great! But maybe you don't have to use Spring (or an alternative) to write great (or good) web apps.
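For readers wondering what a hand-written front controller might look like, here is a minimal sketch: a single servlet that routes every request itself, which is essentially the pattern Spring MVC's DispatcherServlet generalizes. The javax.servlet API is assumed, and the route names are hypothetical:

```java
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// A minimal hand-written front controller: one servlet dispatches all requests.
public class FrontController extends HttpServlet {
    private final Map<String, Handler> routes = new HashMap<>();

    @Override
    public void init() {
        // Route table: path -> handler (illustrative handlers only).
        routes.put("/orders", (req, resp) -> resp.getWriter().println("order list"));
        routes.put("/customers", (req, resp) -> resp.getWriter().println("customer list"));
    }

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        Handler h = routes.get(req.getPathInfo());
        if (h == null) {
            resp.sendError(HttpServletResponse.SC_NOT_FOUND);
        } else {
            h.handle(req, resp);
        }
    }

    interface Handler {
        void handle(HttpServletRequest req, HttpServletResponse resp) throws IOException;
    }
}
```

Pulling the routing table into a small in-house class like this captures much of Spring's core benefit (one dispatch point, pluggable handlers) without committing to a rewrite.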

Look around the team that you're a part of. Can you see any evidence that test-driven development or database normalization will improve the quality of the software you're writing or make people more productive?

Have you tried speaking to one of the development supervisors about it, or the head of development? A really informal chat would be a good start. What makes you think that the people above you haven't had the same ideas but can't / won't implement them because the business won't allow it?

I find that leading by example is a good way to go. People are a lot less resistant if someone else has already done the work and can show them how to replicate it. Introduce TDD into a project you're working on. Ask to give a presentation to the rest of the team, or even just a couple of others, and show them what you have done. What @DrewJordan said about getting buy-in from the boss is more important than you probably realise.

Find a flaw. Fix a flaw. Show the fix.

Let's take normalisation* first. And indeed, I'd suggest you take it first, because a lack of normalisation is likely to result in actual buggy data that couldn't exist otherwise, whereas the rest are cases where best practice could well help but it's harder to say "Bug A was caused by not following policy X". If you have a database that isn't normalised, then you have a place where data can be inconsistent.

It's a good bet that you will be able to find an actual case of inconsistent data. You have now found two things:

  1. A bug in your data.

  2. A bug in your database schemata.

You actually knew about the second bug first, but the first one is more easily demonstrable and also something that is causing a real problem, not something that theoretically could.

Unfortunately, one of the real reasons for resisting normalising a denormalised database is that the question of what to do with such buggy data is not always easy, but at least you will have found an actual bug.
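As an illustration of hunting for that actual bug, here is a hedged JDBC sketch. The schema (an orders table that duplicates the customer's name alongside customer_id) and the connection URL are hypothetical, invented for this example:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Finds rows where the denormalised copy of a value has drifted from the source.
public class DenormalisationAudit {
    public static void main(String[] args) throws SQLException {
        String query =
            "SELECT o.id, o.customer_name, c.name " +
            "FROM orders o JOIN customers c ON c.id = o.customer_id " +
            "WHERE o.customer_name <> c.name"; // the two copies disagree
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:appdb"); // illustrative URL
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(query)) {
            while (rs.next()) {
                System.out.printf("order %d says '%s', customers table says '%s'%n",
                        rs.getLong(1), rs.getString(2), rs.getString(3));
            }
        }
    }
}
```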

(Do be aware though that there are reasons why one might sometimes denormalise some data on purpose. Don't mistake knowledgeable breaking of the rule for ignorance of the rule; if you normalise a table that was deliberately denormalised for speed of lookup, you won't win any kudos. Even here, though, denormalising is fraught enough that it should be done procedurally: if the denormalised table isn't generated automatically from the contents of the normalised tables, there is still progress to be made.)

For the rest, introduce them when they help in the short term, and build on them in the long term. E.g. if you are given a small piece of code to build, write a unit test for it. Better still, if you are given a bug to fix, write a unit test that fails because of the bug; then, when you have fixed the bug, mention that the test now passes when you close off the bug (or send an email saying it's fixed, or whatever).
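A minimal sketch of that bug-first workflow, assuming JUnit 5; the DateRange class and its off-by-one bug are invented for illustration:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

// Hypothetical class containing the reported bug: an inclusive range
// was billed as (end - start) days instead of (end - start + 1).
class DateRange {
    private final int startDay, endDay;
    DateRange(int startDay, int endDay) { this.startDay = startDay; this.endDay = endDay; }
    int lengthInDays() { return endDay - startDay + 1; } // the fix; it used to omit the + 1
}

class DateRangeBugTest {
    @Test
    void rangeLengthIncludesBothEndpoints() {
        // Written while the bug was live: it failed then, and passes after the fix,
        // giving you something concrete to point at when closing the bug.
        assertEquals(3, new DateRange(1, 3).lengthInDays());
    }
}
```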

*Incidentally, not very modern. The reason it is called normalization, and not normalizing or something else, is that at the time it was a topical joke to stick -ization on the end of things, poking fun at the name of Richard Nixon's Vietnamization policy.

I'm going to go against the grain and say: find a new job after spending some time at this one to build your resume a little. Aim for a year or so. Although many things are "buzzwords," issues like a complete lack of unit testing are intractable for a single developer, and chances are that if the programmers working there have no desire for testing, you'll never get purchase and may even jeopardize your position at the company by making people think of you as a blowhard. You need to be in a place where you can get mentoring, not trying to push the culture towards basic competence.

There have been plenty of proposals for improving the practice of programming. The hottest buzzwords now seem to be agile programming and object-oriented. Or are they? Both have faded substantially compared to what they were just five years ago.

You can be fairly confident that whatever methodology is put in place is trying to accomplish the same end result: helping engineers economically produce a final product that is good enough.

There is one paradigm that was controversially introduced in the 1960s: structured programming: use only "high level" constructs like while, for, repeat, if/then/else, and switch/case statements instead of the heavily used goto statement, which until then had been accepted by default. There are still debates about whether goto has any legitimate use at all.

I accept that minimizing use of goto is a good thing, but like all good things, it is possible to go too far.

You mention agile methodologies as something positive. I was on one development team for about six months that intentionally followed a prescribed agile regimen. I found it to be just like enlightened project-management methodologies from decades ago, except with everything renamed. Perhaps by re-bundling and reselling old ideas about communication, someone makes a living and companies can feel good about "seeing" the emperor's new clothes.

Agile's most valuable lesson, which was known long ago, is that flexibility to find a better path to the finished product is a good thing and the ability to find that path can come from anyone—not just upper management.


From a writing by anti-goto ringleader Edsger Dijkstra:

The art of programming is the art of organizing complexity, of mastering multitude and avoiding its bastard chaos as effectively as possible.

—Dijkstra, in Dahl, Dijkstra & Hoare, Structured Programming (1972), p. 6.

Licensed under: CC-BY-SA with attribution