Software Development Team Leads

Software development “Team Leads” do not write code.

(Except in small business.)

Any job that asks for a “lead” and expects them to write code is really asking for a “senior developer” while paying lip service to “managing the team as a leader”.

True team leads don’t have time to write code, because they are:

  • Meeting with the business
  • Planning
  • Designing
  • Reviewing
  • HR-ing
  • Thinking
  • Learning

Command-line editors: there is no debate

It’s 2019.

Why the heck are we still having the vim vs emacs vs other command-line vs basic-text-editor debate for writing code?!

Don’t use any of them!

There are editors, IDEs and tooling today that offer a far superior writing, debugging and support experience.

It is NOT “cool” or “good experience” to write in command-line editors. That’s like a modern mechanic keeping a Model T Ford in the workshop to use as a reference for repairing a Tesla.

IT people and software developers are here to serve businesses and consumers in the best way possible, not scratch around in an old sandpit. We even have modern tools that give us a superior experience for FREE (I’m looking at you Visual Studio Code).

I certainly won’t hire someone if their tool of choice is command-line based or they use “grep” in conversation. That’s a time and financial loss waiting to happen.

The Cloud (is not the rose that it seems)

The Cloud doesn’t remove the need for infrastructure and operations people. Nor does it particularly make infrastructure easier.

It simply shifts where the hardware runs and who owns it.

And for small software teams it still means the developers and project manager are “the” operations and infrastructure people.

The big difference now is that The Cloud makes it much easier to accidentally spend vast amounts of money on infrastructure you had no idea you deployed (or who deployed it), that you struggle to understand the purpose of, and whose charges you can never quite decipher on your bill.

I know this because I live it.

 

(Also published on LinkedIn)

The difference between old media and social media

There are two clear differences between almost all published media that has come before and the social media we know now.

1. Available audience
2. Cost

Not cost to the consumer.

I’m talking about cost to the publisher – the person who wants to distribute their content.

In the past there has always been a greater cost to produce and distribute content, whether in time, effort, materials or straight-up money to get others to do the work.
There were higher barriers to publishing and distribution: paper, printing, physical distribution of real media, a limited audience who would see the material. It was not readily scalable, nor was it cheap to scale.

That cost generally translated into a need for higher-quality content, because the producer had to maximise their return.
By that I mean: they couldn’t afford to produce shit no one would waste their time on, whether they were paying or not.
They needed maximum return on their money spent, for what was generally an expensive outlay to spread their content.

In addition, and partly due to the cost of the media and the type of distribution, the audience was limited.
Distribution was limited, harder, and bound by the physical medium.
Again, the producer needed to ensure maximum adoption of their content.

But with the Internet and social media that has all changed.

Social platforms allow “free” and easy distribution of content.
They also allow “free” and easy inclusion of a massive audience.
That’s an incredibly low barrier to entry on both sides of the equation.
Combine that with increasingly easier ways to create content – written, audio and video – through readily available consumer tools, and an abundance of cheap add-on services offered via outsourcing, and we now have social platforms open to publishing almost limitless amounts of content, of every quality, at almost zero cost to both the publisher and consumer.

But there is a cost.

Some may say it’s distraction. Some say privacy. And others see pathways for disinformation.
All are true to varying degrees.

What I see as the ultimate cost to the average person is:

Time and attention.

Because every piece of content generated and distributed on the social platforms is calling for our attention. And it takes away our precious time, to both consider and digest, regardless of value.
People are hungry for it (for whatever their individual reason).
And worst of all, in the world of social media:

There are no editors.
No one to check for quality or facts.
No one to separate the wheat from the chaff.

Our time – our precious, limited time alive – is second by second being consumed by other people’s shit.
Shit that, now the economic barrier is gone, anyone can produce and distribute to suck up our time.
Yes, we get a momentary hit of joy or a hint of interest, but how often do we get value that actually enriches us or is usable?

I wonder: what if the social networks flipped the model and started charging for content to be published? How would the world look in 10 years’ time?

Accessibility: the last consideration of software developers

Accessibility is often the last consideration of software developers.

Not because we don’t want to do it – we would if we could.

It’s for one simple reason:

Building non-accessible software is already bloody difficult!

We’re taxed to the limit as it is. Accessibility is just another layer that, I’m sorry to say, means an 80+ hour work week instead of a 60-hour week.

And businesses – unless they are specifically designed for or required to provide accessibility – are not interested in paying for that extra consideration.
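
To give a rough idea of what that “extra layer” means in practice, here’s a minimal sketch – hypothetical TypeScript, not from any real project – comparing the quick version of a clickable control with the version that also works for keyboard and screen-reader users:

// Quick version: works for mouse users, invisible to keyboards and screen readers.
const quickDelete = document.createElement("div");
quickDelete.textContent = "X";
quickDelete.onclick = () => deleteItem();

// Accessible version: the same control, plus the extra layer of work.
const accessibleDelete = document.createElement("button"); // a real button is focusable and announced as a button
accessibleDelete.textContent = "X";
accessibleDelete.setAttribute("aria-label", "Delete item"); // gives screen readers a readable name
accessibleDelete.addEventListener("click", () => deleteItem()); // buttons also fire click on Enter/Space
// ...and that is before focus management, colour contrast and testing with an actual screen reader.

// deleteItem is a placeholder for whatever the control actually does.
function deleteItem(): void {
  console.log("item deleted");
}

Multiply that by every control on every screen and the extra hours add up fast.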

“Exposure” is not a valid form of currency

This comic from The Oatmeal says it all:

(Comic: “Exposure”)
Source: https://theoatmeal.com/comics/exposure

Sorry folks, but “exposure” is not a valid form of currency.

If someone is good enough to do the work for you,
and what they produce is good enough for you to use,
then it’s good enough to pay them with real currency.

Simple as that.

And a pro-tip for all you “creative” types (artists, designers, developers, engineers, and even trades people) – usually when someone says to you “it will be good for exposure”, they are the last person to spread the word or offer referrals.
Ask me how I know.

Whether you have been doing your work for 2 months, 2 years or 20 years, your time, effort and skills are worth real money if someone is prepared to use what you produce.

The Best Practice Fallacy

I consider “Best Practice” in software development to be a fallacy.

Why?

Yesterday’s best practice is replaced by something new today.
And today’s best practice will be replaced by something else tomorrow.

I don’t have a problem with setting good guidelines and habits, but let’s not call it “best” – that implies one right way (and there are enough knuckleheads in our industry who latch onto ideas with such zeal that I really don’t want to encourage it further).

Instead, let’s think of it as:

A “good” approach for what we are trying to achieve today.

Any way you cut it, any practice is just someone’s opinion of how things should be done, and it’s not necessarily based on varied experience or hard lessons.

In my own business I sometimes dictate how things should be done. A decision needs to be made, a pattern set in place and direction set. But I’m flexible and often review, improve and adjust.
(I also pay the bills, so in the absence of a better option what I say goes.)
But in no way are the decisions I make “best practice” or based on what others consider to be best.

I regularly make decisions contrary to current belief that are still valid and appropriate for the situation. I do analysis, consider options and put a lot of thought into decisions (other times there’s not much thought, just a desire to learn through experimentation).

The reality is, in software there are very few things you need to adhere to. Create code and systems others can understand and maintain. Expect change. Don’t be an asshole.

Apart from that, our industry is so young, so fast moving, and has so many possibilities and facets that it’s impossible to define “best”.

So let’s just drop the bullshit, call a spade a spade, and admit we’re all learning and making this up as we go.