Saturday, 25 February 2012

JFDI > evaluation != lack of innovation

It's late. I'm tired. But this needs writing.

JFDI died a little when some people decided it wasn't the way to do things - too much rushing in, not enough evaluation of impact. Then we tried to work out how to evaluate impact and it all went quiet.

Well. I've come to my own conclusion and it's working out okay, I think. It's this. JFDI has its place. It really does. Experimentation leads to people using LinkedIn for slightly oddball reasons which yield some excellent revenue results. When someone came to me and asked if I thought it was a good idea, I told her it was kickass. I didn't _know_ it was kickass but it made my tummy do that little jump of excitement it does when someone says something awesome - so off she went and it worked.

I could have said no. I couldn't have set pre-emptive performance indicators on her actions - it was too new for that. I could have decided it was time ill spent and asked her to focus on the not inconsiderable successes she'd already achieved in social media. But that's not what this is about.

JFDI is not rushing in blind. It's someone suggesting something at any given moment, and you using all the informed knowledge and experience you've amassed to say yes or no. You might be wrong. Part of that momentary decision needs to be a risk analysis: time invested, money invested, users' time wasted. But, still, I believe there is a place for saying yes, go for it, let's see what happens.
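
If I had to write that momentary gut-check down, it'd look something like this - a purely illustrative sketch in Python, with every threshold invented just to show the shape of the decision, not anyone's real numbers:

```python
def jfdi_check(hours_needed: float, budget_needed: float,
               user_minutes_at_risk: float, gut_says_yes: bool) -> bool:
    """Momentary risk check: cheap, fast, low-impact experiments get a yes.

    Illustrative only - every threshold below is invented for the example.
    """
    low_cost = hours_needed <= 8 and budget_needed <= 100
    low_user_impact = user_minutes_at_risk <= 5
    return gut_says_yes and low_cost and low_user_impact


# Something like the LinkedIn experiment: a few hours, no budget,
# nobody's time wasted if it flops.
print(jfdi_check(hours_needed=4, budget_needed=0,
                 user_minutes_at_risk=0, gut_says_yes=True))  # True
```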

On the flip side of this, evaluation is necessary. Not as necessary as JFDI, but still near vital. How do I know the experiment on LinkedIn worked? I can't tell you, because it wasn't something as ridiculously simple as advertising a job post there - I'd have to be pretty dumb not to know how to set pre-emptive measures of success for that.

No. Someone decided to bend the rules slightly. And why does evaluation have to be positive anyway? I can say with surety, before actioning something, that it will be either a success, not a success, or a bit meh. If it's a bit meh, examine what went wrong, see what could be improved, re-implement, come back in three months' time. If it's a success, yatta! If not, bin it. Lessons learned, move on. But without any measurement of outcome, how do I get to the lessons-learned bit? If I never learn any lessons then what on earth is the point of doing anything? No one gets it 100% right first time.
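
That whole loop fits in a few lines if you squint. Another purely illustrative sketch - the outcome labels and the three-month review window are just my shorthand for the paragraph above:

```python
from enum import Enum


class Outcome(Enum):
    SUCCESS = "success"
    MEH = "meh"
    FAILURE = "failure"


def next_step(outcome: Outcome, lessons: list[str]) -> str:
    """Decide what happens to an experiment once it's been measured.

    Illustrative only; 'lessons' is just a list you keep appending to.
    """
    if outcome is Outcome.SUCCESS:
        return "keep it"  # yatta!
    if outcome is Outcome.MEH:
        # Examine what went wrong, improve, re-implement,
        # come back in three months' time.
        lessons.append("meh: note what to improve, re-review in 3 months")
        return "improve and re-implement"
    lessons.append("failure: record why before binning")
    return "bin it, lessons learned, move on"


lessons: list[str] = []
print(next_step(Outcome.MEH, lessons))  # improve and re-implement
```

Notice there's no branch in that sketch that works without a measured outcome going in - which is the whole point.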

Ah. But then we are talking about local government and public money. Not getting it right first time can result in job loss, public ridicule and all kinds of mayhem. So we must temper all our innovation, our testing, our ideas and our curiosity. We have a responsibility to do so to the people we serve. And yes, we do serve them; they pay our wages.

So that momentary decision? Which needs to be momentary, or else you're taking way too long and the digital world has moved on without you? Bit more tricky. Suddenly a lot more tricky. But if you made that decision in seconds, I'd argue it was too fast, and if you made it in days you were too slow. You've got minutes and hours to assess all the risks, dangers, opportunities and potential successes before you say go on a new idea.

Be quick or be:
dead
called
asked
avoidably contacted
ridiculed
evaluated?

No. If you can't evaluate fast enough, change something. Change your idea of evaluation; talk to your performance team. Because if you're flying by the seat of your pants without your performance team, my friend, you are doing it wrong. They need to be the JFDI'er's best friend. But you're going to have to explain to them why the evaluation matrix which didn't include blog evaluation six months ago needs to do so now, and to do that you kind of need to a) know where they sit, b) know how to talk to them, and c) understand that they know more about evaluation than you could ever hope to.

JFDI > evaluation != lack of innovation.

Just be quick.
