Mark Needham

Thoughts on Software Development

Archive for the ‘Books’ Category

Functional Programming in Java – Venkat Subramaniam: Book Review


I picked up Venkat Subramaniam's 'Functional Programming in Java: Harnessing the Power of Java 8 Lambda Expressions' to learn a little bit more about Java 8, having struggled to find any online tutorials which covered it.

A big chunk of the book focuses on lambdas, functional collection parameters and lazy evaluation which will be familiar to users of C#, Clojure, Scala, Haskell, Ruby, Python, F# or libraries like totallylazy and Guava.

Although I was able to race through the book quite quickly, it was still interesting to see how Java 8 is going to reduce the amount of code we need to write for simple operations on collections.

In previous blog posts I wrote up my thoughts on using lambda expressions instead of AutoCloseable, on grouping collections with group by, and on sorting values in collections.

I noticed a couple of subtle differences in the method names added to collections, e.g. skip/limit are there instead of drop/take for grabbing a subset of a collection.

There are also methods such as ‘mapToInt’ and ‘mapToDouble’ where in other languages you’d just have a single ‘map’ and it would handle everything.
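To make the difference concrete, here's a minimal sketch — the `StreamNaming` class and its data are invented for illustration — showing `skip`/`limit` playing the role of drop/take, and `mapToInt` returning an `IntStream` with numeric operations that the generic `map` doesn't expose:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class StreamNaming {
    // skip/limit play the role that drop/take have in other languages
    static List<String> middleTwo(List<String> names) {
        return names.stream().skip(1).limit(2).collect(Collectors.toList());
    }

    // mapToInt returns an IntStream, which exposes numeric operations like sum()
    static int totalLength(List<String> names) {
        return names.stream().mapToInt(String::length).sum();
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList("Paul", "Mark", "Will", "Alistair");
        System.out.println(middleTwo(names));   // [Mark, Will]
        System.out.println(totalLength(names)); // 20
    }
}
```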

Over the last couple of years I've used totallylazy on Java projects to deal with collections and while I like the style of code it encourages, you end up with a lot of boilerplate due to all the anonymous classes you have to create.

In Java 8 lambdas are a first class concept which should make using totallylazy even better.

In a previous blog post I showed how you'd go about sorting a collection of people by age. In Java 8 it would look like this:

import static java.util.Comparator.comparing;

List<Person> people = Arrays.asList(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.stream().sorted(comparing(p -> p.getAge())).forEach(System.out::println);

I find the ‘comparing’ function that we have to use a bit unintuitive and this is what we’d have using totallylazy pre Java 8:

import com.googlecode.totallylazy.Callable1;
import com.googlecode.totallylazy.Sequence;
import static com.googlecode.totallylazy.Sequences.sequence;

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
 
people.sortBy(new Callable1<Person, Integer>() {
    @Override
    public Integer call(Person person) throws Exception {
        return person.getAge();
    }
});

Using Java 8 lambdas the code is much simplified:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
System.out.println(people.sortBy(Person::getAge));

If we use ‘forEach’ to print out each person individually we end up with the following:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(Person::getAge).forEach((Consumer<? super Person>) System.out::println);

The compiler can’t work out whether we want to use the forEach method from totallylazy or from Iterable so we end up having to cast which is a bit nasty.

I haven’t yet tried converting the totallylazy code I’ve written but my thinking is that the real win of Java 8 will be making it easier to use libraries like totallylazy and Guava.

Overall the book describes Java 8's features very well but if you've used any of the languages I mentioned at the top it will all be very familiar – finally Java has caught up with the rest!

Written by Mark Needham

March 23rd, 2014 at 9:18 pm

Posted in Books


Thinking Fast and Slow – Daniel Kahneman: Book Review


I picked up Daniel Kahneman's 'Thinking Fast and Slow' after a recommendation by Mike Jones in early 2013 – it's taken me quite a while to get through it.

The book starts by describing our two styles of thinking…

  • System 1 – operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 – allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

…and the next 30+ chapters describe situations where System 1 can get us into trouble.

The book focuses on different cognitive biases that humans get fooled by and there was a spell of about 100 pages starting around chapter 4 where it started to feel repetitive and I felt like putting the book down.

That said, I kept going and enjoyed the remaining 2/3 of the book much more.

These were some of my favourite parts:

  • Outcome bias – this bias describes the tendency to build a story around events after we already know the outcome and come to the conclusion that the outcome was inevitable. For example, many people claim that they ‘knew well before it happened that the 2008 financial crisis was inevitable’.

    Kahneman points out that we only remember the intuitions which turned out to be true; no one talks about those which turned out to be false. He also cautions us to consider the role that luck plays in any success.

  • Intuitions vs Formulas – Kahneman quotes Paul Meehl who suggests that most of the time human decision makers are inferior to a prediction formula even when they are given the score suggested by the formula.

    In the world of software this reliance on intuition often shows up when interviewing people for jobs and choosing who to hire based on an intuitive judgement. Kahneman suggests making a list of about 6 traits that are prerequisites for the position and then ranking each candidate against those. The candidate who scores the highest is the best choice.

  • Over-reliance on representativeness – often we judge the probability of something based on stereotypes rather than taking the base rate of the population into account.

    e.g. if we saw a person reading The New York Times on the subway which of the following is a better bet about this stranger?

    • She has a PhD.
    • She does not have a college degree.

    If we judge based on representativeness we'll bet on the PhD because our stereotypes tell us that someone with a PhD is more likely to be reading the New York Times than a person without a college degree. However, there are many more non-graduates on the subway than PhDs so the likelihood is that the person doesn't have a college degree.

    Kahneman encourages us to take the base rates of the population into account even if we have evidence about the case at hand. He uses the example of predicting how long a project will take and suggests using the data from previous similar projects as a baseline and then adjusting that based on our specific case.

  • Framing – Kahneman talks about the impact that the way information is presented to us can have on the way we react to it. For example, he describes a study in which the merits of surgery for a condition are considered and the descriptions of the surgery are like so:
    • The one month survival rate is 90%.
    • There is 10% mortality in the first month.

    People who were presented with the first option were much more likely to favour surgery than those shown the second option.

  • The default option – people feel much more regret if they deviate from the normal choice and something goes wrong than if they stick with the default and something equally bad happens. This effect is well known to form designers, who make the option they want people to choose the default.
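The intuitions-vs-formulas suggestion above is simple enough to sketch in code. This is only an illustration – the candidates, traits and scores are made up:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.Map;

public class CandidateScoring {
    // Score each candidate on a fixed list of traits and pick the highest
    // total, rather than relying on an overall intuitive judgement.
    static String bestCandidate(Map<String, int[]> traitScores) {
        return traitScores.entrySet().stream()
                .max(Comparator.comparingInt(e -> Arrays.stream(e.getValue()).sum()))
                .map(Map.Entry::getKey)
                .orElse("no candidates");
    }

    public static void main(String[] args) {
        // One score (1-5) per trait, six traits per candidate
        Map<String, int[]> scores = Map.of(
                "Ada", new int[]{5, 4, 4, 3, 5, 4},
                "Grace", new int[]{4, 4, 3, 3, 4, 4});
        System.out.println(bestCandidate(scores)); // Ada
    }
}
```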

There are many more insights and these are just a few that stood out for me.

Overall I enjoyed reading the book and I feel it may have the same impact as Malcolm Gladwell's Blink and Outliers by making you think about situations differently than you may have done before.

I’d encourage you to read the book if you find human decision making interesting although I’d try and forget everything I’ve said as you’ve probably been inadvertently primed by me.

Written by Mark Needham

October 27th, 2013 at 10:53 pm

Posted in Books


On Writing Well – William Zinsser: Book Review


I first came across William Zinsser's 'On Writing Well' about a year ago, but put it down having flicked through a couple of the chapters that I felt were relevant.

It came back onto my radar a month ago and this time I decided to read it cover to cover as I was sure there were some insights that I’d missed due to my haphazard approach the first time around.

What stood out in my memory from my first reading of the book was the emphasis on how words like 'a bit', 'a little', 'kind of', 'quite', 'pretty much' and 'too' dilute our sentences and I've been trying to avoid them in my writing ever since.

Other things that stood out for me this time were:

  • Avoid unnecessary words e.g. blared loudly or grinned widely. It's difficult to blare in any other way and if you're grinning it's implied that your mouth is wide open.
  • Delete troublesome phrases. If you’re having trouble working out the structure for a sentence, it might be beneficial to get rid of it and see if the rest of the paragraph still makes sense. Often it will.
  • Rewriting. The author emphasises over and over again the importance of rewriting a piece of text until you’re happy with it. This consists mostly of reshaping and tightening previous drafts. The way it’s described sounds very similar to code refactoring. My colleague Ian Robinson recommended ‘Revising Prose‘ as a useful book to read next in this area.
  • Assume the reader knows nothing. The advice in the chapter on science and technology was probably the most applicable for the type of writing I do and I thought this section was particularly apt:

    Describing how a process works is valuable for two reasons. It forces you to make sure that you know how it works. Then it forces you to take the reader through the same sequence of ideas and deductions that made the process clear to you.

    I’ve found this to be the case multiple times although you can achieve the same benefits by presenting a talk on a topic; the benefits aren’t unique to writing.

  • Explain your judgements. Don’t say something is interesting. Instead explain what makes it interesting and let the reader decide whether or not it deserves that label.
  • Use simple words, be specific, be human and use active verbs. I often fall into the trap of using passive verbs which makes it difficult for the reader to know which part of the sentence they apply to. We want to minimise the amount of translation the reader has to do to understand our writing.
  • Use a paragraph as a logical unit. I remember learning this at secondary school but I’ve drifted towards a style of writing that treats the sentence as a logical block. My intuition tells me that people find it easier to read text when it’s in smaller chunks but I will experiment with grouping my ideas together in paragraphs where that seems sensible.

I gleaned these insights mostly from the first half of the book.

The second half focused on different forms of writing and showed how to apply the lessons from earlier in the book. Although not all the forms were applicable to me I still found it interesting to read as the author has a nice way with words and you want to keep reading the next sentence.

My main concern having read the book is ensuring that I don’t paralyse my ability to finish blog posts by rewriting ad infinitum.

Written by Mark Needham

September 30th, 2013 at 10:48 pm

Posted in Books


9 algorithms that changed the future – John MacCormick: Book Review


The Book

9 algorithms that changed the future (the ingenious ideas that drive today’s computers) by John MacCormick

My Thoughts

I came across this book while idly browsing a book store and since I’ve found most introduction to algorithms books very dry I thought it’d be interesting to see what one aimed at the general public would be like.

Overall it was an enjoyable read and I quite like the pattern that the author used for each algorithm, which was:

  • Describe the problem that it’s needed for.
  • Explain a simplified version of the algorithm or use a metaphor to give the general outline.
  • Explain which bits were simplified and how the real version addresses those simplifications.

The first step is often missed out in algorithms books which is a mistake for people like me who become more interested in a subject once a practical use case is explained.

Although the title claims 9 algorithms I counted the following 8 which made the cut:

I enjoyed the book and I’ve got some interesting articles/papers to add to my reading list. Even if you already know all the algorithms I think it’s interesting to hear them described from a completely different angle to see if you learn something new.

Written by Mark Needham

August 13th, 2013 at 8:00 pm

Posted in Books


Book Review: The Signal and the Noise – Nate Silver


Nate Silver is famous for having correctly predicted the winner of all 50 states in the 2012 United States elections and Sid recommended his book so I could learn more about statistics for the A/B tests that we were running.

I thought the book was a really good introduction to applied statistics and by using real life examples which most people would be able to relate to it makes a potentially dull subject interesting.

Reasonably early on the author points out that there’s a difference between making a prediction and making a forecast:

  • Prediction – a definitive and specific statement about when and where something will happen e.g. a major earthquake will hit Kyoto, Japan, on June 28.
  • Forecast – a probabilistic statement over a longer time scale e.g. there is a 60% chance of an earthquake in Southern California over the next 30 years.

The book mainly focuses on the latter.

We then move on to quite an interesting section about overfitting, which is where we mistake noise for signal in our data.

I first came across this term when Jen and I were working through one of the Kaggle problems and were using a random forest of deliberately overfitted decision trees to do digit recognition.

It’s not a problem when we combine lots of decision trees together and use a majority wins algorithm to make our prediction but if we use just one of them its predictions on any new data will be completely wrong.

Later on in the book he points out that a lot of conspiracy theories arise when we look at data retrospectively: in hindsight it's easy to separate signal from noise, even though at the time it was much more difficult.

He also points out that sometimes there isn’t actually any signal, it’s all noise, and we can fall into the trap of looking for something that isn’t there. I think this ‘noise’ is what we’d refer to as random variation in the context of an A/B test.

Silver also encourages us to make sure that we understand the theory behind any inference we make:

Statistical inferences are much stronger when backed up by theory or at least some deeper thinking about their root causes.

When we were running A/B tests Sid encouraged people to think whether a theory about why conversion had changed made logical sense before assuming it was true which I think covers similar ground.

A big chunk of the book covers Bayes' theorem and how, when we're making forecasts, it forces us to make our prior beliefs explicit.

For example there is a section which talks about the probability a lady is being cheated on given that she’s found some underwear that she doesn’t recognise in her house.

In order to work out the probability she’s being cheated on we need to know the probability that she was being cheated on before she found the underwear. Silver suggests that since 4% of married partners cheat on their spouses that would be a good number to use.

He then goes on to show multiple other problems throughout the book that we can apply Bayes’ theorem to.
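The underwear example is a single application of Bayes' theorem. In this sketch the 4% prior comes from the book, while the two likelihoods (the chance of the mystery underwear turning up given cheating vs given no cheating) are illustrative numbers I've assumed:

```java
public class CheatingPosterior {
    // Bayes' theorem: P(H|E) = P(H)*P(E|H) / (P(H)*P(E|H) + P(not H)*P(E|not H))
    static double posterior(double prior, double likelihoodIfTrue, double likelihoodIfFalse) {
        double jointIfTrue = prior * likelihoodIfTrue;
        return jointIfTrue / (jointIfTrue + (1 - prior) * likelihoodIfFalse);
    }

    public static void main(String[] args) {
        // 4% prior; assume underwear is 50% likely if cheating, 5% likely if not
        double p = posterior(0.04, 0.5, 0.05);
        System.out.printf("P(cheating | underwear) = %.0f%%%n", p * 100); // about 29%
    }
}
```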

Some other interesting things I picked up are that if we’re good at forecasting then being given more information should make our forecast better and that when we don’t have any special information we’re better off following the opinion of the crowd.


Silver also showed a clever trick for inferring data points on a data set which follows a power law i.e. the long tail distribution where there are very few massive events but lots of really small ones.

We have a power law distribution when modelling the number of terrorist attacks vs the number of fatalities, but if we change both scales to be logarithmic we can come up with a probability of how likely more deadly attacks are.

There is then some discussion of how we could change the way we treat terrorism to try and impact the shape of the chart, e.g. Silver suggests that Israel really wants to avoid a very deadly attack, even at the expense of there being more smaller attacks.
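The log-log trick amounts to fitting a straight line to the logarithms of both variables: a power law y = c·x^b becomes log y = log c + b·log x, so the slope gives the exponent and the line can be extrapolated to event sizes never yet observed. A sketch with synthetic numbers, not Silver's terrorism data:

```java
public class PowerLawFit {
    // Least-squares fit of log10(y) = a + b * log10(x); returns {a, b}
    static double[] fitLogLog(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            double lx = Math.log10(x[i]), ly = Math.log10(y[i]);
            sx += lx; sy += ly; sxx += lx * lx; sxy += lx * ly;
        }
        double b = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double a = (sy - b * sx) / n;
        return new double[]{a, b};
    }

    public static void main(String[] args) {
        // Synthetic data following y = 1000 * x^-2
        double[] x = {1, 10, 100};
        double[] y = {1000, 10, 0.1};
        double[] fit = fitLogLog(x, y);
        System.out.printf("log10(y) = %.1f %+.1f * log10(x)%n", fit[0], fit[1]);
    }
}
```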

A lot of the book is spent discussing weather/earthquake forecasting which is very interesting to read about but I couldn’t quite see a link back to the software context.

Overall though I found it an interesting read although there are probably a few places that you can skim over the detail and still get the gist of what he’s saying.

Written by Mark Needham

May 14th, 2013 at 12:16 am

Posted in Books


Book Review: The Retrospective Handbook – Pat Kua


My colleague Pat Kua recently published a book he's been working on for the first half of the year titled 'The Retrospective Handbook' – a book in which Pat shares his experiences with retrospectives and gives advice to budding facilitators.

I was intrigued what the book would be like because the skill gap between Pat and me with respect to facilitating retrospectives is huge and I’ve often found that experts in a subject can have a tendency to be a bit preachy when writing about their subject!

In actual fact Pat has done a great job making the topic accessible to all skill levels and several times covers typical problems with retrospectives before describing possible solutions.

These were some of the things that I took away:

  • One of the most interesting parts of the book was a section titled ‘Be Aware of Cultural Dimensions’ where Pat covers some of the different challenges we have when people from different cultures work together.

    I found the power distance index (PDI) especially interesting:

    The extent to which the less powerful members of organisations and institutions accept and expect that power is distributed unequally

    If you come from a culture with a low PDI you’re more likely to challenge something someone else said regardless of their role but if you’re from a culture with a high PDI you probably won’t say anything.

    The US/UK tend to have low PDI whereas India has a high PDI – something I found fascinating when participating in retrospectives in India in 2010/2011. I think the facilitator needs to be aware of this otherwise they might make someone very uncomfortable by pushing them too hard to share their opinion.

  • A theme across the book is that retrospectives aren’t about the facilitator – the facilitator’s role is to help guide the team through the process and keep things moving, they shouldn’t be the focal point. In my opinion if a facilitator is doing that well then they’d be almost invisible much like a football referee when they’re having a good game!
  • The ‘First-Time Facilitation Tips’ chapter is particularly worth reading and reminded me that part of the facilitator’s role is to encourage equal participation from the group:

    A common, shared picture is only possible if all participants give their input freely and share their view of the story. This is difficult if one or two people are allowed to dominate discussions. Part of your role as a facilitator is to use whatever techniques you can to ensure a balanced conversation occurs.

    I think this is much easier for an external facilitator to do as they won’t have the burden of inter team politics/hierarchy to deal with.

    Pat goes on to suggest splitting the group up into smaller groups as one technique to get people involved; in my experience this works really well and gets around the problem that many people aren't comfortable discussing things in big groups.

  • There’s nothing more boring than doing the same retrospective week after week, nor is there a quicker way to completely put people off them, so I was pleased to see that Pat dedicated a chapter to keeping retrospectives fresh.

    He suggests a variety of different techniques including bringing food or running the retrospective in a different location to normal to keep it interesting. I’ve heard of colleagues in Brazil doing their retrospectives outside which is another angle on this theme!

  • Another good tip is that when creating actions we don’t need to spend time getting someone to sign up for them right there and then – an alternative is to encourage people to walk the wall and pick ones they feel they can take care of.

I think this book complements Esther Derby/Diana Larsen's 'Agile Retrospectives' really well.

I find their book really useful for finding exercises to use in retrospectives to keep it interesting whereas Pat’s book is more about the challenges you’re likely to face during the retrospective itself.

There’s lots of other useful tips and tricks in the book – these are just a few of the ones that stood out for me – it’s well worth a read if you’re a participant/facilitator in retrospectives on your team.

Written by Mark Needham

August 31st, 2012 at 9:18 pm

Posted in Books


The Lean Startup: Book Review


I'd heard about The Lean Startup for a long time before I actually read it, mainly from following the 'Startup Lessons Learned' blog, but I didn't get the book until a colleague suggested a meetup to discuss how we might apply the ideas on our projects.

My general learning from the book is that we need to take the idea of creating tight feedback loops, which we’ve learnt in the agile/lean worlds, and apply it to product development.

Eric Ries talks about the ideas of the Minimum Viable Product (MVP), which is something that I’ve heard mentioned a lot in the last few projects I’ve worked on so I thought I knew what it meant.

I'd always considered the MVP to effectively be the first release of any product you were building but Ries frames it as the minimum product you can release to get feedback on whether your idea is viable or not. For example Dropbox's MVP was a video demonstrating how it would work, made before the team had written the code to sync files on all operating systems.

On a lot of the projects that I’ve worked on we start after the point at which the business has decided what their product vision is and we’re responsible for implementing it. They haven’t necessarily then gone on to make a big return from building the product which I always found strange.

The most frequent argument I’ve heard against releasing an ‘incomplete’ product early in the organisations that I’ve worked for is that it could ruin their brand if they took this approach. One suggestion the book makes is to release the product under a spin off subsidiary if we’re worried about that.

The book also discussed the ways that we need to treat early adopters of a product and mainstream customers differently.

For example early adopters won’t mind/may actually prefer to play with an unfinished product if they can help influence its future direction.

By the time we have proved that we have a viable product and are looking to aim it at the mainstream market it will need to be more feature complete and polished in order to please that crowd.

There is a big focus on making data driven decisions such that we gather metrics showing how our product is actually being used by customers rather than just guessing/going on intuition as to what we should be doing next.

Facebook released an interesting video where towards the end they describe the metrics which they have around their application such that they can tell whether a deployment is losing them money and therefore needs to be rolled back.

One particular thing that the book talks about is cohort analysis:

A cohort analysis is a tool that helps measure user engagement over time. It helps UX designers know whether user engagement is actually getting better over time or is only appearing to improve because of growth.

We tend to use metrics to help us see the quality of code and which things we might want to work on there but I think the idea of using it to measure user engagement is really cool and should help us to build a more useful product.
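As a sketch of the idea, here's cohort analysis boiled down to a few lines – the `User` shape and the numbers are invented, and 'active again after signup week' stands in for whatever engagement metric you actually care about:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class CohortRetention {
    static class User {
        final int signupWeek;
        final int lastActiveWeek;
        User(int signupWeek, int lastActiveWeek) {
            this.signupWeek = signupWeek;
            this.lastActiveWeek = lastActiveWeek;
        }
    }

    // For each signup cohort, the fraction of users who were active again
    // after their signup week - a crude proxy for engagement over time.
    static Map<Integer, Double> retentionByCohort(List<User> users) {
        return users.stream().collect(Collectors.groupingBy(
                u -> u.signupWeek,
                Collectors.averagingDouble(u -> u.lastActiveWeek > u.signupWeek ? 1.0 : 0.0)));
    }

    public static void main(String[] args) {
        List<User> users = Arrays.asList(
                new User(1, 1), new User(1, 3),
                new User(2, 2), new User(2, 4), new User(2, 5));
        System.out.println(retentionByCohort(users)); // cohort 1 -> 0.5, cohort 2 -> 0.67
    }
}
```

Comparing the retention numbers across cohorts (rather than looking at total active users) is what tells you whether engagement is actually improving or only appears to be because of growth.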

I especially enjoyed the parts of the book where Ries talks about ways that some of the ideas have been applied with startups which are doing well at the moment although I think it’d be fair to say that the lean startup framework has been retrospectively fitted to explain these stories.

I think the danger of thinking that they were following lean startup principles is that it can lead to us not thinking through problems ourselves which I guess is the same problem with any framework/methodology.

I’m intrigued as to whether it will make a difference to the overall success rate of startups or not if they follow the ideas from the book.

I imagine we'll see some ideas failing much more quickly than they might have otherwise and the suggestion is that when this happens we need to pivot and try to find another approach that will make money. Despite that, there will come a point when a startup runs out of money without finding a way to monetise its product and it'll be game over.

Overall the book is quite easy reading and worth a flick through as it has some cool ideas which can help us to spend less time building products which don’t actually get used.

Written by Mark Needham

December 18th, 2011 at 9:00 pm

Posted in Books


Discussing the Undiscussable: Book Review


I came across the work of Chris Argyris at the start of the year and in a Twitter conversation with Benjamin Mitchell he suggested that Bill Noonan's 'Discussing the Undiscussable' was the most accessible text for someone new to the subject.

In the book Noonan runs through a series of different tools that Chris Argyris originally came up with for helping people to handle difficult conversational situations more effectively.

I really like the way the book is written.

A lot of books of this ilk come across to me as being very idealistic but Noonan avoids that by describing his own mistakes in trying to implement Argyris’ ideas. This makes the book much more accessible to me.

He also repeatedly points out that even though you might understand the tools that doesn’t mean that you’ll be an expert in using them unless you spend a significant amount of time practicing.

These were some of the ideas that stood out for me from my reading over the last few months:

  • Advocacy/Inquiry – Noonan suggests that when we’re discussing a topic it’s important to advocate our opinion but also be open to people challenging it so that we can learn if there are any gaps in our understanding or anything that we’re missing.

    This seems quite similar to Bob Sutton's 'Strong Opinions, Weakly Held', which I've come across several times in the past.

    One anti-pattern which comes from not doing this is known as 'easing in', where we try to get the other person to advocate our opinion through the use of various leading questions.

    The problem is that they tend to know exactly what we’re doing and it can come across as being quite manipulative.

  • The Ladder of Inference – I’ve written about this previously and it describes the way that humans tend to very quickly draw conclusions about other people based on fairly minimal data and without even talking to the other person first!

    When Jim and I worked together at ThoughtWorks University we were constantly pointing out when the other was climbing the inference ladder and it was quite surprising to me how often you end up doing so even when you don’t realise it!

    What I find most interesting is that even when I was absolutely sure that my inference about a situation was correct it was still frequently wrong when I discussed it with the other person. They nearly always had a different perception of what was going on than I did.

    I think it’s a step too far to believe that I won’t ever climb the inference ladder again but it’s useful to know how frequently I do it so at least I’m aware that I might need to climb down from time to time.

  • Recovery time – there is constant reference throughout the book to our recovery time, i.e. how quickly we realise that we've made a mistake by participating in a defensive routine.

    Argyris’ tools are quite useful for helping us to reduce our recovery time because they are reflective in nature and when we reflect on a situation we tend to see where we’ve gone wrong!

    Noonan suggests that it’s inevitable we’ll make mistakes but the key is to try and detect our mistakes sooner and then hopefully reduce the number that we make.

Of course there are several other tools that Noonan describes, such as the left hand right hand case study approach, double loop learning, espoused theory vs theory in action and the mutual learning model.

I still make loads of the mistakes that the book points out and I’ve noticed that I only really reflect on how my conversations are going when I’ve been flicking through the book relatively recently.

It’s also useful to be hanging around other people who are studying Argyris’ work as you can then help each other out.

One of the initial books that Chris Argyris published describing these tools was 'Action Science' (available as a free PDF).

I initially tried reading it before this book but found it a bit hard to follow; I'll probably try it again at some stage.

Written by Mark Needham

May 7th, 2011 at 12:45 am

Posted in Books


Books: Know why you’re reading it


Something which I frequently forget while reading books is that it's actually quite useful to know exactly why you're reading it, i.e. what knowledge you're trying to gain by doing so.

I noticed this again recently while reading The Agile Samurai – it’s one of the books we ask ThoughtWorks University participants to read before they come to India.

Implicitly I knew that I just wanted to get a rough idea of what sort of things it’s telling people but I somewhat foolishly just started reading it cover to cover.

I only realised that I’d been doing this when I’d got a third of the way through and realised that I hadn’t really learnt that much since the book effectively describes the way that ThoughtWorks delivers projects.

In Pragmatic Learning and Thinking Andy Hunt suggests the SQ3R reading comprehension method which I always forget about!

  • Survey – scan the table of contents and chapter summaries
  • Question – note any questions you have
  • Read – read in entirety
  • Recite – summarise and take notes in your own words
  • Review – re-read, expand notes, discuss with colleagues

I don’t think it always needs to be quite as organised as this but I’ve certainly found it useful to scan the chapter headings and see which ones interest me and then skip the ones which don’t seem worth reading.

When reading The Art of Unix Programming I felt that I was learning a lot of different things for the first ten chapters or so but then it started to get quite boring for me so I skimmed the rest of the book and ended up reading just half of the chapters completely.

The amusing thing for me is that I knew about this technique a couple of years ago but I still don't use it, which I think comes down to a bit of a psychological need to finish books.

At the moment I have around 15 books which I've partially read and at the back of my mind I know that I want to go and read the rest of them even though there will undoubtedly be varying returns from doing that!

I need to just let them go…

Written by Mark Needham

February 26th, 2011 at 3:06 am

Posted in Books

Tagged with

The Adventures of Johnny Bunko – The Last Career Guide You’ll Ever Need: Book Review

with 3 comments

I read Dan Pink’s A Whole New Mind earlier in the year but I hadn’t heard of The Adventures of Johnny Bunko until my colleague Sumeet Moghe mentioned it in a conversation during ThoughtWorks India’s XConf, an internal conference run here.

The book is written in the Manga format so it’s incredibly quick to read and it gives 6 ideas around building a career.

I’m generally not a fan of the idea of ‘building a career’ – generally when I hear that phrase it involves having a ‘five year’ plan and other such concepts which I consider to be pointless.

I much prefer to just focus on what I’m most interested in at the moment and then have the flexibility to be able to focus on something else if that interests me more.

Luckily this book is not at all like that and these are the ideas Dan Pink suggests:

There is no plan

The most interesting part of this piece of advice is Pink’s suggestion that when we make career decisions we make them for two different types of reasons:

  • Instrumental reasons – we do something because we think it’s going to lead to something else regardless of whether we enjoy it or think it’s worthwhile.
  • Fundamental reasons – we do something because it’s inherently valuable regardless of what it may or may not lead to.

I think this pretty much makes sense – it’s very difficult to predict what’s going to happen so you might as well make sure you’re doing what you want at any given time.

Think strengths not weaknesses

Marcus Buckingham has several books on this subject but the general idea is that we should look to focus on things that we’re good at or are really motivated to become good at.

By inference this means that we want to avoid spending time on things that we’re not good at and/or not interested in becoming good at.

I recently came across a blog post written by Robbie Maciver where he suggests that we can use a retrospective to help us identify team members’ strengths and work out how best to utilise them.

I think there’s sometimes a reluctance for people to volunteer their strengths so we end up with people working on tasks that they hate that other people in the same team would enjoy.

It’s not about you

Pink then goes on to point out that even if we do work out how to play to our strengths we still need to ensure that we’re focusing on our customer/client.

Pink calls this focusing outward not inward.

In terms of working in technology consulting that would therefore be along the lines of ensuring that we’re focused on solving our clients’ problems in the best way for them rather than the most interesting way for us.

Persistence trumps talent

This idea has been covered in several books including Outliers, Talent is Overrated and The Talent Code.

The underlying idea is that people aren’t born brilliant at any skill, it only comes through practice and if we’re intrinsically motivated to do something then we’re much more likely to put in the time it requires to become really good.

To add to that I think intrinsic motivation is highest when we’re focusing on our strengths so this piece of advice is linked with the earlier one.

Make excellent mistakes

Pink talks about making excellent mistakes which he describes as ‘mistakes that come from having high aspirations, from trying to do something nobody else has done’.

I think this is somewhat linked to the idea of failing fast although it seems to be perhaps one level beyond that.

Sumeet gave a talk at XConf where he described the new approach to ThoughtWorks University and pointed out that one of the most important goals was to allow attendees to make ‘excellent mistakes’.

One observation I have here is that smaller companies seem more willing to make excellent mistakes. As organisations grow, the aversion to risk because of worries about loss of reputation increases, which makes excellent mistakes less likely.

I find Eric Ries’ lean startup ideas particularly interesting with respect to failing fast and my former colleague Antonio Terreno recently linked to a video where the General Manager of Forward 3D explains how they do this.

Leave an imprint

Pink talks about the importance of doing work which actually has some meaning or leaves the world in a better place than it was before.

This is probably the one that I can relate to least. I’m writing software which is what I’m passionate about but I wouldn’t say the problems I work on are having much impact on the world.

This one therefore leaves me with the most to think about.

Written by Mark Needham

November 21st, 2010 at 5:02 pm

Posted in Books

Tagged with