Mark Needham

Thoughts on Software Development

Archive for the ‘Books’ Category

Badass: Making users awesome – Kathy Sierra: Book Review


I started reading Kathy Sierra’s new book ‘Badass: Making users awesome’ a couple of weeks ago and, with the gift of flights to/from Stockholm this week, I’ve got through the rest of it.

I really enjoyed the book and have found myself returning to it almost every day to check up exactly what was said on a particular topic.

There were a few things that I’ve taken away and have been going on about to anyone who will listen.


Paraphrasing, ‘help users acquire skills, don’t throw knowledge at them.’ I found this advice helpful both in my own learning of new things as well as for thinking how to help users of Neo4j get up and running faster.

Whether we’re doing a talk, workshop or online training, the goal isn’t to teach the user a bunch of information/facts but rather to help them learn skills which they can use to achieve their ‘compelling context’.

Having said that, it’s very easy to fall into the information/facts trap because that type of material is much easier to prepare and present. You don’t have to spend much time thinking about how the user is going to use it; rather, you hope that if you throw enough information at them some of it will stick.

A user’s compelling context is the problem they’re trying to solve, regardless of the tools they use to solve it. The repeated example of this is a camera – we don’t buy a camera because we want to buy a camera, we buy it because we want to take great photographs.


There’s a really interesting section in the middle of the book which talks about expert performance and skill acquisition and how we can achieve this through deliberate practice.

My main takeaway here is that we have only mastered a skill once we can repeat the task with 95% reliability within one to three 45-90 minute sessions.

If we can’t achieve this then the typical reaction is to either give up or keep trying to achieve the goal for many more hours. Neither of these is considered a useful approach.

Instead we should realise that if we can’t do the skill it’s probably because there’s a small sub skill that we need to master first. So our next step is to break this skill down into its components, master those and then try the original skill again.

Amy Hoy’s ‘doing it backwards’ guide is very helpful for doing the skill breakdown as it makes you ask the question ‘can I do it tomorrow?’ – or is there something else that I need to do (learn) first?

I’ve been trying to apply this approach to my machine learning adventures which most recently has involved various topic modelling attempts on a How I met your mother data set.

I’d heard good things about the MALLET open source library but having never used it before sketched out the goals/skills I wanted to achieve:

Extract topics for HIMYM corpus ->
Train a topic model with mallet ->
Tweak an existing topic model that uses mallet ->
Run an existing topic model that uses mallet -> 
Install mallet

The idea is that you then start from the last action and work your way back up the chain – it should also act as a nice deterrent for yak shaving.

While learning about mallet I came across several more articles that I should read about topic modelling and while these don’t directly contribute to learning a skill I think they will give me good background to help pick up some of the intuition behind topic modelling.

My takeaway about gaining knowledge of a skill is that when we’re getting started we should spend more time on practical work rather than only reading; once we get further in we’ll naturally become more curious and do the background reading. I often find myself reading non-stop about things but never completely understanding them because I don’t go hands-on, so this was a good reminder.

One of the next things I’m working on is a similar skill break down for people learning Neo4j and then we’ll look to apply this to make our meetup sessions more effective – should be fun!

The other awesome thing about this book is that I’ve come away with a bunch of other books to add to my reading list as well.

In summary, if learning is your thing get yourself a copy of the book and read it over a few times – so many great tips, I’ve only covered a few.

Written by Mark Needham

March 20th, 2015 at 7:30 am

Posted in Books


The Hard Thing About Hard Things – Ben Horowitz: Book Review


I came across ‘The Hard Thing About Hard Things’ while reading an article about Ben Horowitz’s venture capital firm and it was intriguing enough that I bought it and then read through it over a couple of days.

Although the blurb suggests that it’s a book about building and running a startup, I think a lot of the lessons are applicable to any business.

These were some of the main points that stood out for me:

  • The Positivity Delusion – CEOs should tell it like it is.

    My single biggest improvement as CEO occurred on the day when I stopped being too positive.

    Horowitz suggests that he used to be too positive and would shield bad news from his employees as he thought he’d make the problem worse by transferring the burden onto them.

    He came to the realisation that this was counter productive since he often wasn’t the best placed person to fix a problem e.g. if it was a problem with the product then the engineering team needed to know so they could write the code to fix it.

    He goes on to suggest that…

    A healthy company culture encourages people to share bad news. A company that discusses its problems freely and openly can quickly solve them. A company that covers up its problems frustrates everyone involved.

    I’ve certainly worked on projects in the past where the view projected by the most senior person is overly positive and ignores problems that are obvious to everyone else. This eventually leaves people unsure whether to take them seriously, which isn’t a great situation to be in.

  • Lead Bullets – fix the problem, don’t run away from it.

    Horowitz describes a couple of situations where his products have been inferior to their competitors and it’s been tempting to take the easy way out by not fixing the product.

    There comes a time in every company’s life where it must fight for its life. If you find yourself running when you should be fighting, you need to ask yourself, “If our company isn’t good enough to win, then do we need to exist at all?”.

    I can’t think of any examples around this from my experience but I really like the advice – I’m sure it’ll come in handy in future.

  • Give ground grudgingly – dealing with the company increasing in size.

    Horowitz suggests that the following things become more difficult as a company grows in size:

    • Communication
    • Common Knowledge
    • Decision Making


    If the company doesn’t expand it will never be much…so the challenge is to grow but degrade as slowly as possible.

    He uses the metaphor of an offensive lineman in American football who has to stop onrushing defensive linemen by giving ground to them slowly, backing up a little at a time.

    I’ve worked in a few different companies now and noticed things become more structured (and in my eyes worse!) as the company grew over time, but I hadn’t really thought about why that was happening. The chapter on scaling a company does a decent job of explaining it.

  • The Law of Crappy People – people baseline against the worst person at a grade level.

    For any title level in a large organisation, the talent on that level will eventually converge to the crappiest person with that title.

    This is something that he’s also written about on his blog and certainly seems very recognisable.

    His suggestion for mitigating the problem is to have a “properly constructed and highly disciplined promotion process” in place. He describes this like so:

    When a manager wishes to promote an employee, she will submit that employee for review with an explanation of why she believes her employee satisfies the skill criteria required for the level.

    The committee should compare the employee to both the level’s skill description and the skills of the other employees at that level to determine whether or not to approve the promotion.

  • Hire people with the right kind of ambition

    The wrong kind of ambition is ambition for the executive’s personal success regardless of the company’s outcome.

    This suggestion comes from the chapter in which Horowitz discusses how to minimise politics in an organisation.

    I really like this idea but it seems like a difficult thing to judge/achieve. In my experience people often have their own goals which aren’t necessarily completely aligned with the company’s. Perhaps complete alignment isn’t as important unless you’re right at the top of the company?

    He also has quite a neat definition of politics:

    What do I mean by politics? I mean people advancing their careers or agendas by means other than merit and contribution.

    He goes on to describe a few stories of how political behaviour can subtly creep into a company without the CEO meaning for it to happen. This chapter was definitely eye opening for me.

There are some other interesting chapters on the best types of CEOs for different companies, when to hire Senior external people, product management and much more.

I realise that the things I’ve picked out are mostly a case of confirmation bias so I’m sure everyone will have different things that stand out for them.

Definitely worth a read.

Written by Mark Needham

October 13th, 2014 at 11:59 pm

Posted in Books


Functional Programming in Java – Venkat Subramaniam: Book Review


I picked up Venkat Subramaniam’s ‘Functional Programming in Java: Harnessing the Power of Java 8 Lambda Expressions’ to learn a little bit more about Java 8, having struggled to find any online tutorials which did that.

A big chunk of the book focuses on lambdas, functional collection parameters and lazy evaluation which will be familiar to users of C#, Clojure, Scala, Haskell, Ruby, Python, F# or libraries like totallylazy and Guava.

Although I was able to race through the book quite quickly it was still interesting to see how Java 8 is going to reduce the amount of code we need to write to do simple operations on collections.

I wrote up my thoughts on lambda expressions instead of auto closeable, using group by on collections and sorting values in collections in previous blog posts.

I noticed a couple of subtle differences in the method names added to collections, e.g. skip/limit are there instead of take/drop for grabbing a subset of a collection.

There are also methods such as ‘mapToInt’ and ‘mapToDouble’ where in other languages you’d just have a single ‘map’ and it would handle everything.
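As a minimal sketch of that difference (the class name and data here are invented for illustration), summing lengths needs ‘mapToInt’ rather than a plain ‘map’ because only the primitive IntStream exposes sum():

```java
import java.util.Arrays;
import java.util.List;

public class StreamMapToInt {
    // In Scala or C# a single 'map' followed by 'sum' would do; Java 8 needs
    // mapToInt to get an IntStream, whose sum()/average() work on primitives
    // without boxing.
    public static int totalLength(List<String> words) {
        return words.stream().mapToInt(String::length).sum();
    }

    public static void main(String[] args) {
        System.out.println(totalLength(Arrays.asList("Paul", "Mark", "Will")));
    }
}
```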

Over the last couple of years I’ve used totallylazy on Java projects to deal with collections and while I like the style of code it encourages, you end up with a lot of code due to all the anonymous classes you have to create.

In Java 8 lambdas are a first class concept which should make using totallylazy even better.

In a previous blog post I showed how you’d go about sorting a collection of people by age. In Java 8 it would look like this:

List<Person> people = Arrays.asList(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.stream().sorted(comparing(p -> p.getAge())).forEach(System.out::println);

I find the ‘comparing’ function that we have to use a bit unintuitive and this is what we’d have using totallylazy pre Java 8:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(new Callable1<Person, Integer>() {
    public Integer call(Person person) throws Exception {
        return person.getAge();
    }
});

Using Java 8 lambdas the code is much simplified:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(Person::getAge);

If we use ‘forEach’ to print out each person individually we end up with the following:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(Person::getAge).forEach((Consumer<? super Person>) System.out::println);

The compiler can’t work out whether we want to use the forEach method from totallylazy or from Iterable so we end up having to cast which is a bit nasty.

I haven’t yet tried converting the totallylazy code I’ve written but my thinking is that the real win of Java 8 will be making it easier to use libraries like totallylazy and Guava.

Overall the book describes Java 8’s features very well but if you’ve used any of the languages I mentioned at the top it will all be very familiar – finally Java has caught up with the rest!

Written by Mark Needham

March 23rd, 2014 at 9:18 pm

Posted in Books


Thinking Fast and Slow – Daniel Kahneman: Book Review


I picked up Daniel Kahneman’s ‘Thinking Fast and Slow’ after a recommendation by Mike Jones in early 2013 – it’s taken me quite a while to get through it.

The book starts by describing our two styles of thinking…

  • System 1 – operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 – allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

…and the next 30+ chapters describe situations where System 1 can get us into trouble.

The book focuses on different cognitive biases that humans get fooled by and there was a spell of about 100 pages starting around chapter 4 where it started to feel repetitive and I felt like putting the book down.

That said, I kept going and enjoyed the remaining 2/3 of the book much more.

These were some of my favourite parts:

  • Outcome bias – this bias describes the tendency to build a story around events after we already know the outcome and come to the conclusion that the outcome was inevitable. For example, many people claim that they ‘knew well before it happened that the 2008 financial crisis was inevitable’.

    Kahneman points out that we only remember the intuitions which turned out to be true; no one talks about those which turned out to be false. He also cautions us to consider the role that luck plays in any success.

  • Intuitions vs Formulas – Kahneman quotes Paul Meehl who suggests that most of the time human decision makers are inferior to a prediction formula even when they are given the score suggested by the formula.

    In the world of software this reliance on intuition is often used when interviewing people for jobs and choosing who to hire based on an intuitive judgement. Kahneman suggests making a list of about 6 traits that are pre-requisites for the position and then ranking each candidate against those. The candidate who scores the highest is the best choice.

  • Over reliance on representativeness – often we judge the probability of something based on stereotypes rather than taking the base rate of the population into account.

    e.g. if we saw a person reading The New York Times on the subway which of the following is a better bet about this stranger?

    • She has a PhD.
    • She does not have a college degree.

    If we judge based on representativeness we’ll bet on the PhD because our stereotypes tell us that someone with a PhD is more likely to be reading The New York Times than a person without a college degree. However, there are many more non-graduates on the subway than PhDs, so the likelihood is that the person doesn’t have a college degree.

    Kahneman encourages us to take the base rates of the population into account even if we have evidence about the case at hand. He uses the example of predicting how long a project will take and suggests using the data from previous similar projects as a baseline and then adjusting that based on our specific case.

  • Framing – Kahneman talks about the impact that the way information is presented to us can have on the way we react to it. For example, he describes a study in which the merits of surgery for a condition are considered and the descriptions of the surgery are like so:
    • The one month survival rate is 90%.
    • There is 10% mortality in the first month.

    People who were presented with the first option were much more likely to favour surgery than those shown the second option.

  • The default option – people feel much more regret if they deviate from the default choice and something goes wrong than if they stick with the default and something equally bad happens. This effect is well known to form designers, who make the option they want people to select the default.
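The representativeness example above becomes obvious once the base rates are written down. With made-up but plausible numbers – say 2% of riders have PhDs and 50% have no college degree, and a PhD holder is ten times as likely to be reading the Times (40% vs 4%) – the non-graduate is still the better bet:

```latex
\begin{aligned}
P(\text{PhD and Times reader}) &= 0.02 \times 0.40 = 0.008 \\
P(\text{no degree and Times reader}) &= 0.50 \times 0.04 = 0.020
\end{aligned}
```

The stereotype multiplies the probability by ten, but the base rate divides it by twenty-five – and the base rate wins.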

There are many more insights and these are just a few that stood out for me.

Overall I enjoyed reading the book and I feel it may have the same impact as Malcolm Gladwell’s Blink and Outliers by making you think about situations differently than you may have done before.

I’d encourage you to read the book if you find human decision making interesting although I’d try and forget everything I’ve said as you’ve probably been inadvertently primed by me.

Written by Mark Needham

October 27th, 2013 at 10:53 pm

Posted in Books


On Writing Well – William Zinsser: Book Review


I first came across William Zinsser’s ‘On Writing Well’ about a year ago, but put it down having flicked through a couple of the chapters that I felt were relevant.

It came back onto my radar a month ago and this time I decided to read it cover to cover as I was sure there were some insights that I’d missed due to my haphazard approach the first time around.

What stood out in my memory from my first reading of the book was the emphasis on how words like ‘a bit’, ‘a little’, ‘kind of’, ‘quite’, ‘pretty much’ and ‘too’ dilute our sentences, and I’ve been trying to avoid them in my writing ever since.

Other things that stood out for me this time were:

  • Avoid unnecessary words e.g. blared loudly or grinned widely. It’s difficult to blare in any other way, and grinning already implies that your mouth is open wide.
  • Delete troublesome phrases. If you’re having trouble working out the structure for a sentence, it might be beneficial to get rid of it and see if the rest of the paragraph still makes sense. Often it will.
  • Rewriting. The author emphasises over and over again the importance of rewriting a piece of text until you’re happy with it. This consists mostly of reshaping and tightening previous drafts. The way it’s described sounds very similar to code refactoring. My colleague Ian Robinson recommended ‘Revising Prose‘ as a useful book to read next in this area.
  • Assume the reader knows nothing. The advice in the chapter on science and technology was probably the most applicable for the type of writing I do and I thought this section was particularly apt:

    Describing how a process works is valuable for two reasons. It forces you to make sure that you know how it works. Then it forces you to take the reader through the same sequence of ideas and deductions that made the process clear to you.

    I’ve found this to be the case multiple times although you can achieve the same benefits by presenting a talk on a topic; the benefits aren’t unique to writing.

  • Explain your judgements. Don’t say something is interesting. Instead explain what makes it interesting and let the reader decide whether or not it deserves that label.
  • Use simple words, be specific, be human and use active verbs. I often fall into the trap of using passive verbs which makes it difficult for the reader to know which part of the sentence they apply to. We want to minimise the amount of translation the reader has to do to understand our writing.
  • Use a paragraph as a logical unit. I remember learning this at secondary school but I’ve drifted towards a style of writing that treats the sentence as a logical block. My intuition tells me that people find it easier to read text when it’s in smaller chunks but I will experiment with grouping my ideas together in paragraphs where that seems sensible.

I gleaned these insights mostly from the first half of the book.

The second half focused on different forms of writing and showed how to apply the lessons from earlier in the book. Although not all the forms were applicable to me I still found it interesting to read as the author has a nice way with words and you want to keep reading the next sentence.

My main concern having read the book is ensuring that I don’t paralyse my ability to finish blog posts by rewriting ad infinitum.

Written by Mark Needham

September 30th, 2013 at 10:48 pm

Posted in Books


9 algorithms that changed the future – John MacCormick: Book Review


The Book

9 algorithms that changed the future (the ingenious ideas that drive today’s computers) by John MacCormick

My Thoughts

I came across this book while idly browsing a book store and since I’ve found most introduction to algorithms books very dry I thought it’d be interesting to see what one aimed at the general public would be like.

Overall it was an enjoyable read and I quite like the pattern that the author used for each algorithm, which was:

  • Describe the problem that it’s needed for.
  • Explain a simplified version of the algorithm or use a metaphor to give the general outline.
  • Explain which bits were simplified and how the real version addresses those simplifications.

The first step is often missed out in algorithms books which is a mistake for people like me who become more interested in a subject once a practical use case is explained.

Although the title claims 9 algorithms, I counted 8 which made the cut.

I enjoyed the book and I’ve got some interesting articles/papers to add to my reading list. Even if you already know all the algorithms I think it’s interesting to hear them described from a completely different angle to see if you learn something new.

Written by Mark Needham

August 13th, 2013 at 8:00 pm

Posted in Books


Book Review: The Signal and the Noise – Nate Silver


Nate Silver is famous for having correctly predicted the winner in all 50 states in the 2012 United States presidential election, and Sid recommended his book so I could learn more about statistics for the A/B tests that we were running.

I thought the book was a really good introduction to applied statistics and by using real life examples which most people would be able to relate to it makes a potentially dull subject interesting.

Reasonably early on the author points out that there’s a difference between making a prediction and making a forecast:

  • Prediction – a definitive and specific statement about when and where something will happen e.g. a major earthquake will hit Kyoto, Japan, on June 28.
  • Forecast – a probabilistic statement over a longer time scale e.g. there is a 60% chance of an earthquake in Southern California over the next 30 years.

The book mainly focuses on the latter.

We then move onto quite an interesting section about over fitting which is where we mistake noise for signal in our data.

I first came across this term when Jen and I were working through one of the Kaggle problems and were using a random forest of deliberately over-fitted decision trees to do digit recognition.

It’s not a problem when we combine lots of decision trees together and use a majority wins algorithm to make our prediction but if we use just one of them its predictions on any new data will be completely wrong.

Later on in the book he points out that a lot of conspiracy theories arise when we look at data retrospectively – it’s easy to separate signal from noise in hindsight when at the time it was much more difficult.

He also points out that sometimes there isn’t actually any signal, it’s all noise, and we can fall into the trap of looking for something that isn’t there. I think this ‘noise’ is what we’d refer to as random variation in the context of an A/B test.

Silver also encourages us to make sure that we understand the theory behind any inference we make:

Statistical inferences are much stronger when backed up by theory or at least some deeper thinking about their root causes.

When we were running A/B tests Sid encouraged people to think whether a theory about why conversion had changed made logical sense before assuming it was true which I think covers similar ground.

A big chunk of the book covers Bayes’ theorem and how often when we’re making forecasts we have prior beliefs which it forces us to make explicit.

For example there is a section which talks about the probability a lady is being cheated on given that she’s found some underwear that she doesn’t recognise in her house.

In order to work out the probability she’s being cheated on we need to know the probability that she was being cheated on before she found the underwear. Silver suggests that since 4% of married partners cheat on their spouses that would be a good number to use.
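Spelling out the arithmetic with Bayes’ theorem: using that 4% prior and illustrative likelihoods (say a 50% chance of the underwear turning up if he is cheating and a 5% chance if he isn’t – numbers in the spirit of Silver’s example rather than quoted from it), the posterior is:

```latex
P(\text{cheating} \mid \text{underwear})
  = \frac{0.50 \times 0.04}{0.50 \times 0.04 + 0.05 \times 0.96}
  = \frac{0.02}{0.068} \approx 0.29
```

Even fairly damning evidence only lifts the probability to about 29% because the prior was low – which is exactly the intuition the chapter is trying to build.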

He then goes on to show multiple other problems throughout the book that we can apply Bayes’ theorem to.

Some other interesting things I picked up are that if we’re good at forecasting then being given more information should make our forecast better and that when we don’t have any special information we’re better off following the opinion of the crowd.


Silver also showed a clever trick for inferring data points on a data set which follows a power law i.e. the long tail distribution where there are very few massive events but lots of really small ones.

The number of terrorist attacks vs the number of fatalities follows a power law, and if we change both scales to be logarithmic we can estimate how likely more deadly attacks are.
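Silver doesn’t give any code, but the log-log trick can be sketched in a few lines (the data here is made up to follow an exact power law, purely for illustration):

```java
public class PowerLawFit {
    // Least-squares slope of log10(y) vs log10(x): for a power law y = c * x^k
    // the points fall on a straight line with slope k in log-log space, so the
    // fitted line lets us extrapolate to rare, very deadly events.
    public static double logLogSlope(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            double lx = Math.log10(x[i]);
            double ly = Math.log10(y[i]);
            sx += lx;
            sy += ly;
            sxx += lx * lx;
            sxy += lx * ly;
        }
        return (n * sxy - sx * sy) / (n * sxx - sx * sx);
    }

    public static void main(String[] args) {
        // Made-up counts: number of attacks with at least d fatalities.
        double[] deaths = {1, 10, 100, 1000};
        double[] attacks = {10000, 1000, 100, 10};
        double slope = logLogSlope(deaths, attacks);
        // Extrapolate how many attacks of >= 10000 fatalities the fit implies.
        double intercept = Math.log10(attacks[0]) - slope * Math.log10(deaths[0]);
        double predicted = Math.pow(10, intercept + slope * Math.log10(10000));
        System.out.println("slope = " + slope + ", predicted = " + predicted);
    }
}
```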

There is then some discussion of how changes in the way we treat terrorism could alter the shape of the chart, e.g. Silver suggests that Israel really wants to avoid a very deadly attack, even at the expense of more smaller attacks.

A lot of the book is spent discussing weather/earthquake forecasting which is very interesting to read about but I couldn’t quite see a link back to the software context.

Overall though I found it an interesting read although there are probably a few places that you can skim over the detail and still get the gist of what he’s saying.

Written by Mark Needham

May 14th, 2013 at 12:16 am

Posted in Books


Book Review: The Retrospective Handbook – Pat Kua


My colleague Pat Kua recently published a book he’s been working on for the first half of the year titled ‘The Retrospective Handbook’ – a book in which Pat shares his experiences with retrospectives and gives advice to budding facilitators.

I was intrigued what the book would be like because the skill gap between Pat and me with respect to facilitating retrospectives is huge and I’ve often found that experts in a subject can have a tendency to be a bit preachy when writing about their subject!

In actual fact Pat has done a great job making the topic accessible to all skill levels and several times covers typical problems with retrospectives before describing possible solutions.

These were some of the things that I took away:

  • One of the most interesting parts of the book was a section titled ‘Be Aware of Cultural Dimensions’ where Pat covers some of the different challenges we have when people from different cultures work together.

    I found the power distance index (PDI) especially interesting:

    The extent to which the less powerful members of organisations and institutions accept and expect that power is distributed unequally

    If you come from a culture with a low PDI you’re more likely to challenge something someone else said regardless of their role but if you’re from a culture with a high PDI you probably won’t say anything.

    The US/UK tend to have low PDI whereas India has a high PDI – something I found fascinating when participating in retrospectives in India in 2010/2011. I think the facilitator needs to be aware of this otherwise they might make someone very uncomfortable by pushing them too hard to share their opinion.

  • A theme across the book is that retrospectives aren’t about the facilitator – the facilitator’s role is to help guide the team through the process and keep things moving, they shouldn’t be the focal point. In my opinion if a facilitator is doing that well then they’d be almost invisible much like a football referee when they’re having a good game!
  • The ‘First-Time Facilitation Tips’ chapter is particularly worth reading and reminded me that part of the facilitator’s role is to encourage equal participation from the group:

    A common, shared picture is only possible if all participants give their input freely and share their view of the story. This is difficult if one or two people are allowed to dominate discussions. Part of your role as a facilitator is to use whatever techniques you can to ensure a balanced conversation occurs.

    I think this is much easier for an external facilitator to do as they won’t have the burden of inter team politics/hierarchy to deal with.

    Pat goes on to suggest splitting the group into smaller groups as one technique to get people involved – from my experience this works really well and gets around the problem that many people aren’t comfortable discussing things in big groups.

  • There’s nothing more boring than doing the same retrospective week after week, nor is there a quicker way to completely put people off them, so I was pleased to see that Pat dedicated a chapter to keeping retrospectives fresh.

    He suggests a variety of different techniques including bringing food or running the retrospective in a different location to normal to keep it interesting. I’ve heard of colleagues in Brazil doing their retrospectives outside which is another angle on this theme!

  • Another good tip is that when creating actions we don’t need to spend time getting someone to sign up for them right there and then – an alternative is to encourage people to walk the wall and pick ones they feel they can take care of.

I think this book complements Esther Derby/Diana Larsen’s ‘Agile Retrospectives’ really well.

I find their book really useful for finding exercises to use in retrospectives to keep it interesting whereas Pat’s book is more about the challenges you’re likely to face during the retrospective itself.

There’s lots of other useful tips and tricks in the book – these are just a few of the ones that stood out for me – it’s well worth a read if you’re a participant/facilitator in retrospectives on your team.

Written by Mark Needham

August 31st, 2012 at 9:18 pm

Posted in Books


The Lean Startup: Book Review

with 5 comments

I’d heard about The Lean Startup for a long time before I actually read it, mainly from following the ‘Startup Lessons Learned’ blog, but I didn’t get the book until a colleague suggested a meetup to discuss how we might apply the ideas on our projects.

My general learning from the book is that we need to take the idea of creating tight feedback loops, which we’ve learnt in the agile/lean worlds, and apply it to product development.

Eric Ries talks about the idea of the Minimum Viable Product (MVP), which is something that I’ve heard mentioned a lot in the last few projects I’ve worked on, so I thought I knew what it meant.

I’d always considered the MVP to effectively be the first release of any product you were building, but Ries frames it as the minimum product you can release to get feedback on whether your idea is viable or not. For example, Dropbox’s MVP was a video demonstrating how the product would work once the team had written the code to sync files on all operating systems.

On a lot of the projects that I’ve worked on we start after the point at which the business has decided what their product vision is and we’re responsible for implementing it. They haven’t necessarily then gone on to make a big return from building the product, which I’ve always found strange.

The most frequent argument I’ve heard against releasing an ‘incomplete’ product early in the organisations that I’ve worked for is that it could ruin their brand if they took this approach. One suggestion the book makes is to release the product under a spin-off subsidiary if we’re worried about that.

The book also discussed the ways that we need to treat early adopters of a product and mainstream customers differently.

For example early adopters won’t mind/may actually prefer to play with an unfinished product if they can help influence its future direction.

By the time we have proved that we have a viable product and are looking to aim it at the mainstream market it will need to be more feature complete and polished in order to please that crowd.

There is a big focus on making data driven decisions such that we gather metrics showing how our product is actually being used by customers rather than just guessing/going on intuition as to what we should be doing next.

Facebook released an interesting video in which, towards the end, they describe the metrics they gather around their application, which let them tell whether a deployment is losing them money and therefore needs to be rolled back.

One particular thing that the book talks about is cohort analysis:

A cohort analysis is a tool that helps measure user engagement over time. It helps UX designers know whether user engagement is actually getting better over time or is only appearing to improve because of growth.

We tend to use metrics to help us see the quality of code and which things we might want to work on there but I think the idea of using it to measure user engagement is really cool and should help us to build a more useful product.
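The cohort analysis idea described above can be sketched with a few lines of code: group users by the week they signed up, then measure what fraction of each cohort is still active some number of weeks later. This is a minimal illustration, not anything from the book – the event records and field names here are entirely made up.

```python
# Minimal cohort analysis sketch: group users by signup week and
# measure what fraction of each cohort was active in later weeks.
# The event data below is hypothetical, purely for illustration.
from collections import defaultdict

# (user_id, signup_week, week_active) records
events = [
    ("u1", 0, 0), ("u1", 0, 1), ("u1", 0, 2),
    ("u2", 0, 0), ("u2", 0, 1),
    ("u3", 1, 1), ("u3", 1, 2),
    ("u4", 1, 1),
]

cohorts = defaultdict(set)   # signup_week -> set of users in that cohort
active = defaultdict(set)    # (signup_week, weeks_since_signup) -> active users

for user, signup_week, week in events:
    cohorts[signup_week].add(user)
    active[(signup_week, week - signup_week)].add(user)

def retention(signup_week, weeks_since_signup):
    """Fraction of a cohort still active n weeks after signing up."""
    return len(active[(signup_week, weeks_since_signup)]) / len(cohorts[signup_week])

# Week-0 cohort: both users active in week 1, only one in week 2
print(retention(0, 1))  # 1.0
print(retention(0, 2))  # 0.5
```

Comparing the same retention curve across successive cohorts is what separates genuine improvements in engagement from growth simply masking churn.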

I especially enjoyed the parts of the book where Ries talks about ways that some of the ideas have been applied at startups which are doing well at the moment, although I think it’d be fair to say that the lean startup framework has been retrospectively fitted to explain these stories.

I think the danger of believing that they were following lean startup principles is that it can lead to us not thinking through problems ourselves, which I guess is the same problem with any framework/methodology.

I’m intrigued as to whether it will make a difference to the overall success rate of startups or not if they follow the ideas from the book.

I imagine we’ll see some ideas failing much more quickly than they might have otherwise, and the suggestion is that when this happens we need to pivot and try to find another approach that will make money. Despite that, there will come a point when the startup runs out of money without finding a way to monetise its product and it’ll be game over.

Overall the book is quite easy reading and worth a flick through as it has some cool ideas which can help us to spend less time building products which don’t actually get used.

Written by Mark Needham

December 18th, 2011 at 9:00 pm

Posted in Books


Discussing the Undiscussable: Book Review

without comments

I came across the work of Chris Argyris at the start of the year and in a twitter conversation with Benjamin Mitchell he suggested that Bill Noonan’s ‘Discussing the Undiscussable‘ was the most accessible text for someone new to the subject.

In the book Noonan runs through a series of different tools that Chris Argyris originally came up with for helping people to handle difficult conversational situations more effectively.

I really like the way the book is written.

A lot of books of this ilk come across to me as being very idealistic but Noonan avoids that by describing his own mistakes in trying to implement Argyris’ ideas. This makes the book much more accessible to me.

He also repeatedly points out that even though you might understand the tools, that doesn’t mean you’ll be an expert in using them unless you spend a significant amount of time practising.

These were some of the ideas that stood out for me from my reading over the last few months:

  • Advocacy/Inquiry – Noonan suggests that when we’re discussing a topic it’s important to advocate our opinion but also be open to people challenging it so that we can learn if there are any gaps in our understanding or anything that we’re missing.

    This seems quite similar to Bob Sutton’s ‘Strong Opinions, Weakly Held‘ which I’ve come across several times in the past.

    One anti-pattern which comes from not doing this is known as ‘easing in‘, where we try to get the other person to advocate our opinion through the use of various leading questions.

    The problem is that they tend to know exactly what we’re doing and it can come across as being quite manipulative.

  • The Ladder of Inference – I’ve written about this previously and it describes the way that humans tend to very quickly draw conclusions about other people based on fairly minimal data and without even talking to the other person first!

    When Jim and I worked together at ThoughtWorks University we were constantly pointing out when the other was climbing the inference ladder and it was quite surprising to me how often you end up doing so even when you don’t realise it!

    What I find most interesting is that even when I was absolutely sure that my inference about a situation was correct it was still frequently wrong when I discussed it with the other person. They nearly always had a different perception of what was going on than I did.

    I think it’s a step too far to believe that I won’t ever climb the inference ladder again but it’s useful to know how frequently I do it so at least I’m aware that I might need to climb down from time to time.

  • Recovery time – There is constant reference throughout the book to our recovery time, i.e. how quickly we realise that we’ve made a mistake by participating in a defensive routine.

    Argyris’ tools are quite useful for helping us to reduce our recovery time because they are reflective in nature and when we reflect on a situation we tend to see where we’ve gone wrong!

    Noonan suggests that it’s inevitable we’ll make mistakes but the key is to try and detect our mistakes sooner and then hopefully reduce the number that we make.

Of course there are several other tools that Noonan describes, such as the left-hand/right-hand column case study approach, double-loop learning, espoused theory vs theory-in-use and the mutual learning model.

I still make loads of the mistakes that the book points out, and I’ve noticed that I only really reflect on how my conversations are going when I’ve flicked through the book relatively recently.

It’s also useful to be hanging around other people who are studying Argyris’ work as you can then help each other out.

One of the initial books that Chris Argyris published describing these tools was ‘Action Science‘ (available as a free PDF).

I initially tried reading that before this book but found it a bit hard to follow; I’ll probably try it again at some stage.

Written by Mark Needham

May 7th, 2011 at 12:45 am

Posted in Books
