Mark Needham

Thoughts on Software Development

Archive for the ‘book-review’ tag

Functional Programming in Java – Venkat Subramaniam: Book Review

with one comment

I picked up Venkat Subramaniam’s ‘Functional Programming in Java: Harnessing the Power of Java 8 Lambda Expressions‘ to learn a little bit more about Java 8 having struggled to find any online tutorials which did that.

A big chunk of the book focuses on lambdas, functional collection parameters and lazy evaluation which will be familiar to users of C#, Clojure, Scala, Haskell, Ruby, Python, F# or libraries like totallylazy and Guava.

Although I was able to race through the book quite quickly it was still interesting to see how Java 8 is going to reduce the amount of code we need to write to do simple operations on collections.

I wrote up my thoughts on lambda expressions instead of auto closeable, using group by on collections and sorting values in collections in previous blog posts.

I noticed a couple of subtle differences in the method names added to collections, e.g. skip/limit are there instead of take/drop for grabbing a subset of a collection.
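To make the naming difference concrete, here's a small sketch using Java 8 streams (the helper name and numbers are my own):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class SkipLimitDemo {
    // skip(n) drops the first n elements and limit(n) keeps at most n --
    // the equivalent of drop/take in Scala, Clojure or totallylazy
    static List<Integer> middleTwo(List<Integer> numbers) {
        return numbers.stream().skip(1).limit(2).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(middleTwo(Arrays.asList(1, 2, 3, 4, 5))); // [2, 3]
    }
}
```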

There are also methods such as ‘mapToInt’ and ‘mapToDouble’ where in other languages you’d just have a single ‘map’ and it would handle everything.
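The split shows up as soon as you want numeric operations like sum; a minimal sketch (example of my own) is:

```java
import java.util.Arrays;
import java.util.List;

public class MapToIntDemo {
    // mapToInt yields an IntStream of primitives, which has sum(), average() etc.;
    // a plain map would give a boxed Stream<Integer> without those methods
    static int totalLength(List<String> words) {
        return words.stream().mapToInt(String::length).sum();
    }

    public static void main(String[] args) {
        System.out.println(totalLength(Arrays.asList("Paul", "Mark", "Will"))); // 12
    }
}
```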

Over the last couple of years I’ve used totallylazy on Java projects to deal with collections, and while I like the style of code it encourages, you end up with a lot of code due to all the anonymous classes you have to create.

In Java 8 lambdas are a first class concept which should make using totallylazy even better.

In a previous blog post I showed how you’d go about sorting a collection of people by age. In Java 8 it would look like this:

List<Person> people = Arrays.asList(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
// 'comparing' is statically imported from java.util.Comparator
people.stream().sorted(comparing(p -> p.getAge())).forEach(System.out::println);

I find the ‘comparing’ function that we have to use a bit unintuitive and this is what we’d have using totallylazy pre Java 8:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(new Callable1<Person, Integer>() {
    public Integer call(Person person) throws Exception {
        return person.getAge();
    }
});

Using Java 8 lambdas the code is much simplified:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(person -> person.getAge());

If we use ‘forEach’ to print out each person individually we end up with the following:

Sequence<Person> people = sequence(new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
people.sortBy(Person::getAge).forEach((Consumer<? super Person>) System.out::println);

The compiler can’t work out whether we want to use the forEach method from totallylazy or from Iterable so we end up having to cast which is a bit nasty.
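For comparison, the JDK’s own streams have no such clash since only one forEach is in play; here is a sketch of the same pipeline (the Person class is a stand-in of my own):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class SortDemo {
    static class Person {
        private final String name;
        private final int age;
        Person(String name, int age) { this.name = name; this.age = age; }
        String getName() { return name; }
        int getAge() { return age; }
    }

    // sort by age and collect the names -- no cast needed anywhere
    static List<String> namesByAge(List<Person> people) {
        return people.stream()
                     .sorted(Comparator.comparing(Person::getAge))
                     .map(Person::getName)
                     .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Person> people = Arrays.asList(
            new Person("Paul", 24), new Person("Mark", 30), new Person("Will", 28));
        System.out.println(namesByAge(people)); // [Paul, Will, Mark]
    }
}
```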

I haven’t yet tried converting the totallylazy code I’ve written but my thinking is that the real win of Java 8 will be making it easier to use libraries like totallylazy and Guava.

Overall the book describes Java 8’s features very well but if you’ve used any of the languages I mentioned at the top it will all be very familiar – finally Java has caught up with the rest!

Written by Mark Needham

March 23rd, 2014 at 9:18 pm

Posted in Books


Thinking Fast and Slow – Daniel Kahneman: Book Review

with 2 comments

I picked up Daniel Kahneman’s ‘Thinking Fast and Slow‘ after a recommendation by Mike Jones in early 2013 – it’s taken me quite a while to get through it.

The book starts by describing our two styles of thinking…

  • System 1 – operates automatically and quickly, with little or no effort and no sense of voluntary control.
  • System 2 – allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

…and the next 30+ chapters describe situations where System 1 can get us into trouble.

The book focuses on different cognitive biases that humans get fooled by and there was a spell of about 100 pages starting around chapter 4 where it started to feel repetitive and I felt like putting the book down.

That said, I kept going and enjoyed the remaining 2/3 of the book much more.

These were some of my favourite parts:

  • Outcome bias – this bias describes the tendency to build a story around events after we already know the outcome and come to the conclusion that the outcome was inevitable. For example, many people claim that they ‘knew well before it happened that the 2008 financial crisis was inevitable’.

    Kahneman points out that we only remember the intuitions which turned out to be true – no one talks about the intuitions which turned out to be false. He also cautions us to consider the role that luck plays in any success.

  • Intuitions vs Formulas – Kahneman quotes Paul Meehl who suggests that most of the time human decision makers are inferior to a prediction formula even when they are given the score suggested by the formula.

    In the world of software this reliance on intuition is often used when interviewing people for jobs and choosing who to hire based on an intuitive judgement. Kahneman suggests making a list of about 6 traits that are pre-requisites for the position and then ranking each candidate against those. The candidate who scores the highest is the best choice.

  • Over reliance on representativeness – often we judge the probability of something based on stereotypes rather than taking the base rate of the population into account.

    e.g. if we saw a person reading The New York Times on the subway which of the following is a better bet about this stranger?

    • She has a PhD.
    • She does not have a college degree.

    If we judge based on representativeness we’ll bet on the PhD because our stereotypes tell us that someone with a PhD is more likely to be reading the New York Times than a person without a college degree. However, there are many more non-graduates on the subway than PhDs, so the likelihood is that the person doesn’t have a college degree.

    Kahneman encourages us to take the base rates of the population into account even if we have evidence about the case at hand. He uses the example of predicting how long a project will take and suggests using the data from previous similar projects as a baseline and then adjusting that based on our specific case.

  • Framing – Kahneman talks about the impact that the way information is presented to us can have on the way we react to it. For example, he describes a study in which the merits of surgery for a condition are considered and the descriptions of the surgery are like so:
    • The one month survival rate is 90%.
    • There is 10% mortality in the first month.

    People who were presented with the first option were much more likely to favour surgery than those shown the second option.

  • The default option – people feel much more regret if they deviate from the normal choice and something goes wrong than if they stick with the default and something equally bad happens. This effect is well known to form designers, who make the option they want people to select the default.
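The subway example above is really a Bayes’ rule calculation; here is a sketch in Java where every number is invented purely for illustration:

```java
public class BaseRateDemo {
    // posterior odds = prior odds * likelihood ratio (Bayes' rule in odds form)
    static double posteriorOdds(double pPhd, double pNoDegree, double likelihoodRatio) {
        return (pPhd / pNoDegree) * likelihoodRatio;
    }

    public static void main(String[] args) {
        // invented base rates: 0.5% of riders hold a PhD, 30% have no degree;
        // assume a PhD holder is 10x as likely to be reading the NYT
        double odds = posteriorOdds(0.005, 0.30, 10.0);
        System.out.println(odds); // ~0.17 -- still well under 1, so bet on "no college degree"
    }
}
```

Even a tenfold likelihood boost can’t overcome a sixtyfold difference in base rates, which is exactly Kahneman’s point.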

There are many more insights and these are just a few that stood out for me.

Overall I enjoyed reading the book and I feel it may have the same impact as Malcolm Gladwell’s Blink and Outliers by making you think about situations differently than you may have done before.

I’d encourage you to read the book if you find human decision making interesting although I’d try and forget everything I’ve said as you’ve probably been inadvertently primed by me.

Written by Mark Needham

October 27th, 2013 at 10:53 pm

Posted in Books


On Writing Well – William Zinsser: Book Review

with 3 comments

I first came across William Zinsser’s ‘On Writing Well‘ about a year ago, but put it down having flicked through a couple of the chapters that I felt were relevant.

It came back onto my radar a month ago and this time I decided to read it cover to cover as I was sure there were some insights that I’d missed due to my haphazard approach the first time around.

What stood out in my memory from my first reading of the book was the emphasis on how words like ‘a bit’, ‘a little’, ‘kind of’, ‘quite’, ‘pretty much’ and ‘too’ dilute our sentences, and I’ve been trying to avoid them in my writing ever since.

Other things that stood out for me this time were:

  • Avoid unnecessary words e.g. blared loudly or grinned widely. It’s difficult to blare in any other way, and if you’re grinning it’s implied that your mouth is open wide.
  • Delete troublesome phrases. If you’re having trouble working out the structure for a sentence, it might be beneficial to get rid of it and see if the rest of the paragraph still makes sense. Often it will.
  • Rewriting. The author emphasises over and over again the importance of rewriting a piece of text until you’re happy with it. This consists mostly of reshaping and tightening previous drafts. The way it’s described sounds very similar to code refactoring. My colleague Ian Robinson recommended ‘Revising Prose‘ as a useful book to read next in this area.
  • Assume the reader knows nothing. The advice in the chapter on science and technology was probably the most applicable for the type of writing I do and I thought this section was particularly apt:

    Describing how a process works is valuable for two reasons. It forces you to make sure that you know how it works. Then it forces you to take the reader through the same sequence of ideas and deductions that made the process clear to you.

    I’ve found this to be the case multiple times although you can achieve the same benefits by presenting a talk on a topic; the benefits aren’t unique to writing.

  • Explain your judgements. Don’t say something is interesting. Instead explain what makes it interesting and let the reader decide whether or not it deserves that label.
  • Use simple words, be specific, be human and use active verbs. I often fall into the trap of using passive verbs, which makes it difficult for the reader to know which part of the sentence they apply to. We want to minimise the amount of translation the reader has to do to understand our writing.
  • Use a paragraph as a logical unit. I remember learning this at secondary school but I’ve drifted towards a style of writing that treats the sentence as a logical block. My intuition tells me that people find it easier to read text when it’s in smaller chunks but I will experiment with grouping my ideas together in paragraphs where that seems sensible.

I gleaned these insights mostly from the first half of the book.

The second half focused on different forms of writing and showed how to apply the lessons from earlier in the book. Although not all the forms were applicable to me I still found it interesting to read as the author has a nice way with words and you want to keep reading the next sentence.

My main concern having read the book is ensuring that I don’t paralyse my ability to finish blog posts by rewriting ad infinitum.

Written by Mark Needham

September 30th, 2013 at 10:48 pm

Posted in Books


Taiichi Ohno’s Workplace Management: Book Review

without comments

The Book

Taiichi Ohno’s Workplace Management by Taiichi Ohno

The Review

Having completed The Toyota Way a few weeks ago I was speaking with Jason about what books were good to read next – he recommended this one and The Toyota Way Fieldbook.

I struggled to see a connection to software development with a lot of what I read, but there were certainly words of wisdom that we can apply to continuously improve our ability to deliver projects.

What did I learn

  • Just in Time doesn’t mean exactly when the raw material is needed – it means just before it’s needed. This concept can certainly be applied in an agile software development process to ensure that story cards don’t spend too long in any column before moving to the next one. The reasoning in our case is that the knowledge behind the analysis or development of a card is at its greatest just as that stage completes.
  • If you make defects you have not worked – this is related to the idea of building quality into the process. You are not adding value if the work that you produce has defects in it. This is quite an interesting contrast to the more typical ‘hours worked’ approach whereby the productivity in these hours is not considered to be that important.
  • The job of team leaders is to make life on the gemba (i.e. shop floor) better. This has some similarities with the Tech Lead role on software projects where the person playing that role will spend a lot of their time reflecting on the development process and looking for ways to make it work better. This can be through driving pair rotation on a team, running analytics on the code to find areas of weakness, helping to set up test frameworks etc. Reflection on these types of things is the only way to drive improvement.
  • Stop defects moving through the system – this is achieved in agile by having story kickoffs and walkthroughs, the former to ensure that everyone is clear what is expected of a story and the latter to ensure that those criteria have been met. Catching defects early makes them much easier to fix since the story is still freshly in the head of the developers that worked on it.
  • Stop the line for defects – the idea here is to prevent defects from moving through the system, similar to the above point. In software I’d say this is more similar to not wanting code to be checked in on a red build, so that the problems can be fixed before the line continues, so to speak. It does seem a bit over the top to stop people checking in just because the build is red though; a better strategy is perhaps a team discipline of not checking in while that is the case.
  • Don’t automate for the sake of it – look for manual improvements in the process before deciding to automate something. I think this is quite interesting as automating processes in software development is not as costly as it would be on a production floor. One area where maybe there is more debate is automated acceptance testing using UI driven tests. These can often take ages to run as part of the build when there may in fact be better (also probably automated) ways of testing the same functionality. In this case perhaps recognising that there are options when it comes to automating is the key take away.
  • There were several mentions of standardising the approach which is probably more applicable to manufacturing than software development, although there are certainly areas, such as debugging, where a standardised approach would probably be more effective.

In Summary

This book is fairly short but it acts as a nice contrast to The Toyota Way and presents similar information in a slightly different way.

Written by Mark Needham

December 9th, 2008 at 12:14 am

Posted in Books
