Comments on Testing, and Improving our dev/design approach

First off, let me share a great deck about testing microservices, with recommendations about what to test at different levels (unit, integration, component, end-to-end, and manual/exploratory): http://martinfowler.com/articles/microservice-testing. This will be especially good when we start to have more independent services.

My very high-level observation is that I’m concerned about the testing approach (and the implied dev/design approach) that has led to the current codebase.

Since we’re taking an independent (micro)service approach to building OpenLMIS v3, we need to emphasize the *interactions and RESTful interfaces* between the different services over the implementations of the services themselves.

So, specifically, I would expect to see contracts driving the testing (and the code design/development), but it doesn’t feel like this is happening. For the API-level stories that are currently being played, we should shift our testing and story-sign-off approach to focus more on defining the expected RESTful APIs, and testing against them.

(This is a step towards consumer-driven contracts, which is an approach we should adopt once we have more services than just the first one.)
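To make the idea concrete, here is a rough sketch of a contract-style API test, not taken from the OpenLMIS codebase: the endpoint path, payload, and field names below are invented for illustration. It uses only the JDK, with a stub server standing in for the service, and pins down the status code, media type, and fields a consumer would rely on:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class ProgramContractSketch {
    public static void main(String[] args) throws Exception {
        // Stub standing in for the service; it honours the contract
        // we expect the real implementation to keep.
        HttpServer stub = HttpServer.create(new InetSocketAddress(0), 0);
        stub.createContext("/api/programs/123", exchange -> {
            byte[] body = "{\"id\":\"123\",\"name\":\"EPI\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        stub.start();

        // Consumer side of the contract: the expectations a client codes against.
        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder(URI.create("http://localhost:"
                        + stub.getAddress().getPort() + "/api/programs/123")).GET().build(),
                HttpResponse.BodyHandlers.ofString());
        stub.stop(0);

        if (response.statusCode() != 200) {
            throw new AssertionError("expected 200, got " + response.statusCode());
        }
        if (!response.headers().firstValue("Content-Type").orElse("").contains("application/json")) {
            throw new AssertionError("expected a JSON response");
        }
        if (!response.body().contains("\"id\":\"123\"")) {
            throw new AssertionError("expected the program id in the body");
        }
        System.out.println("contract holds: " + response.body());
    }
}
```

The same expectations, written against a consumer-maintained stub of each service we depend on, are essentially what consumer-driven contract tooling automates.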

I realize that people are already aware of this, and are making changes to address it, e.g. per this email (https://groups.google.com/d/msg/openlmis-dev/Ld453cGEGcQ/rDpP4rDlAAAJ), so I look forward to revisiting this topic in an iteration or two and seeing the fruits of this.

···

Darius Jazayeri, Principal Architect - Global Health
Email
djazayeri@thoughtworks.com

Telephone
+1 617 383 9369

ThoughtWorks

Hi Darius,

Thank you for the input; it looks valuable.

You are seeing more integration tests right now because we have been adding a lot of SDR, which generates code we can't unit test. The consumer-driven contract work is just getting started, and soon you should see more of it. Currently, considerable effort is going into adding RAML for the previously implemented APIs.
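(As an aside for readers new to RAML: documenting an already-implemented endpoint looks roughly like the fragment below. The resource and descriptions here are invented for illustration, not taken from the actual OpenLMIS RAML.)

```yaml
#%RAML 0.8
title: OpenLMIS Requisition Service
version: v1
/api/programs:
  /{id}:
    get:
      description: Returns the program with the given id.
      responses:
        200:
          body:
            application/json:
        404:
          description: No program exists with the given id.
```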

Regards,

Chris
···

On 23.07.2016 03:18, Darius Jazayeri wrote:

[…]

Peeking at the existing tests:

  • We have a tiny number of unit tests and a lot of integration tests (and I understand there's a lot of exploratory testing happening too). This is an antipattern: our testing pyramid is upside down.

  • The small number of unit tests may be okay if much of the code so far is DTOs and plumbing, and there's no real business logic to test yet.

  • The integration-test folder contains what's really a mix of different levels of the testing hierarchy, and I would recommend breaking these out.

  • We are heavy on web tests that effectively duplicate service tests. At the same time, the web tests are not readable at a glance, so they do a bad job of documenting themselves. I recommend (a) moving towards rest-assured for those web tests, and (b) for simple functionality, being more deliberate about testing it with either a component test or a web test, not both.

  • The one rest-assured test I saw here would look much nicer if we indented it the way they do in their examples.

A few things not related to testing:

  • The skip() and tryDelete() methods in RequisitionService return a boolean false where I would expect them to throw an exception. (Or perhaps this is a standard business case and not an exceptional one?)

  • OrderController.finalize() violates our style guide conventions to *Return a resource representation after a create/update* and to *Use a clear and consistent error payload*.

  • ProgramController does not return helpful error messages when it returns BAD_REQUEST (even though it just logged the helpful error message on the previous line). Also, peeking at this class: are we going to manually copy all properties from DTO to domain object like in [these lines](https://github.com/OpenLMIS/openlmis-requisition/blob/master/src/main/java/org/openlmis/referencedata/web/ProgramController.java#L40-L41)? This seems like a bad pattern. (Aren't our domain objects close enough to DTOs that we don't need this at all?)

  • In general we should probably introduce a utility class to encapsulate error messages the way we intend to send them over the wire, and be rigorous about using it.
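To make that last point concrete, here is a minimal sketch of what such a utility might look like. The class name, error code, and JSON shape are illustrative assumptions, not an agreed format:

```java
// Sketch: one class that defines the error shape we send over the wire,
// so controllers stop hand-rolling ad-hoc messages.
public class ErrorResponse {
    private final String code;     // stable and machine-readable, e.g. "PROGRAM_NOT_FOUND"
    private final String message;  // human-readable detail, safe to return to clients

    public ErrorResponse(String code, String message) {
        this.code = code;
        this.message = message;
    }

    public String getCode() { return code; }
    public String getMessage() { return message; }

    // Serialized by hand here to keep the sketch dependency-free; in the
    // service itself the JSON mapper would render the object directly.
    public String toJson() {
        return String.format("{\"code\":\"%s\",\"message\":\"%s\"}", code, message);
    }

    public static void main(String[] args) {
        ErrorResponse err = new ErrorResponse("PROGRAM_NOT_FOUND",
                "Program with id 123 does not exist");
        System.out.println(err.toJson());
    }
}
```

A controller would then build every non-2xx body through this one class, which also gives us a single place to attach error codes once we standardize them.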

      I know that some changes are already underway, so I'm looking forward to revisiting in an iteration or two.


  You received this message because you are subscribed to the Google Groups "OpenLMIS Dev" group.

  To unsubscribe from this group and stop receiving emails from it, send an email to openlmis-dev+unsubscribe@googlegroups.com.

  To post to this group, send email to openlmis-dev@googlegroups.com.

  To view this discussion on the web visit https://groups.google.com/d/msgid/openlmis-dev/CAOKb-R4-9LMCvA2ztxcf_kHgFctZioP_rjxFxtjh7p%2BghDwkvA%40mail.gmail.com. For more options, visit https://groups.google.com/d/optout.

Thanks Darius!

As Chris mentioned, the team is in the midst of adding documented APIs and tests to prove them out. Chris, team: how can we track and measure the coverage of these?

Sonar can provide some help here, but as noted in OLMIS-848, Sonar is not picking up some tests due to their location in the source tree, which supports Darius’s observations about the integration-test directory.

For the sprint starting 7-27, we should normalize our testing setup:

  • Fix OLMIS-848

  • Organize tests in folders by type (unit, api, etc.)

  • Focus on completing any outstanding web API tests
      ◦ Maybe have a team ILL member work with a ToP/AYIC member to jointly write or refactor some tests to hold up as good examples to follow.

  • Unit tests: as Darius mentioned, are we neglecting unit tests, or does the “plumbing” code dwarf the business logic code at this point?

  • Normalize error returns per the ProgramController discussion earlier in the thread. I can create a story, but need input from the team on what this should achieve in terms of a consistent error payload. I saw some code review chatter about error codes, so it seems we need to set some standards.
    

I will create issues for these, and would really like to hear additional input from the team on other specific actions.

Rich

