# Testing

# Why and what we test

We run tests to:

  • Ensure stability for all stakeholders
  • Save time otherwise spent on manual testing

We test the following:

  • All back-end calls, using feature tests
    • Cases with correct input and state
    • Cases with invalid input or state, asserting that the proper errors are returned
    • Essential onboarding / signup flows where state matters
  • Key methods (very selectively), using unit tests
  • Key customer-facing front-end experiences, using Cypress
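As a sketch of what an invalid-input feature test can look like, consider the following. The route, field, and table names are invented for illustration; they are not real project code.

```php
<?php

namespace Tests\Feature;

use Tests\TestCase;

class TaskValidationTest extends TestCase
{
    /** @test */
    public function submitting_a_task_without_a_title_returns_a_validation_error()
    {
        // Hypothetical route and payload, for illustration only.
        $response = $this->postJson('/api/tasks', [
            'title' => '', // invalid: assume title is required
        ]);

        // Laravel's validator rejects the request with a 422 and per-field errors.
        $response->assertStatus(422)
            ->assertJsonValidationErrors(['title']);

        // And no record should have been written to the database.
        $this->assertDatabaseMissing('tasks', ['title' => '']);
    }
}
```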

We do not test:

  • All front-end code, including Vue components

Note: Our Cypress setup is currently underdeveloped and needs work to further increase stability.

# How we write tests

By default we write PHP-based feature tests, using the defaults of Laravel 7.0+.

We create one test file per feature or group of features, e.g. "PerformanceManagementTest" would be one file covering everything related to the performance management view. If a test file contains more than 20 tests, there is a good chance it can be split up.

We write test names in snake_case and treat them as documentation, spelling out very clearly what the test verifies. Using snake_case also makes it clear that tests are different from normal production code. An example of a test method name: "a_requester_can_submit_a_task_with_guideline_reference_via_platform"
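For example, a feature test using this naming style might look like the following sketch. The factory, route, and column names are invented placeholders, not the real implementation.

```php
<?php

namespace Tests\Feature;

use App\User;
use Illuminate\Foundation\Testing\RefreshDatabase;
use Tests\TestCase;

class TaskSubmissionTest extends TestCase
{
    use RefreshDatabase;

    /** @test */
    public function a_requester_can_submit_a_task_with_guideline_reference_via_platform()
    {
        // Laravel 7-style global factory helper; model and route are placeholders.
        $requester = factory(User::class)->create();

        $response = $this->actingAs($requester)->postJson('/api/tasks', [
            'title' => 'Review the Q1 deck',
            'guideline_reference' => 'GL-42',
        ]);

        // Assert on both the response and the resulting database state.
        $response->assertStatus(201);
        $this->assertDatabaseHas('tasks', ['guideline_reference' => 'GL-42']);
    }
}
```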

We try to make as many meaningful assertions as possible in a given test. Also, we often check changes in the database in addition to the response from the back end.

For larger, more complex features we keep the test code clean by creating a private "default_success_handler" method that contains all the default assertions (e.g. status, JSON content, and database assertions) and a private "default_input" method that creates default valid input.
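A minimal sketch of that pattern, assuming a hypothetical goals endpoint: only the two helper names come from our convention; the route, fields, and table are invented for illustration.

```php
<?php

namespace Tests\Feature;

use Tests\TestCase;

class PerformanceManagementTest extends TestCase
{
    /** @test */
    public function a_manager_can_create_a_goal()
    {
        // Override only the fields this particular test cares about.
        $input = $this->default_input(['title' => 'Grow revenue 10%']);

        $response = $this->postJson('/api/goals', $input); // placeholder route

        $this->default_success_handler($response, $input);
    }

    // Builds a default set of valid input; tests override what they need.
    private function default_input(array $overrides = []): array
    {
        return array_merge([
            'title'  => 'Default goal',
            'status' => 'active',
        ], $overrides);
    }

    // Runs the default assertions: status, JSON content, and database state.
    private function default_success_handler($response, array $input): void
    {
        $response->assertStatus(201)
            ->assertJsonFragment(['title' => $input['title']]);

        $this->assertDatabaseHas('goals', ['title' => $input['title']]);
    }
}
```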

# Starting with tests - TDD

For any back-end work we follow TDD (test-driven development): we start by writing the relevant tests and then write the actual code to make those tests pass.

When writing the initial tests we try to cover the most common cases, but do not over-invest in every conceivable scenario, to keep things moving relatively fast.

Instead, we make sure to write regression tests as edge cases are identified during further testing.

# Testing UX, UI and front-end

We use Cypress.io for a few end-to-end tests on nomorePost, mainly focusing on the registration process and a few core views.

March 2022 note: We plan on improving this over the next 12 months.

In the absence of a fully covered end-to-end test setup, we leave it to the developer and the relevant stakeholders to test these aspects manually as part of the development process.

We believe that testing your own code is essential for ensuring stability.

For all new features it is essential to involve someone from the relevant team to review the result and the experience.

For now, Anders must review all new features before they are pushed to production for the first time.

# Test servers

To help test in a staging-like environment, we maintain a set of staging servers that can be used for developing new features or testing fixes.

Below is the list of test servers, by repo:

  • knowmore
  • nomorePost
  • powerpoint-addin
  • knowmore-front – note: we only have one test server for this repo, as it is rarely updated
  • nomore-front – note: we only have one test server for this repo, as it is rarely updated
  • knowmoreApi – note: we only have one test server for this repo, as it is rarely updated
  • ppt-app (C# based)
    • quality-checker.teamnomore.com (not managed via Forge, but via AWS Elastic Beanstalk)
    • note: we only have one test server for this repo, as it is only worked on by Anders for now

The servers can be managed via Forge – just use the links provided.