Tuesday, 29 May 2018

Layered Testing


Many software systems reach a scale where achieving adequate coverage of every feature via traditional manual testing becomes virtually impossible, at least within a reasonable timescale.

A strong suite of automated tests, coupled with the adoption of a shift-left mindset, can provide a solution to this problem. This approach means a system can be under almost constant testing, at least for the critical journeys and functionality without which users would be adversely affected.

As with most aspects of software engineering, it's imperative for this testing infrastructure to have a well-defined structure to avoid the evils of spaghetti code. Code quality shouldn't be something developers think about only when writing production code; it's just as important for maintaining good quality tests.

This structure starts by identifying the different categories of tests that you will write, what they are designed to achieve, and how they fit into the bigger picture.

Unit Tests

Unit tests are defined by having a single source of failure, namely the class that is being tested. This obviously discounts the tests themselves having bugs; whilst developers often gravitate towards wanting the tests themselves to be wrong when they fail, this isn't the case as often as we might like.

This single source of failure is maintained by mocking all functional dependencies of the class being tested. The distinction drawn here with functional dependencies is to avoid model classes and the like also having to be mocked: if a dependency offers functionality that could cause the test to fail, then it should be mocked.
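
As a minimal sketch of what this looks like in practice (the OrderService, PaymentGateway and Order names here are hypothetical, using Python's built-in unittest and unittest.mock):

import unittest
from unittest.mock import Mock


class Order:
    """A plain model class: it offers no functionality that could fail, so it is not mocked."""
    def __init__(self, order_id, amount):
        self.order_id = order_id
        self.amount = amount


class OrderService:
    """The class under test; its functional dependency is injected so the test can mock it."""
    def __init__(self, payment_gateway):
        self.payment_gateway = payment_gateway

    def place(self, order):
        return self.payment_gateway.charge(order.order_id, order.amount)


class OrderServiceTest(unittest.TestCase):
    def test_place_charges_the_payment_gateway(self):
        gateway = Mock()                    # the functional dependency is mocked
        gateway.charge.return_value = True
        service = OrderService(gateway)

        result = service.place(Order("A1", 9.99))   # the model class is used as-is

        self.assertTrue(result)
        gateway.charge.assert_called_once_with("A1", 9.99)


if __name__ == "__main__":
    unittest.main()

If this test fails, the only production code that can be at fault is OrderService itself.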

Unit tests should be run as frequently as possible, and at least as often as change is merged into a shared branch. For this to have value the tests must follow the FIRST principles: fast, independent, repeatable, self-validating and timely.

Unit tests are therefore your first line of defence against bugs being introduced into a code base, and represent the furthest left it is possible to introduce automated testing. Indeed, when following a Test Driven Development (TDD) methodology the tests exist even before the code they will be testing.

Integration Tests

Within a code base, classes do not in fact operate independently; they come together in groups to perform a purpose within your overall system. If unit testing fulfils the role of defending against bugs introduced inside a class's implementation, then integration testing should act as a line of defence against bugs introduced at the boundaries and interactions between classes.

By their very nature, integration tests don't have a single reason to fail: any class within the group being tested has the potential to cause a failure. Where our unit tests made extensive use of mocking to simulate functionality, our integration tests should include as much real functionality as possible.
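
A minimal sketch of the difference, again with hypothetical class names: here both the store and the service are real implementations, and nothing is mocked:

import unittest


class InMemoryAccountStore:
    """A real, if simple, implementation rather than a mock."""
    def __init__(self):
        self._balances = {}

    def balance(self, account_id):
        return self._balances.get(account_id, 0)

    def save(self, account_id, balance):
        self._balances[account_id] = balance


class TransferService:
    """Collaborates with the store; that collaboration is what we are testing."""
    def __init__(self, store):
        self.store = store

    def transfer(self, source, target, amount):
        if self.store.balance(source) < amount:
            raise ValueError("insufficient funds")
        self.store.save(source, self.store.balance(source) - amount)
        self.store.save(target, self.store.balance(target) + amount)


class TransferIntegrationTest(unittest.TestCase):
    def test_transfer_moves_funds_between_accounts(self):
        store = InMemoryAccountStore()
        store.save("alice", 100)
        service = TransferService(store)

        service.transfer("alice", "bob", 30)

        # A failure here could originate in either class, or in how they interact.
        self.assertEqual(store.balance("alice"), 70)
        self.assertEqual(store.balance("bob"), 30)


if __name__ == "__main__":
    unittest.main()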

This makes integration tests harder to debug, but that is a necessary price to pay to validate that the individual sub-systems of your code interact properly.

Whilst even the most ardent advocate of TDD wouldn't write integration tests prior to implementation, integration tests, like unit tests, should be fast and run frequently.

Automated UI Tests

In the majority of cases, software is provoked into performing some operation by input from a user. Implementing automated UI testing is an attempt to test functionality as that user, or at least as close an approximation of that user as possible.

As with integration tests, automated UI tests have multiple reasons to fail; in fact, they are likely to have as many reasons to fail as there are sources of bugs in the system being tested.

Although it is necessary to engineer an environment for these tests to run in, and not all functionality will lend itself to being tested in this way, these tests should contain virtually no mocking.
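
As an illustration, a browser-driven test might look something like the following sketch, which assumes Selenium WebDriver and an entirely hypothetical login page (the URL and element IDs are placeholders):

import unittest
from selenium import webdriver
from selenium.webdriver.common.by import By


class LoginJourneyTest(unittest.TestCase):
    """Drives a real browser against a deployed environment; nothing is mocked."""

    def setUp(self):
        self.driver = webdriver.Chrome()

    def tearDown(self):
        self.driver.quit()

    def test_user_can_log_in(self):
        driver = self.driver
        driver.get("https://test.example.com/login")   # hypothetical test environment
        driver.find_element(By.ID, "username").send_keys("test-user")
        driver.find_element(By.ID, "password").send_keys("a-password")
        driver.find_element(By.ID, "submit").click()
        self.assertIn("Dashboard", driver.title)


if __name__ == "__main__":
    unittest.main()

A failure here could be caused by the UI, the services behind it, the test environment, or the data it holds, which is exactly why these tests exercise so much of the system.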

Automated UI testing is never going to be as fast as integration or unit testing. For this reason UI tests should be structured so that they can easily be sub-divided based on what they are testing, allowing their execution to be targeted at the areas of change within a code base and at the critical paths that cannot be allowed to break.
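
One way to achieve this sub-division (a sketch assuming pytest, with hypothetical marker names) is to tag each test with the journey it covers and then select markers at run time:

import pytest


@pytest.mark.checkout
def test_guest_checkout_completes():
    ...  # drive the checkout journey


@pytest.mark.search
def test_search_returns_relevant_results():
    ...  # drive the search journey


# Markers can be registered in pytest.ini:
#
#   [pytest]
#   markers =
#       checkout: tests covering the checkout journey
#       search: tests covering search
#
# A change to payment code can then run only the relevant journeys:
#
#   pytest -m checkout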

They are likely to be run less frequently, but they serve as a health check on the code's current suitability for deployment. They are also workhorses for the mundane, freeing up the time of precious manual testers to concentrate on testing the more abstract qualities of the code.

These three areas by no means cover all the possible types of test you may wish to write, one notable exception being performance testing via Non-Functional Tests (NFTs). However, they do demonstrate how a well-thought-through testing architecture consists of many layers, each with a different purpose and objective.

Sunday, 13 May 2018

Technological Milestones



The writer Arthur C. Clarke postulated three laws, the third of which is "Any sufficiently advanced technology is indistinguishable from magic".

The users of technology and the people who engineer it are likely to have different views on the accuracy of that law: users will very often be enchanted by a technological advancement, whilst engineers understand its foibles and intricacies.

Technology often doesn't move at quite the blistering pace many believe it to; it tends to advance via small increments, while it is the ideas for how to utilise it that experience rapid advancement.

But sometimes giant leaps forward are made that mark milestones in what is possible and the magic that engineers can demonstrate.

Solid State

The fundamental building block of the entire modern world is the solid state transistor. The ability to fabricate these building blocks in ever smaller dimensions has driven the development of ever more powerful computers and all the other advancements in electronics we have witnessed.

The ability to use semiconductors to engineer structures like transistors evolved during the 1950s and 1960s and led to the development of the first transistorised computers.

The technology was further refined with the invention of the integrated circuit, or chip, and, following the trajectory described by Moore's Law, has driven the development of the modern world.

Moore's Law, named after engineer Gordon Moore, states that the number of transistors that can be fabricated in a given area doubles approximately every two years. 

The simple view of this is that the processing power of a computer doubles every two years. This proved to be true for the best part of forty years, with the rate only slowing in recent times.
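
To give a sense of that compounding, a rough worked example: Intel's 4004 microprocessor of 1971 contained roughly 2,300 transistors; twenty doublings across the following forty years gives 2,300 × 2^20 ≈ 2.4 billion, which is broadly the transistor count of the largest processors shipping around 2011.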

Solid state electronics truly is the genie that can't be put back in the bottle. 

The Internet and the World Wide Web

Solid state electronics gave us powerful computers that could accomplish many tasks. The next advancement came when we developed the technology to allow these machines to work together, breaking down the barriers around data and its uses.

Although the terms are often used interchangeably, the Internet and the World Wide Web have very different histories, separated by decades.

The history of the Internet, the ability to connect computers over large distances, dates back to American military development during the 1960s and 1970s.

The World Wide Web, the ability to organise and make available data over the network the Internet provides, has its origins in the 1980s and 1990s at the CERN research institute in Switzerland.

The pre-Web world is now almost unimaginable, even for those of us with memories from before websites, apps and social media.

The ability to connect, share and interact on a global scale has changed the world forever; never has so much data, covering so many topics, been available to so many people.

An interesting side note is that the protocols that control the movement of all this data, such as TCP/IP or HTTP, have changed remarkably little since their inception, given the importance they now have to how we live our lives. Proof that not everything moves at breakneck speed.

Artificial Intelligence

It is often difficult to predict the next great step before it happens; the difficulty of doing so could well be what defines the genius of those who take these steps forward for us.

If I were asked to predict what, in decades' time, people will look back on as a milestone, it would be the emergence of Artificial Intelligence into the everyday world.

Whilst the dream of building Artificial Intelligence stretches back many decades, it is only in recent years that applications of the technology have started to become relatively commonplace.

We are still only scratching the surface of what the technology will be capable of delivering, and as this unfolds, fear around these capabilities may grow.

An untold number of sci-fi films have predicted disastrous consequences arising from the invention of thinking machines. While these stories are built on a misunderstanding of the technology involved, perhaps we are seeing the birth of a technology that proves Arthur C. Clarke's third law and will cause many to classify it as magic.