
At UVD we use a variety of tools for testing our web applications. When it comes to acceptance testing, those tools mainly revolve around a Behat/Mink combination. Behat is billed as a “BDD framework for PHP 5.3+”, and coupled with Mink, an “acceptance test framework for web applications”, it gives us a solid testing stack.

This combination allows us to write complex acceptance tests in PHP, a language we can hit the ground running with, and to describe them in a readable, self-documenting syntax the business owner can understand (Gherkin).

Our newest project is a real-time precious metals trading application, split into three separate elements (a PHP backend, a Node.js API and an AngularJS-powered frontend), and Behat is used to test them all, though this post will focus on the frontend element of the application.

After trying various headless Mink drivers such as Goutte and Zombie, it became apparent, as the complexity of the application increased, that we needed to run our acceptance tests on the real thing. We needed to automate a real browser: we needed to run complex JavaScript, to handle CSS transitions, and to capture screenshots when steps failed. We needed Selenium.

Selenium2 quickly became key to our acceptance testing strategy, allowing us to see our automated traders interacting with each other, in multiple browsers, via websockets. It was the perfect fit for our increasingly complex real-time application. It was essential for us to ensure that when one trader completes a trade, for example, the result of that trade is communicated to other traders in different locations. With Selenium2 we could see that first hand.
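Wiring Mink up to Selenium2 is done in Behat's configuration file. The exact keys depend on your Behat/Mink versions; a minimal sketch for a recent Behat with the MinkExtension might look like this (the base URL and hub address are placeholders, not our actual setup):

```yaml
# behat.yml — illustrative Selenium2 session configuration
default:
  extensions:
    Behat\MinkExtension:
      base_url: http://localhost:8000
      default_session: selenium2
      sessions:
        selenium2:
          selenium2:
            wd_host: http://127.0.0.1:4444/wd/hub
            browser: firefox
```

With this in place, scenarios tagged `@javascript` run in a real browser driven by the Selenium server at `wd_host`.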

These tools also fit in nicely with our agile/TDD/BDD way of working here at UVD, as Behat allows you to write feature files in Gherkin. These are scenarios, written in plain English, explaining to the business owner what is being tested. They are more than tests; they are our acceptance criteria. So scenarios from user stories written in conjunction with the business expert (entered in Trello as individual cards) eventually evolve into our feature files. This allows the client to completely understand the process and, in my experience, gets them on board with testing.

[Screenshot: feature file]
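A feature file for an application like this might look something like the following. This is an illustrative sketch, not our actual suite; the step wording and trader names are hypothetical:

```gherkin
Feature: Trade notifications
  As a trader
  I want completed trades broadcast to other traders
  So that everyone sees an up-to-date view of the market

  Scenario: A completed trade is visible to another trader
    Given trader "A" is logged in on the trading screen
    And trader "B" is logged in on the trading screen
    When trader "A" completes a trade with trader "B"
    Then trader "B" should see the trade in their trade history
```

Each `Given`/`When`/`Then` line maps to a PHP step definition in a Behat context class, so the plain-English scenario doubles as executable acceptance criteria.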

Using our user story scenarios as testing criteria also acts as insurance for us, as the client can see their user requirements pass testing. Our continuous integration policy stipulates that we can’t deploy code until it has passed all tests run by Jenkins, which means that a feature will never be released until it has completely passed its acceptance criteria.

The video below shows the feature file above carrying out the tests in two browsers.

One problem that we found when testing our real-time communication was race conditions; scenario steps were occurring before they were supposed to. For example, assertions that a trade negotiation had commenced were happening before automated trader B had a chance to initiate the trade. Initially the solution was to use arbitrary ‘waits’, telling the browser to run the next step in three seconds’ time, for example. This was successful up to a point, but as our CI environment could often get bogged down with multiple jobs, sometimes three seconds was not long enough. So we pushed the wait to five seconds, but as the volume of our acceptance tests increased, we were waiting for increasingly long periods to see whether the build had passed or failed. This was proving very time consuming, and continually got worse. The solution was to use Mink to continually query the DOM via JavaScript, verifying that the DOM elements that were supposed to be there were in fact present, before kicking off the next step in the test. (We have also managed to speed up the process by upgrading our CI environment!)
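The polling approach is often called a “spin” helper in the Behat community: instead of sleeping for a fixed time, you repeatedly evaluate a condition until it succeeds or a timeout expires. A minimal, self-contained sketch (the function name and timings are our illustration, not a Behat API):

```php
<?php

/**
 * Repeatedly evaluate $condition until it returns true, polling every
 * 250ms, and fail loudly if it still hasn't succeeded after $timeoutSeconds.
 * This replaces fixed sleeps with "wait only as long as needed".
 */
function spin(callable $condition, int $timeoutSeconds = 10): void
{
    $start = microtime(true);
    while ((microtime(true) - $start) < $timeoutSeconds) {
        if ($condition()) {
            return;
        }
        usleep(250000); // poll every 250ms
    }
    throw new RuntimeException(
        "Timed out after {$timeoutSeconds}s waiting for condition"
    );
}
```

Inside a Behat context, the condition would query the DOM through Mink, e.g. `spin(fn () => $session->getPage()->find('css', '.trade-confirmed') !== null);`. Mink also ships a built-in variant, `$session->wait($milliseconds, $jsCondition)`, which returns as soon as the JavaScript condition evaluates to true in the browser.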

We see ourselves using this process for testing in the future, and we feel we’ve cracked quite a few tough nuts to get to this point (we worked on introducing and refining it for 18 months). But perhaps we’ll restrict its use to complex web applications of a business-critical nature, because the overhead of writing and maintaining the Behat suite is not insignificant (you might find it adds 50% onto the development time). Because this application is responsible for trades potentially worth millions of dollars, and because the integrity of the application is mission critical, it’s definitely worth it in this case.

If you’re seriously interested in introducing a similar BDD approach into your web application development, it’s worth bearing in mind that there’s a significant time investment in introducing it into your workflow, in getting automated testing and CI tools set up and maintained, and, importantly, in getting everyone within the team on board and up to speed with the tech and methodology. The approach spans every team member and discipline of the project: project managers, designers, testers, backend and frontend developers and, importantly, your clients. We hope to write a bit more about this in the future, so keep an eye out, and do ask us questions if you want to know anything in more detail.
