What is a crawler script?
Much like a web crawler is used to “crawl” your website to gather and organize data, in software testing terms a crawler is an automation script written to crawl your website and seek out issues. A surprising number of engineering and coding issues can be caught simply by taking a quick glance at the final product after a deployment. No advanced knowledge or training is required to spot issues like these, but who has the time to manually review every piece of a site every time new code is pushed? That’s where a crawler script comes in. Crawlers are designed to mimic the kind of simple, at-a-glance testing that is crucial to the user experience but time-consuming to do manually. The crawler hits every page of a site and every piece of an application and verifies, through either element-based or visual assertions, that all expected elements are in place and that nothing unexpected has appeared. This is a quick, simple path to some peace of mind with a new release.

Automation Goals

As with any new test, a tester creating a crawler script should have three main goals in mind: the test should be simple, it should be fast, and, most importantly, it should be reliable. These goals rarely change, no matter what kind of testing you’re doing. With that said, how does the creation of a crawler script align with these goals?

Simplicity

A proper crawler script is the epitome of simplicity. At the end of the day, its only function is to tell us “Yes, this page looks identical to last time” or “No, this page looks different than it did before, and here’s how.” That’s it. With rare exceptions, if the crawler is doing more than this base level of validation, that functionality should be stripped out of the crawler and rewritten as its own test. Simple is better.

Speed

A simple crawler script is as fast as your website. Once your page loads, the script already has every piece of information it needs to validate your application and move on.
By sticking to our primary goals and keeping the test simple, we also ensure that the test runs as fast as possible.

Reliability

Making your automated tests reliable sounds like a no-brainer, but it’s truly one of the most difficult aspects of software validation. Software, by its very nature, changes constantly, and so must the automated tests that cover it. But whereas software has the “fallback” of automated testing to root out underlying issues, automated tests have no such safety net. The tests have to work. Every time.

A suitably simple and speedy crawler will be as reliable as possible. By relegating more complicated, time-consuming tasks to their own tests, we boost not only the speed but also the reliability of the crawler: there is no complex logic or parameter set for the test to get hung up on. In this way, you ensure that you always have the crawler in your back pocket when you need a quick, base level of validation. Simpler tests make for both increased speed and reliability.
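To make the idea concrete, here is a minimal sketch of the crawl-and-compare pattern described above. It is a hypothetical illustration, not a production tool: the page contents are supplied as in-memory HTML strings standing in for real HTTP fetches, and the “fingerprint” is deliberately crude (a flat list of tags and their `id` attributes), since the article’s point is that the crawler should stay this simple.

```python
# Sketch of a crawler-style check: fingerprint each page's structure,
# then diff the new build against a known-good baseline.
# Assumption: pages are given as URL -> HTML strings (stand-in for real fetches).
from html.parser import HTMLParser


class ElementCollector(HTMLParser):
    """Collects a flat list of (tag, id) pairs as a cheap page fingerprint."""

    def __init__(self):
        super().__init__()
        self.elements = []

    def handle_starttag(self, tag, attrs):
        self.elements.append((tag, dict(attrs).get("id")))


def fingerprint(html):
    """Reduce a page to the simple structure the crawler asserts on."""
    collector = ElementCollector()
    collector.feed(html)
    return collector.elements


def crawl(pages):
    """'Crawl' every page and record its fingerprint."""
    return {url: fingerprint(html) for url, html in pages.items()}


def diff(baseline, current):
    """Report every page whose structure changed since the baseline run."""
    problems = {}
    for url, expected in baseline.items():
        actual = current.get(url)
        if actual != expected:
            problems[url] = {"expected": expected, "actual": actual}
    return problems


# Example: last known-good deploy vs. the new build.
baseline_pages = {"/home": "<div id='nav'></div><div id='main'></div>"}
new_pages = {"/home": "<div id='nav'></div>"}  # the 'main' div went missing

changes = diff(crawl(baseline_pages), crawl(new_pages))
print(changes)  # an empty dict would mean "looks identical to last time"
```

In a real crawler the `pages` mapping would be replaced by actual page loads (or screenshots, for visual assertions), but the shape stays the same: crawl, fingerprint, diff, and report either “identical to last time” or “different, and here’s how.”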