Data has always been the basic building block of software. No IT process or technology initiative can afford to overlook this fundamental constituent, all the more so amid the data-defined engagements, offerings, models and disruptive technologies that have come to occupy our lives in recent years. It is indispensable to ensure that testing covers all visible and invisible aspects of data well, especially where Extraction, Transformation and Loading (ETL) processes play a central role. This is where Data Warehouse and ETL testing assume pivotal importance.
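To make this concrete, one of the most basic ETL tests is reconciliation: checking that what was loaded into the target matches what was extracted from the source. The sketch below is a minimal, hypothetical illustration only; the record layout, column names (`id`, `amount`) and the specific checks are assumptions, not a prescribed methodology.

```python
# Minimal sketch of an ETL reconciliation test: compare row counts,
# a column aggregate, and key coverage between source and target.
# All names and data here are hypothetical, for illustration only.

def reconcile(source_rows, target_rows, key="id", amount="amount"):
    """Return a dict of pass/fail results for three basic ETL checks."""
    return {
        # Same number of records survived the load
        "row_count": len(source_rows) == len(target_rows),
        # A simple aggregate (sum of a numeric column) is preserved
        "amount_sum": sum(r[amount] for r in source_rows)
                      == sum(r[amount] for r in target_rows),
        # No keys were dropped or invented during the transfer
        "keys_match": {r[key] for r in source_rows}
                      == {r[key] for r in target_rows},
    }

source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]
target = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]
print(reconcile(source, target))
```

In practice such checks would run against database query results rather than in-memory lists, but the principle, verifying counts, aggregates and keys end to end, is the same.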
Our successful foray into this area is built on a clear understanding of what a technology transformation means for an organisation, and of how much difference smooth, well-digested and well-organised data makes. We bring the seriousness and expertise that this seemingly run-of-the-mill area so badly needs during a software transition.
Over the last decade or two, software development methodologies have changed tremendously. A variety of tools, alongside agile development, extreme programming and test-driven development, have changed the philosophy and continue to reshape how software is built.
“Technology is like fish. The longer it stays on the shelf, the less desirable it becomes.” Andrew Heller put it well. Technology tends, after all, to drift towards the oblivion of the shelf unless both its makers and its users actively address that drift.