Design, create and extract smaller, more intelligent sets of data for the Mainframe
Data Subset™ for Mainframe enables users to design, create and extract smaller, more intelligent sets of secure data for use in non-production environments, improving the quality and efficiency of your test cycles and helping you realise considerable savings (as much as $50k per database) on storage and maintenance costs.
Data Subset™ uses native Mainframe utilities to migrate the data, ensuring the highest possible performance for your subset. As well as being optimised for speed, this means you can eliminate many of the steps usually required to migrate data off the Mainframe. This accelerates the provisioning of realistic, secure test data that maintains full business and referential integrity.
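The principle behind an integrity-preserving subset can be pictured outside the product. The sketch below is a hypothetical, simplified Python illustration (Data Subset™ itself works through native Mainframe utilities, and the table and column names here are invented for the example): starting from a selection of parent rows, only the child rows whose foreign keys reference a selected parent are extracted, so the subset stays referentially complete.

```python
# Hypothetical illustration of referential-integrity-preserving subsetting.
# The "customers"/"orders" tables and their columns are invented for this
# example; the real product operates on Mainframe data via native utilities.

customers = [
    {"cust_id": 1, "region": "EMEA"},
    {"cust_id": 2, "region": "APAC"},
    {"cust_id": 3, "region": "EMEA"},
]
orders = [
    {"order_id": 10, "cust_id": 1, "amount": 250},
    {"order_id": 11, "cust_id": 2, "amount": 400},
    {"order_id": 12, "cust_id": 3, "amount": 120},
]

# 1. Select the parent rows the test scenario needs (here: EMEA customers).
subset_customers = [c for c in customers if c["region"] == "EMEA"]
selected_ids = {c["cust_id"] for c in subset_customers}

# 2. Keep only child rows whose foreign key points at a selected parent,
#    so no order in the subset references a missing customer.
subset_orders = [o for o in orders if o["cust_id"] in selected_ids]

print(subset_customers)  # customers 1 and 3
print(subset_orders)     # orders 10 and 12 only
```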
However, the real value of Native Mainframe Subsetting is the ability to create and extract covered subsets, based on specific scenario, coverage or application demands. Using cubed views, you can extract an even spread of data types from production. This provides teams with the best possible coverage from production data, whilst also enabling you to identify gaps and, in combination with synthetic data creation techniques, create data that is ‘fit for purpose’ and covers all required scenarios.
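As a rough analogy for what a cubed view achieves, coverage-based extraction can be thought of as stratified sampling over combinations of attribute values. The Python sketch below is a hypothetical illustration, not the product's actual mechanism: it groups rows by each combination of two assumed attributes (account type and status) and takes the same small number of rows from every combination, yielding an even spread rather than a random slice.

```python
from collections import defaultdict

# Hypothetical illustration of coverage-based ("cubed") extraction: group
# production rows by every combination of the chosen attributes, then take
# an even number of rows from each combination. Attribute names are invented.

rows = (
    [{"acct_type": "savings",  "status": "open",   "id": i} for i in range(100)]
    + [{"acct_type": "savings",  "status": "closed", "id": i} for i in range(100, 105)]
    + [{"acct_type": "checking", "status": "open",   "id": i} for i in range(105, 160)]
)

def covered_subset(rows, dims, per_cell=2):
    """Return up to `per_cell` rows for each combination of `dims` values."""
    cells = defaultdict(list)
    for row in rows:
        cells[tuple(row[d] for d in dims)].append(row)
    return [row for cell in cells.values() for row in cell[:per_cell]]

subset = covered_subset(rows, dims=("acct_type", "status"))

# Every (acct_type, status) combination present in production appears in the
# subset, however rare it is: 6 rows instead of 160.
print(len(subset), sorted({(r["acct_type"], r["status"]) for r in subset}))
```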
In addition to this, Data Subset™ allows you to create federated subsets; that is, consistent subsets from multiple systems based on specific criteria, coverage or attributes. This enables you to extract consistent, meaningful and realistic subsets from each system separately, provisioning teams with data that is ‘fit for purpose’ and facilitating efforts to shift testing left in your development lifecycle.
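A federated subset can be pictured as extraction driven by one shared key set across systems. The sketch below is again a hypothetical Python illustration under assumed data shapes: the customer IDs matching the selection criteria are computed once, then each system's data is filtered against that same key set, so the separate extracts remain consistent with one another.

```python
# Hypothetical illustration of a federated subset: one driving key set
# filters data held in two separate systems, keeping the extracts consistent.
# System names, tables and columns are invented for this example.

crm_system = [
    {"cust_id": 1, "segment": "premium"},
    {"cust_id": 2, "segment": "standard"},
    {"cust_id": 3, "segment": "premium"},
]
billing_system = [
    {"invoice": "A-17", "cust_id": 1},
    {"invoice": "A-18", "cust_id": 2},
    {"invoice": "A-19", "cust_id": 3},
]

# 1. Derive a single driving key set from the selection criteria.
driving_keys = {c["cust_id"] for c in crm_system if c["segment"] == "premium"}

# 2. Extract from each system independently, using the same key set,
#    so both subsets describe the same customers.
crm_subset = [c for c in crm_system if c["cust_id"] in driving_keys]
billing_subset = [b for b in billing_system if b["cust_id"] in driving_keys]

print(crm_subset)      # customers 1 and 3
print(billing_subset)  # invoices A-17 and A-19
```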
Using Data Subset™ for Mainframe allows you to:
· Significantly reduce infrastructure and storage costs by as much as $50k per database
· Gain greater control over your data and select exactly the data you need for testing
· Migrate data subsets more quickly using native, high-performance database utilities
· Reduce the time and cost of provisioning data through an automated process
· Maintain full business and referential integrity to ensure meaningful data