Attempts to set up a data warehouse within the pension insurance company have had varying degrees of success. A large-scale quality survey of the relevant administrations in 2009 created a new urgency to work with data, with data integration taking centre stage.
Since, from a business point of view, such an investment must have a long lifespan – there were no (large-scale) cloud solutions yet – sustainability was one of the design principles, alongside flexibility, reliability, availability and repeatability. The design was created by the team that would also realise the environment. In a period of six weeks, a prototype was built using various methods and techniques. This resulted in a ‘grand high-level design’ for the data model and the technical solution for the environment, for which an iterative development strategy was chosen.
After the realisation of the quality survey and the associated in-control statement, the environment was further expanded. It became important for executing portfolio analyses, in-depth quality analyses and operational control, and it served as the foundation for the migration process and data pipeline used to select and (commercially) migrate customers to the new product propositions. In 2018, the same data environment was further expanded for the analysis and implementation of new legislation. Today it is used for data science activities. The environment has thus celebrated its ten-year anniversary and has supplied many strategic, tactical and operational goals with the data needed to achieve the desired results.
During the session, Mark van der Veen will share his experiences on how to get value from the initial set-up of the data environment.