This session discusses all aspects of data warehouses and data lakes, including data quality, data governance, auditability, performance, historic data, and data integration, to determine whether the data lakehouse is marketing hype or a genuinely valuable and realistic new data architecture.
This session will briefly recap the main concepts and practices of Data Mesh and Data Fabric and consider their implications for Data Quality Management. Will the Mesh and Fabric make Data Quality easier or harder to get right? As a foundational data discipline, how should Data Quality principles and practices evolve and adapt to meet the needs of these new trends? What new approaches and practices may be needed? And what are the implications for Data Quality practitioners and for data management professionals working in other data disciplines, such as Data Governance, Business Intelligence, and Data Warehousing?
In practice, DataOps is not as common in data & analytics as DevOps is in software engineering. In DevOps, Development and Operations are jointly responsible for developing a system, deploying it, and maintaining it, with the aim of delivering faster, being more agile, and creating maximum business value. In this respect DataOps is the same as DevOps: the objective is similar. But how we achieve it differs considerably.