I've seen too many articles praising the Medallion architecture as the ideal solution for enterprise data quality. At first glance, the structured three-layer approach seems like a no-brainer: organize your data into neat bronze, silver, and gold layers, and data quality improves smoothly at every step.
But the closer I look, the more my aversion to this architectural approach grows. Sure, it promises consistent, scalable, and centralized improvement of information quality. In practice, however, quality problems are solved too late, too rigidly, and with the same tool every time, regardless of context.
Enterprises are complex adaptive systems with wildly different data sources, each posing its own challenges to information quality. Why impose the same rigid process on all of them? Forcing every source into one centralized quality framework creates inefficiencies and unnecessary overhead.
I want to challenge medallion architecture as the supposed best answer to enterprise data quality problems. I will champion a decentralized and more personalized approach, inspired by Total Quality Management (TQM) and aligned with…