This is episode 5 of the Dataverax YouTube channel, and it speaks to the data quality dimension of accuracy.
I am writing up the following companion piece to go along with my Tableau Public visualization of COVID Cases per Million (North America Region). I created this visualization because I was curious about the state of the outbreak and how it varies from jurisdiction to jurisdiction. At first blush we can see that New York state is the hardest hit by the pandemic, with neighbouring states also affected. It is interesting to see how some states and regions are comparatively little affected, at least as far as the pure number of cases goes.
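The per-million normalization behind the visualization is straightforward: divide each jurisdiction's case count by its population and scale to one million residents. A minimal sketch in Python, using illustrative placeholder figures rather than the actual dashboard data:

```python
# Sketch of the cases-per-million normalization used in the visualization.
# The jurisdictions and numbers below are illustrative placeholders only.
jurisdictions = [
    {"name": "State A", "cases": 50_000, "population": 8_000_000},
    {"name": "State B", "cases": 1_200, "population": 600_000},
]

# Normalize raw counts to a per-million-residents rate so that
# jurisdictions of very different sizes can be compared.
for j in jurisdictions:
    j["cases_per_million"] = j["cases"] / j["population"] * 1_000_000

# Rank jurisdictions from highest to lowest rate.
for j in sorted(jurisdictions, key=lambda j: j["cases_per_million"], reverse=True):
    print(f'{j["name"]}: {j["cases_per_million"]:.0f} cases per million')
```

Note that the normalization only makes raw counts comparable across population sizes; it does nothing to correct for the differences in testing and reporting practices discussed below.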
But herein lies a problem. There are challenges in using this data which I have drawn from public sources. Each jurisdiction has handled this crisis a little bit differently so we may not be comparing apples to apples here. This may call into question the integrity and the quality of the data, but sometimes you can only make do with what you have. Especially in a crisis.
I trust we will learn many lessons from this crisis in terms of public health, government effectiveness, and also in terms of data quality. This situation continues to unfold, and I shall continue to follow it closely.
“Some are born great, some achieve greatness, and some have greatness thrust upon them” – William Shakespeare, Twelfth Night
There is only one constant in technology, and that is change. Change came to me on the heels of coronavirus a few weeks ago and like the flip of a switch, the times had changed. I have been here before: in 2017, in 2009, in 2001 and in 1995, I was caught up in sweeping change. But in change comes great opportunity, a chance of renewal, a chance to reinvent, a chance to be reborn.
I have always loved technology, and my how the times have changed. My first home computer was an Atari 800XL, Atari's answer to the Commodore 64. I would tinker with code in BASIC, attempting to write simple video games and such. In high school I upgraded to an IBM 286 with the at-the-time incredible memory of 640K. It is funny now to picture what we considered the advanced computers of the day. I remember being amazed by the first computer I coveted that had one thousand kilobytes of memory, the Atari ST. To put that in perspective, my current laptop has sixteen thousand times more memory.
I went on to business school, and to a job in the oil patch where I learned about databases and SQL code and how to watch a business go down the drain. I was laid off in round six. The department I had worked for went from 100 employees eventually down to 2. Shortly afterward, the firm was bought out. So it goes.
Keen to move on from the oil patch, I briefly considered law school when a lawyer mentor told me “You must love the law.” Not loving the law, I went back to computers, the love of my career. I picked up a job in 1999 with a company called Newcomp Solutions in Toronto, where I learned about Cognos and data warehousing and ETL development. It was the heady days of the internet boom, and I had arrived.
Since those days I have seen continuous change and innovation, from Cognos series 5 to Cognos series 11, from Microsoft SQL Server 2000 and Data Transformation Services (DTS) to Azure and Power BI, from Informatica to Talend and Alteryx, to Tableau, to Big Data, to Cloud, to ever bigger and ever faster. I became a Cognos expert and a technical instructor. Along the way I picked up data strategy, data governance, enterprise architecture and broader data management concepts, at a variety of employers and clients, until I arrived where I am today, an independent consultant in data management consulting and technical education.
And now as we stand amidst a global pandemic, wondering what will come next, seeing what I have seen, I can tell you what it will be.
Ever more change. Bring it on. I am ready.