In this episode, Sheila Colclasure, Global Chief Data Ethics Officer of Acxiom, discusses ethical data use and the impact of GDPR on healthcare data. In this digital age, data is of great benefit to humankind, but there is also a dark side when it comes to ethical use of data. In a world of massive data and advanced analytics, it is important to carefully balance the benefits of data, the status of data, and the rights of the individual. Without this, benefits could accrue to commercial interests at the expense of humans. It is, therefore, important to have a construct and a governance methodology to maintain the balance. Ethical data use is not just about minimum compliance but also about whether it’s “just” and “fair”. GDPR at its heart is an accountability law, and it’s more specific than HIPAA. It brings a level of transparency and trust to the digital marketplace. Listen to the podcast to learn more.
[00:00:09] Welcome to The Big Unlock, where we discuss data, analytics, and emerging technologies in healthcare. Hear from some of the most innovative thinkers in healthcare information technology as they talk about the digital transformation of healthcare and how they are driving change in their organizations. [00:00:33] Hello again and welcome everyone to my podcast. It’s my great privilege and honour to introduce our very special guest for today: Sheila Colclasure, Global Chief Data Ethics Officer for Acxiom. Sheila, welcome to the show.
[00:00:52] Thank you, Paddy. I am thrilled to be here and so honoured to be your guest. Thank you.
[00:00:58] Thank you so much. Before we get started: today we’re going to cover a bunch of things, including what ethical data use is and the big GDPR compliance deadline that is finally upon us. We’ll talk about all of that, which is all very pertinent and topical. But first, maybe for the benefit of our listeners, you could tell us a little bit about who Acxiom is and what your role is within Acxiom.
[00:01:30] I’d be delighted to. Acxiom Corp. has two business units. There is our traditional Acxiom business unit, where we process, transform, and activate our clients’ data however they would like it processed and transformed, and where we also build data products: marketing data products and risk data products, which solve things like identity authentication needs. Our other business unit is LiveRamp, and it’s digital. It offers connectivity, onboarding, and distribution, and it really enables brands to activate their data in the digital ecosystem and reach their customers in whatever digital channel they need to be reached. That’s a very powerful and exciting capability to be able to offer all of our clients and the rest of the marketplace. So that’s what we are. I’ve been here a while, and my role is Data Ethics Officer. In that capacity I run a global program, which includes a team of specialists that ensure we govern data, protect data, and honor all of the privacy protections that data and individual people need. That’s a little bit about me and the global program. The reason we use the term data ethics is because it’s more than just privacy. It really is about ensuring, in every instance, that the right thing happens with data: that it’s not just legal, which I view as the minimum obligation, but also just. In other words, did we interrogate the data design or data activation to ensure there’s no hidden bias, discrimination, potential for social harm, or reputational damage?
So that’s what I call ‘just’. And then the ‘fairness’ piece is this: when you have all of the facts about a data activation or data transformation surfaced, then you judge the impact of the data and you determine, is this a fair use of data? Fair not just to me, or the client, or the partner, of course, but fair to the individual the data relates to. Would they agree that this is a fair and beneficial use of data about them? That’s the way we run the program, and that’s what I mean by data ethics.
[00:04:20] That’s very helpful, and very interesting too. We are at a time when there is a lot of concern about our personal data, data that’s being collected in some cases without our complete knowledge, and a lot of big technology firms are being questioned about how they actually gather data and how they use it. We won’t get too much into that. But in the context of healthcare, which is where I spend most of my time, we’re now seeing a lot of emerging sources of data. It could be data coming in the form of genomics data; it could be in the form of speech data. Lots of emerging data sources. This becomes a little more complicated than when it was just electronic health records, for instance. I know I’m getting a little bit specific into the world of healthcare data, but can you paint a picture of what ethical data use means in the context of healthcare?
[00:05:39] Well, I think you’ve described the future, Paddy. I think we’re moving into the digital age, and the digital age is driven by data. There is great benefit coming to humankind, but there’s also a dark side if we don’t govern things properly, if we don’t make the right decisions. And in order to make the right decisions, you have to have a methodology, and that goes to this notion of data ethics. In the healthcare space, which I view as the most consequential to humankind in the digital age, the way that we use data can solve human health needs, potentially eradicate disease, and, I think, absolutely improve our human condition, our human wellbeing, and our human lives. We have on our horizon the ability to use data in ways we’ve never used data before; we have data available that we’ve never had available before. I’ll give you an example to try to light this up. One of my dear friends, Dr. George Savage, is the inventor of Proteus, the first FDA-approved sensor-embedded med. With the sensor, we now have the ability to know and measure not just when a person took their chronic care med but how their body is reacting in real time. In other words, is the med actually working, or is there an adverse side effect happening? We can measure these things, so that’s a huge benefit. But beyond that, take a chronic condition, let’s say high blood pressure: if you could measure how the entire population’s blood pressure medicine is working, then you could calibrate and improve the protocol, right? You could understand all of a sudden if that med really performed or if it needed to be prescribed differently across populations. You could understand so much more. When we get to precision medicine and we start adding in other data streams, the potential power and value to improve the human condition is staggering. Now, the notion of data ethics is this: we could also get it wrong.
We could also be predatory. We could fail to have enough governance, and the benefits, instead of accruing to the human, could accrue to commercial interests at the expense of the human. We could arrive at a place where there was nowhere we could be free, unobserved, natural humans. And I think that would be a travesty. So we’ve got to have a construct, we have got to have an intention, and then we must have a governance methodology to get it right. And some of it, Paddy, goes to balance, meaning we need data to flow so that the tools can act, and we need the actors, the users of data, to be fully accountable. And that means we have to judge each of these data flows and data uses to keep them in balance.
[00:09:10] Where are we today on this balance that you’ve just outlined, where data is being used to advance humankind but, at the same time, we are putting in adequate safeguards to protect privacy? Where are we on this continuum right now?
[00:09:31] Well, I think we’re just at the beginning. You know, we’ve really moved beyond big data. Big data was a hot term, but I think we’re now in the world of massive data and advanced analytics. We really are seeing innovation happen with things like machine learning, deep neural nets, and explainable AI. What we want to make sure of is that we have our human values, this notion of data in service to people, methodized in the design, in the analytics, in the data flows, and in the data activations. And we’re just at the beginning. I would be remiss if I didn’t talk about the critical importance of transparency, choice, and control for the individual the data relates to. I think we should all be so transparent, in an explainable and meaningful manner, that any user can understand what is happening with data about themselves and have an opportunity to participate. One thing we may need to think about is this: if we are trying to understand and improve population health, data about us in a purely de-identified form might need to flow, and we might not want to enable people to opt out of those flows, if the data is purely de-identified and there cannot be an individual impact to the user from that data flow, but we can improve population health and maybe control disease outbreaks in a way we never could before. Maybe our decision is that there are no choices about that data flow. But it’s those kinds of granular inflection points where we have to judge the benefit of the data, the status of the data, and the rights of the individual against those flows and benefits.
[00:11:45] Right. You talk about population health management and de-identified data in the context of population health analytics, which is already happening; it’s been happening for a few years, and a lot of the data is already captured within electronic health record systems that sit inside the firewalls of our health systems. Now, you also talked about the digital future, but that’s no longer entirely within the firewall of the health system. Healthcare consumerism is on the rise. People are now looking for digital experiences for everything from something as simple as scheduling an appointment to getting treatment recommendations on, let’s say, a smartphone. Where people consume healthcare is now changing, slowly but steadily, with generational shifts and a lot of other factors. Along with this comes the notion of using multiple data sources. For example, you could be served a set of recommendations by a digital app that collects data about you from your electronic health record system but also takes data about you from your location services and anything else you may choose to share, such as your health conditions or your lifestyle. The question that arises now is: who is responsible for making sure all of this is done right? How does one ensure that the data that has been co-mingled and analyzed is served up in ways that do not violate the privacy of the individual, while at the same time delivering the benefits? Is it in the hands of the consumer now? Is it shifting towards the consumer, is he or she now at the center of it, or is it something else?
[00:13:46] Well, I think the consumer should be at the center, but I think the bigger obligation is this notion of accountability. So yes, data flows are global. Yes, data flows are digital. But I think we have to mandate accountability. So even when we’re going cloud, even when we’re going digital, those systems have to be engineered for it. In the engineering layer we need this data protection by design and by default; it can’t be bolted on at the end. It’s more important than ever before in digital that we bring all of the policy considerations, all of the ethical interrogation, down into the engineering design. So when you ideate, you need to interrogate; when you design, you must re-interrogate; when you begin to activate, you need to validate. It has to happen in the engineering layer. This is what they call privacy by design, or in GDPR terms, data protection by design. And that is what I call data ethics by design.
[00:15:08] That’s very well said, Sheila. Now, that’s a great segue into the other topic we were going to get to: GDPR, which went into effect last month, the 25th of May I believe. Now everyone who has some kind of data flowing through the EU, in very broad and simplistic terms, is covered under GDPR. Is that correct?
[00:15:37] Well, GDPR of course is an update of the European Union Data Protection Directive, and it is a pan-European law, and it has changes. One of the changes is that it’s extra-jurisdictional, so its scope covers any European Union personal data that is processed anywhere in the world. So if you’re processing EU personal data there in the U.S., that is in scope; if you’re processing it in Shanghai, that is in scope. So the next thing we all need to do is understand what EU personal data is, and the way I like to think about it and explain it is this: EU personal data is any data that relates to a single user. It’s my name, address, email, phone numbers, and any other attendant attributes that go with that data set. But it’s also bits-and-bytes data. It’s also a cookie ID, a mobile ID, a customer ID, because it relates to a single user. So that’s the data that’s in scope for GDPR.
[00:16:45] So that’s pretty all-encompassing, right? It’s broad enough in scope that pretty much any form of data use could be covered under GDPR, and any violation carries significant penalties. So what are regular organizations supposed to be doing about that? I can tell you I get some 20 notifications every day from all kinds of providers that I didn’t even know I was dealing with, telling me that they have now updated their policy or that they’re GDPR compliant. What does it mean for the common man or woman? What does it mean for U.S. businesses, especially those that, on the surface of it, don’t have anything to do with the EU? Let’s take healthcare as an example: health systems in the United States don’t really deal with European patients, and they are pretty much confined to the U.S. But at the same time, they are using all this data for marketing to consumers and so on. I know I threw a lot of things in there, but can you touch on some of them?
[00:17:54] Oh, sure. I think those are such smart questions, Paddy, thank you. Well, for an individual, what it means is that GDPR offers expanded data subject rights; of course, ‘data subject’ is European parlance for the individual. So there are expanded rights, not just notice but a very updated, transparent, specified notice, a GDPR-level notice, so that’s new. There is access, which we’ve always had, but now there’s a shorter timeframe for enterprises to respond to access requests. And there is correction, limitation, deletion, portability. So there are new rights for the data subject; that’s number one. Number two, GDPR is, at its heart, an accountability law. It’s an accountability, data governance, and record-keeping law, because remember, part of GDPR requires demonstrability and inspectability, so you’ve got to keep records. What does this mean for a U.S. entity? A U.S. entity that is processing EU personal data is in scope. So it needs to undertake a self-exam to determine if it’s processing EU personal data, and if it is, then it’s got to ensure that it’s processing the data, and offering the data subjects their rights, in accordance with GDPR. You know, I was pretty fortunate here where I work: we have had a data governance program where we actually govern the data in the engineering layer. That’s the way we’ve operated for a very, very long time, so for us GDPR was a catalyst to do an inventory and improve our systems, processes, and record keeping. But those who hadn’t really thought about this before started at ground zero, and they’ve had to not just do an inventory but essentially do a design process and stand up a data governance program. I think for some of the larger entities with more resources it’s not been quite as hard; for some of the mid-tier and small-tier entities it has been. It certainly has been a resource-intensive effort.
But it does bring a level of transparency and accountability to the marketplace, and I think that goes to trust, and I think that’s very important. We do want a trusted marketplace, especially a trusted digital marketplace, because we’re all more digital every day, and in the future that is going to be how our commercial markets work for the most part. So I think this is an important new law that we all need to pay attention to.
[00:21:06] What about healthcare? Is there a specific twist or nuance for healthcare, especially for U.S. healthcare enterprises, health systems that don’t really manage European patients, for instance? Is there a nuance or twist here that they should be looking at? Or, in general, should U.S. corporations that don’t have anything to do with the EU be worried about this at all? Or is it just business as usual?
[00:21:41] I would argue it’s not business as usual, because GDPR is essentially more specific even than our own U.S. HIPAA, the Health Insurance Portability and Accountability Act. So if a U.S. health system has EU personal data, this is sort of the first analysis under the law. Under GDPR there’s a notion called ‘incidental’. So if it happens that a U.S. health system or health network is U.S.-focused and not EU-focused, but, as happens in the world of data, they have a few EU records, and they’re not monetizing or commercializing or activating that data, then it’s incidental, and there’s an exception under the law. But they need to undertake a legal analysis of that construct in the law to determine if they can operate under it. Let me give you an example. Part of my business thought that their EU data records were incidental. Then we evaluated it, and what we determined was that my business had noted that certain records in their U.S. file were European, and they bundled those into a European product, and they treated it as a European product, and they monetized it as a European product. I cannot claim incidental if I’m doing that, even though the data collection is happening in the U.S. by a U.S. entity. Right? So the first thing a U.S. healthcare provider needs to do is undertake a self-exam and analysis to determine how much EU personal data they have and what they’re doing with it. That will then inform their obligations, or their next steps.
[00:23:44] Interesting! Well, I know these are early days yet in the rollout of GDPR, and I’m sure we’re going to see a lot of questions come up; this is just a start. Well, we’re at the top of the half hour. I wish we could continue, but there’s so much more to discuss. I really appreciate your sharing your thoughts here, Sheila. I’m sure our audience is going to find this very, very interesting and useful. So thank you so much for your time, and we’ll be in touch.
[00:24:24] Thank you Paddy, I’m delighted to be with you. Have a great week.
About our guest
As Global Chief Data Ethics Officer for Acxiom and LiveRamp (an Acxiom company), Sheila directs the enterprise data governance, protection, and privacy program and the external data-focused global policy development for both organizations. Designed around Ethical Data Use (EDU), Sheila runs an accountability-based, 360° program that also covers government affairs, consumer affairs and related public relations.
She is recognized as a global thought leader on applied data ethics, consumer advocacy and information policy. She participates in numerous domestic and international efforts to help develop effective data policy, establish industry best practices and achieve maximum harmonization of information policy across the world. With extensive knowledge of laws governing the collection and use of information worldwide, she is sought out by policy makers, regulators and government agencies for her views on the ethical use of data and how to address the complexity of operationalizing next-generation data governance for the connected and digital data-driven ecosystem. Sheila is a 2017 Presidential Leadership Scholar and was recognized by CSO as one of the “12 amazing women in security.”
She is frequently interviewed by the media on ethical data use, data protection and data governance and has advanced thought and practice leadership with government authorities and the industry in many forums, including the Consumer Electronics Show’s Digital Health Summit, US Department of Health and Human Services, Dublin Tech Summit, Global Data Transparency Lab, IAF Digital University, and Ibero-American Data Protection Network. Sheila has also taken the stage at Forrester analyst events, adExchanger events, International Association of Privacy Professionals Global Conferences, Digital Advertising Alliance Summit events, Data and Marketing Association annual conferences, Philly Phorum, Conference of Western Attorneys General, National Association of Attorneys General, American Legislative Exchange Council, American Bar Association annual conferences and Marketing Science Institute.
Sheila serves on the advisory boards of the Information Accountability Foundation (IAF) and the Future of Privacy Forum (FPF) and is corporate liaison to several industry standards-setting groups and research and policy development groups, including the Center for Information Policy Leadership, Digital Advertising Alliance, Mobile Marketing Association and Cambridge Privacy Forums.
Sheila provides support and consult to the Acxiom and LiveRamp client and partner ecosystem – some of the largest marketers in the world – on effective data protection governance and the ethical use of data as a strategic business essential. Her primary focus includes enterprise-wide applied data ethics, ensuring compliance with legal and co-regulatory requirements and development of and compliance with Acxiom’s own leading-practice data ethics policies. She specializes in data protection, use and governance laws, including the U.S. FCRA, GLB, the HIPAA Privacy and Safeguards Rules, the Telemarketing Sales Rule, CAN-SPAM, the European Union General Data Protection Regulation, and the ePrivacy Directive/Regulation. She has developed and implemented an annual data protection and governance audit function that is widely regarded as the best in the industry.
Prior to joining Acxiom, she worked for the U.S. Senate, and then managed congressional and political affairs for the American Institute of Certified Public Accountants in its Washington, D.C. office, focusing on legislative and regulatory initiatives as well as directing its political action committee. Sheila has a master’s degree in communications, specializing in business and political communication.
About the host
Paddy is the co-author of Healthcare Digital Transformation – How Consumerism, Technology and Pandemic are Accelerating the Future (Taylor & Francis, Aug 2020), along with Edward W. Marx. Paddy is also the author of the best-selling book The Big Unlock – Harnessing Data and Growing Digital Health Businesses in a Value-based Care Era (Archway Publishing, 2017). He is the host of the highly subscribed The Big Unlock podcast on digital transformation in healthcare featuring C-level executives from the healthcare and technology sectors. He is widely published and has a by-lined column in CIO Magazine and other respected industry publications.