
Podcast: data transparency and education, with Anne Thielen

Ethical questions around technology and data are nothing new. But should organisations be more transparent with the public? Does the public even want to be educated about data? 

18 minutes to read

These are the thought-provoking questions raised by Anne Thielen, R&D Manager, Health Technology Solutions at Sonova AG, in this episode of our Data Today podcast.

We learn about Anne's background as a technologist, and explore the current challenges facing the medical field when it comes to data. Join us as we explore the fascinating topic of The Internet of Humans and the difficult conversations we need to be having about data ethics. 

Podcast transcript

Dan Klein: 

Hello and welcome to Data Today, brought to you by Zühlke. I'm your host Dan Klein, and I look after everything data and AI at Zühlke. We're living in a world of opportunities, but to fully realize them, we have to reshape the way we innovate. We need to stop siloing data, ring-fencing knowledge, and clinging to traditional value chains. And that's what this podcast is about. We're taking a look at data outside the box to see how amazing individuals from disparate fields and industries are transforming the way they work with data, the challenges they are overcoming, and what we can all learn from them. Everything we do in the modern world creates data. We are all connected in some way. Every part of us is a node in the great internet of humans, all generating data and insight, and all making decisions independently whose interactions we can't begin to imagine.

It can all seem very chaotic. In fact, it's a beautiful, but very hard to follow, dance. And somewhere, to someone, that data and the insight we can generate from it has value. The question is: how can we make sure that humans always see a net positive benefit from the data they create, rather than becoming victims of it? That's the ethical question that plagues today's guest, Anne Thielen, Program Manager, Health Technologies, at Sonova Group, a Swiss medical equipment manufacturer specializing in hearing aids. So Anne, tell me: you obviously work for Sonova, and you are very keen on the ethics of how data is used. One would assume that Sonova is, for you, an ethically good company to work with. How do you maintain that ethical stance within a company, given the corporate pressures that somebody like Sonova is almost certainly going to be under?

 

Anne Thielen: 

Generally, I would say that risk is something we all face, but in my role, I think it's my job to make sure that we don't go down that path. And I have a few crucial advantages, right? I come from technology, I understand how it works, and I have a voice, and I raise that voice to make sure we take the steps that are necessary to handle data in an ethical way. I also use the fact that a lot of the people in decision-making positions don't necessarily come from a software or technical background, or haven't worked hands-on for many years. They have to rely on people like me, who still know the insides and the details, so I use that position to recommend steps and really highlight them, to make sure we go down the right path.

 

Dan Klein: 

So do you have a definition of what good use of data looks like and what bad use of data looks like? 

 

Anne Thielen: 

I think it's about being transparent about what you do, or what you intend to do, with the data. It's not so much about what specifically you can do with it, because there are always two sides, right? You can use profiling of clients for lots of different reasons, really good ones in a moral sense versus really bad ones in a moral sense. But in the end it's about making very transparent what you're going to do, and making that transparent within the company too: moving away, even internally, from decisions made behind closed doors with no documentation or minutes of the important meetings where things were discussed.

So it's really about bringing that transparency not only to the client, the end user, or the recipient of the data, but also within the company, so that people are aware of what they're working on and what they're working towards. And then if there is something that is, in a basic moral sense, unethical, everybody has the opportunity to raise their voice. I think this is the most important thing: that we can act if we identify something that is not going in a direction we would stand behind and say, "This is morally and ethically okay for me."

 

Dan Klein: 

That premise leads me to ask a question about governance, because if you have people able to say, "Hang on a sec, this might be ethically problematic, morally problematic," there needs to be a route by which you can then effect that change. So is that private citizens calling it out on Twitter, or is that expecting the courts and the legal system to deal with it? Is that suitably set up, do you think, for where we are today?

 

Anne Thielen: 

No, I think it's not suitably set up, and I think it's not only private people and the courts, but also society as a general body, with all the experts we have in the universities, all the academic research institutions, and other dependent or independent institutions. If you make it transparent and you speak about it in public, at congresses and things like that, everybody has a chance to highlight problems. Because otherwise it's really mean to push it all back onto the ordinary person, right?

To say, "Oh, the end user, they have the right to speak up," and then putting that responsibility onto them where they're not subject matter experts. This is a good way, but also if subject matter experts are part of the discussion, are part of the initial discussion or even being part of the continued discussion, then you raise a system or you create a system where people can speak up and you rely on not only a very, let's say weak from a subject matter perspective, point of view, population part. But you also rely on the ones that are equal to you, the ones that actually have a voice and have the understanding to raise concerns. 

 

Dan Klein: 

So within the medical industry then, where are your biggest concerns at the moment in the use of data and technology? 

 

Anne Thielen: 

My biggest concern is probably: are we fast enough to compete with all the non-medical companies? Because we are moving in a highly regulated space, which exists for very many good reasons. Whereas companies that are not, by definition, providing medical features or medical functions, but are really at the borderline, are not regulated. They're acting in a completely different field, with very little attention paid to the products they provide and the safety of those products.

 

Dan Klein: 

In terms of what Sonova does, Sonova is in hearing products. What sorts of challenges do you have within the products themselves, in terms of how you put them out into society and the data that you take from them?

 

Anne Thielen: 

In general, a medical product has a very strong purpose, right? In our case, it's to compensate for a hearing loss or to help with a hearing impairment. So there is this main functionality, the main focus, that our products have. They are highly integrated systems at the edge of technology, and everything we do, we do to support that purpose. Now, when you talk about creating data points that we also want to save, look into, analyze, and so on, that is an additional purpose for those devices, and alongside the specific purpose we already have, it puts a lot of strain on the system. Just adding this additional purpose is, technically, a big challenge.

 

Dan Klein: 

Do you feel the legislative guidance for organizations in this space is strong enough at the moment? Because as you say, you are at the cutting edge of this stuff and typically the legislative environment tends to lag where the technology is. So are there holes in the legislative framework that potentially need to be filled here? 

 

Anne Thielen: 

I would absolutely agree with that, and this goes back to our initial conversation: this is why people with the knowledge have to take the lead. They have to push their organizations to actually build that framework. Because it's largely not there, or if it is there, it sits at a very high level, not worked out in the details of how to act, how to move forward, and what to put in place. So yes, for sure, but it's also a big opportunity. For us it's really interesting, as part of the MedTech industry, to set down, or enforce, or at least ask for this guidance ourselves, and essentially to co-create it.

 

Dan Klein: 

Using data for good can be tricky. I've had a few instances in my career where I've had to really weigh up the benefits of accessing or sharing data against the risks that this might pose. In the UK we have case law that looks at proportionality. We widely accept that there's an upside and a downside to innovation, and try to examine how they balance. If it was proportionally likely that somebody would do something bad, I wouldn't do it. If I thought the risk was less than the opportunity, I'd do it. That's my trade-off, and I've applied it in many sectors in my career. At what point do you say, "We are not doing this"? There's a debate to be had there. Anne has always been a technologist, but her move to Sonova brought her up close and personal with the human side of technology.

 

Anne Thielen: 

I have a background in MEMS, Micro-Electro-Mechanical Systems, so really sensors and actuators, which gives you a very holistic view of how to build and use MEMS and how to process the data points they generate. I went to MIT and looked into how electronic circuits are built in cells, and whether it's possible to bring very similar, nature-inspired circuits into electronic design, which would help us build much more efficient circuits, especially on the nanoscale. And this is how I ended up in Switzerland, because once I finished the project at MIT, I thought, "Now I want to look into the nanoscale, not only the microscale." So I ended up in Switzerland, where I pursued a PhD looking into the nanoscale, really how it works and how it doesn't work: how electrons move through computer chips when they reach the edge of scaling, meaning when classical transport breaks down and quantum mechanics kicks in. And yeah, in a nutshell, so far we still use classical transport.

It's really difficult to make quantum mechanical transport in computer chips work, so we always try to avoid it, even though the chips are scaling down more and more. I wanted to stay in the solid-state physics industry and go further there. But then I was invited by a company called Sonova, and I was really surprised; I'd had no touchpoint with them whatsoever. But I got here, and it's an outstanding company when it comes to the people. It was a very open and friendly environment, and I was so intrigued by that that I ditched my plans. I also discovered something I had never seen before, and that is the human. As a hardcore technologist and engineer, I was always looking into machines, into devices, and here in this company I discovered that there is a human behind it all. It's quite interesting to understand how you bring technology into the day-to-day life of our clients and our patients, and apply that technology so that it becomes useful to them. That really opened up a new horizon for me.

 

Dan Klein: 

Isn't that interesting? Because that's where the moral debate is, in some sense: what matters in technology is not the technology itself but how humans interact with it. And I think, crucially, something you said is about whether humans can be bothered to understand how they're interacting with it. For me, there's a moral question not just in terms of what we do with the technology as technologists; there's a wider societal question around people paying attention to it.

 

Anne Thielen: 

Absolutely agree, and I think this is also where a lot of people, even companies, try to sidestep the issue, claiming, "Oh, we're just doing the components," or, "We're just doing some parts of the machine." In the end, there's just one company that actually builds the user interface and the application, and then this company, these people, have to go through the whole big debate about how it's used, where it's applied, whether it's applied in the right way, and whether we educate people enough. Should we even educate them about it? Look at our school system. It's lagging a hundred years behind, still teaching the basics in languages and history, with hardly any interface to technology. Whereas nowadays we all carry supercomputers in our pockets, if you want to put it like that, right? And still, ask kids nowadays, "How does a touchscreen work?" They're not interested. They're only interested in how to use TikTok on it. So what's our agreement then? That education is the only way forward, for both the young and the older generation?

 

Dan Klein: 

Yes, I violently agree with you on that. I think the crucial thing here is that we have to not assume that we should be educating the young, but assume the education should go both ways: the young have to educate the older generation too. The older generation might then want to suggest, "Oh, there are some things you might want to learn about here." History tells us that a government can be a bad actor that wants to look at us for bad reasons. Or technology companies in a country we're not particularly happy with may want to look at us for particularly bad reasons. I think we should definitely agree there's an onus on education both ways, and on transparency.

 

Anne Thielen: 

No, absolutely. Yeah, I think so too. 

 

Dan Klein: 

With transparency come some difficult conversations. People will need to be held accountable and policy changed. On the other hand, does the general public have time to be educated on the origins, quality, and use of their data? Do the different generations, everyone from baby boomers to Gen Z, have the capacity and will to understand what's at stake? The younger generation may be more data savvy in some ways, but many of them will have had embarrassing moments through social media oversharing that my generation never had. Even though my generation doesn't suffer the same tech fears as, say, the baby boomers, my hope is that we'll have a group of young people who will, through trial and error, be much more data conscious online. Do good data ethics demand that we track the origin of data? We live in a time of fake news, with stories being warped from their origins. Can education and transparency in data help combat this? It all boils down to having difficult conversations. So what do we need to hear?

 

Anne Thielen: 

If you are able to trace back where the data comes from, how it's described, and what pathway it has taken, then you can be more certain, maybe not absolutely certain, that the origin is what it's supposed to be, that the story is what it's supposed to be, and that the storyline actually has some value to it.
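To make the idea of a traceable data pathway concrete, here is a minimal sketch of a lineage record in Python. It is purely illustrative, assuming a simple key-value payload; the names (TraceableRecord, ProvenanceStep) are hypothetical and not drawn from any system discussed in the episode.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import hashlib

@dataclass
class ProvenanceStep:
    # One hop in the data's pathway: who handled it, what they did, when,
    # and a digest of the payload as it stood at that moment.
    actor: str
    action: str
    timestamp: str
    payload_digest: str

@dataclass
class TraceableRecord:
    payload: dict
    lineage: list = field(default_factory=list)

    def record_step(self, actor: str, action: str) -> None:
        # Hash the current payload so later readers can verify that the
        # data they see matches what this step actually operated on.
        digest = hashlib.sha256(repr(sorted(self.payload.items())).encode()).hexdigest()
        self.lineage.append(ProvenanceStep(
            actor=actor,
            action=action,
            timestamp=datetime.now(timezone.utc).isoformat(),
            payload_digest=digest,
        ))

# Hypothetical usage: every stage appends to the lineage instead of
# overwriting it, so the record's origin story stays checkable.
record = TraceableRecord(payload={"heart_rate": 72})
record.record_step(actor="wearable-firmware", action="collected")
record.record_step(actor="cleaning-pipeline", action="smoothed")
```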

 

Dan Klein: 

Let's pick a medical example then. We have a situation where mobile phones and wearables, let's say, are able to collect information about people, but I think most of the population would be aware that the quality of that collected data is not going to be as good as when you're connected to a monitor in a hospital. Do you think that's a fair assumption, or do you think people assume that it actually replaces the hospital?

 

Anne Thielen: 

I think it's not a fair assumption, because the wearable, or the phone, or whatever it is, has a big advantage: it sits with you the whole time, permanently, on a day-to-day basis. So even though the quality of the data might be lower than what you expect from a medical device in a hospital, the sheer quantity, and the ability to then clean that data, gives you so much more freedom and so many more possibilities to identify a condition or some other health issue. In the end, I would say it's superior to the medical device, because the medical device is always stationary and never catches you in a real-world environment.
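Anne's point, that sheer quantity plus cleaning can offset lower per-sample quality, can be sketched with a simple rolling-median filter. This is an illustrative example only, assuming a plain list of sensor readings; it is not a description of how any wearable actually processes its data.

```python
import statistics

def rolling_median(samples, window=5):
    # With a continuous stream, any single noisy reading matters less:
    # the median of each neighbourhood washes out spikes that an
    # individual low-cost sensor sample might contain.
    half = window // 2
    smoothed = []
    for i in range(len(samples)):
        neighbourhood = samples[max(0, i - half): i + half + 1]
        smoothed.append(statistics.median(neighbourhood))
    return smoothed

# A single spike of 180 in otherwise steady readings disappears,
# because the surrounding samples outvote it.
readings = [71, 72, 70, 180, 73, 71, 72, 70, 74]
print(rolling_median(readings))
# [71, 71.5, 72, 72, 72, 72, 72, 71.5, 72]
```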

 

Dan Klein: 

You are seeing the real patient, rather than a patient sitting or lying stationary in a hospital bed. Absolutely. You've talked in some of your blogs, some of your posts, about the internet of humans, which I love as an idea. I love it as a concept. If we talk about the internet of humans, and we connect humans as opposed to just devices, does that immediately mean we have to be really clear about the resilience of the technology that's connecting humans?

 

Anne Thielen: 

There's always this conceptual idea that it would be good to, let's say, understand everything and explain everything beforehand, right? But in the end we also have to face the fact that if you are the architect of a new playground for kids, you can only guess what the kids will play there. Yes, you can apply UX methods and test with some kids and some paper mock-ups what kinds of games they're going to play. But in the end, there's always a part where you just go into the open, do the real-world experiment, and try out what comes up, because you cannot foresee all the outcomes. You can test a lot, but not everything. Every playground you build develops a dynamic of its own that you could never have imagined, and we are building a new playground here.

 

Dan Klein: 

Historically, we would've had a playground. We would have had kids play in it. And as you say, you don't remember what the kids did in the playground. You have no concept of what they'd done. You have no record of it. And actually when we come to the internet of humans, it's almost more invasive than somebody videoing a playground. And videoing a playground is clearly not something that is done these days for very obvious reasons, but when we talk about the internet of humans, we are recording every digital interaction that a human's having continuously. So the child's playground is no longer safe? Is that the ethical debate we're having today? 

 

Anne Thielen: 

I think it's on us to make it a safe place, but at the same time we also must admit to ourselves, and be humbled by the knowledge, that we do not fully understand and know everything. We try, and for sure that is the best thing possible, but there will be damage along the way. Of course we can sit here and say, "Everything is planned out; we'll take as long as it takes to figure everything out." But we will not figure everything out. We have to accept that there will be some damage done along the way, which is okay in the sense that if you have put in a lot of effort, made it as transparent as possible, and educated as much as possible, you minimize that damage. But there will for sure be some damage. This is what happens with new playgrounds, with new inventions, with new technology we have not seen before. So it's really about being as transparent and open as possible, because then you can collect as much feedback as possible and, hopefully, like I said, minimize the damage.

 

Dan Klein: 

So how do we ensure, as a society, that the human is always at the core, and that the ethics of how that technology is used always serve societal needs?

 

Anne Thielen: 

We promote the debate and the discussion about it, of course. That's the only thing we can actively do: go out, like we're doing now, into the conversation and raise all these points to be discussed openly, instead of hiding them and sweeping them under the blanket. Go out and share it with others, like we do, to give them the opportunity to provide feedback, and listen, even though that feedback is sometimes not so pleasant; to have the resilience to listen to all of these different angles and concerns.

 

Dan Klein: 

At the end of the day, we are always going to get some things wrong. We're only human, after all. Anne's approach to data ethics is one of radical transparency: one that owns up to mistakes and failings and gives the public and institutions access to the origins of the data they're exposed to. Worrying about unregulated players at the borderline of the medical space is legitimate. Can legislation and government fill the gaps? One thing's for sure: we all want the playground to be safe. Business ecosystems are not new. What is new is that they are becoming increasingly data-empowered. To realize complex opportunities, we need innovation beyond boundaries, democratized information, and close collaboration between diverse players. Collaborative, data-empowered, borderless innovation is how we embrace a world of exponential change, and that's what this podcast is about. Thanks for listening to Data Today, brought to you by Zühlke. I've been your host, Dan Klein.

Discover more episodes of Data Today with Dan Klein.