Thanks to Digital Hub Denmark and Industriens Fond for your support in sharing our vision to bring interpretability and explainability to life science data sets.
Casper:
If we want to continue being the humans in control of the technology we build, of the society we create, then we have to understand what our decisions are based on, and not just blindly trust that the computer is probably right.
I’m Casper Wilstrup. I’m one of the founders of the company, Abzu, where I serve as CEO.
Fundamentally, when you use artificial intelligence, or more specifically machine learning, to analyze a collection of data, you typically get a model that can predict an outcome. But it will not be able to tell you anything about why these things occurred.
So it will perhaps say: This person will not benefit from a drug, whereas this person will. But you won’t, as a researcher, get any kind of insights into why that is.
So what Abzu set out to do was to create a system that, instead of giving you these kinds of black-box models, as they are called, would actually give you the scientific theories that could explain the phenomenon you were studying.
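The contrast Casper describes can be sketched in a few lines. This is a hypothetical illustration, not Abzu's actual technology: the function names, numbers, and the toy formula are all invented for the example. A black-box model returns only a prediction, while an explainable model also surfaces the fitted expression a researcher can read.

```python
# Minimal sketch of the black-box vs. explainable contrast.
# All names and coefficients here are hypothetical.

def black_box_predict(biomarker, age):
    # Opaque model: imagine thousands of learned weights inside.
    # The caller sees only the final yes/no, never the reason.
    score = 0.8 * biomarker - 0.01 * age  # hidden from the researcher
    return score > 0.5

# An explainable model instead exposes the fitted expression itself,
# so a researcher can read *why* a given patient is predicted to respond.
explainable_model = {
    "predict": lambda biomarker, age: 0.8 * biomarker - 0.01 * age > 0.5,
    "theory": "respond if 0.8 * biomarker - 0.01 * age > 0.5",
}

print(black_box_predict(1.0, 20))   # True -- but no insight into why
print(explainable_model["theory"])  # the readable "why"
```

Both models make the same prediction; the difference is that the second one hands the researcher a theory to inspect, test, and refine.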
So if we humans want to control the world we live in, and the society that we’re creating, then we need to explain phenomena and not just analyze them with traditional machine learning technologies.
In the long run, I think explainable artificial intelligence of the flavor we're developing is going to account for, if not the leading share, then at least half of the market across virtually any specific business problem you can mention.
And that's where we see Abzu's big long-term untapped potential: providing this fundamental technology to be used across many, many different business verticals.