Femtech startup Clue is expanding the feature set of its period tracking app to attract women outside its current younger demographic.
“Menopause is a big space,” said founder Ida Tin, speaking on stage here at TechCrunch Disrupt Berlin. “I’ve been immersed in this space of female health for almost a decade now and every day I learn something new, and everywhere I look I’m like ‘why is nobody dealing with this?’
“Where is technology? How is technology serving women’s needs when they go through menopause? There is nothing; it’s really, really just this open space. So what we really want to do with Clue is to kind of be a companion as they walk through life.”
While she said the current priority for the app is adding more features to serve its existing user base, who are primarily using it for period tracking, features for tracking menopause symptoms may be added “in the coming months and years”.
The transformative potential of tracking data to unlock a deeper understanding of health issues was also discussed during the session.
“Give it a few years and I think that people will start understanding that having this longitudinal data set of your health is going to be an incredibly valuable thing to have, almost like life insurance,” said Tin.
“Because we will learn to pick up early signs of disease that currently we have no way to detect early enough; ovarian cancer would be one of them. Which is totally treatable if you catch it early. But it’s hard to catch it early.
“And I think there will be many more things like this where people will come to understand that gathering data on your health is just a really, really good thing to do.”
But on the data front she also cautioned that technology companies pushing into the health space really need to prioritize data transparency and ethics.
Taking time to do due diligence on potential partners is one of the reasons Clue has been holding off on doing more integrations with third parties that could expand its own data pipeline, she added, noting also that it would rather partner with a hardware maker than build its own devices.
Devices that are really exciting her are “the kind of things that can tell us about what’s going on in the body at a more molecular level”, she said.
“And also things where the user experience can be truly mass market. I think at the moment we have some features for natural family planning but… the user experience is not what it needs to be for it to be something that’s really working for a lot of people. So those are the kinds of things where we think we have ideas that could make that better.”
“That’s definitely an ambition that we have, to integrate with lots of different things, and it’s wonderful to be in this space of femtech because there is so much happening,” added Tin. “But we’ve been holding off till we’ve figured out what really to do with these extra data streams, what partnership we felt is a really good brand fit.
“Especially with some of the big corporations, we want to really make sure that we have the user’s needs at the center of our attention. And make sure that we can navigate something as complicated as a partnership with a big corporation without that, in the end, not benefiting the user.”
Responding to a question about concerns raised in the UK by a data-sharing and app development partnership between ad giant Google’s DeepMind and the country’s National Health Service, she said: “I think the lack of transparency is really problematic.”
This summer, a 2015 agreement between the Royal Free NHS Trust and DeepMind was judged by the UK’s data protection watchdog to have broken privacy laws. Under the arrangement, the medical records of 1.6M patients using three London hospitals were passed to DeepMind without the people’s knowledge or consent, and, as it turned out, with no legal basis for the information to be shared.
“Things that happen without users knowing where their data is going I think just shouldn’t happen,” said Tin.
“There’s always this kind of tension between data that can be used for bad and data that can be used for good. And I think right now there’s so much connotation that data is kind of a negative thing, and people are misusing it, selling it, hacking it, breaking it. And I really want to also raise the voice that it’s a fantastic thing that we can now understand all these things we couldn’t understand before. And really take a stance that we can use data for good; we just need to get it right. We shouldn’t shy away and think data’s bad.
“Data’s fantastic; it’s when we misuse it that it becomes problematic,” she added. “So let’s build good, ethical, robust data companies.
“It starts with a very deep, ethical choice that you make as a founder, as a company… What kind of company do we want to be? And what do we think is right? And then living by those standards.”