No, I don’t want to breathe! When Apple integrated a mindfulness app into its watch a few years ago, the Apple Watch reminded me at least once a day to take a minute to breathe. I tried it once or twice, but soon realized that it simply doesn’t do anything for me personally. From then on, the constant reminders only prompted annoyed huffing. So I turned off the app’s notifications. It has been quiet ever since, and I have the impression that quite a few people besides me have decided not to breathe with Apple.


Malte Kirchner has been an editor at heise online since 2022. In addition to the technology itself, he is concerned with how it is changing society. He pays particular attention to news from Apple, and he also works on development and podcasting.

In my opinion, the experience with the mindfulness app is a good example of why I still believe that people can and should decide for themselves which device functions they use and which they don’t – and that technological progress should not be accompanied by too much paternalism from the outset. New rumors say that Apple plans to expand the Health app in iOS 17, codenamed Quartz, to include an AI coach. Coupled with new features that keep an eye on users’ mental health, this coach is supposed to provide tips. And although this is still pure speculation, the first skeptics are already speaking up, worried that the AI will influence people negatively – possibly driving them crazy in the first place.

Of course, depending on character and temperament, it cannot be ruled out that the increasing paternalism of our devices will also have negative effects on people. The best example is the many calls to action with which social networks and apps constantly try to pull you in. Quite a few people, especially young people, have astronomically high screen times. There is a reason we talk about digital detox. And the strange thing is that tech companies like Apple and Google even provide tools for it – digitally, of course, on the smartphone itself. Cynics say this is like a drug dealer keeping his customers from overdosing so as not to lose them. I would rather see it as a sign that the tech companies have not forgotten that people can act on their own and may suddenly reorder their priorities in life at the expense of their once-beloved smartphone.

Efforts to develop the smartphone further into an everyday advisor are certainly a step toward making it even more indispensable. A health AI and new mental health functions would, first of all, satisfy the growing desire of many people in the age of self-tracking to gather even more data about themselves and to optimize themselves.

That appetite was whetted by the beginnings of this movement: by heart rate monitors and ECG watches, which even alerted some users to a serious cardiological problem that had lain dormant inside them like a smoldering fire. Or take the fitness rings: for many people – myself included – they create awareness and, through the gamification of closing the rings, motivation to do more for themselves.

I see exactly this potential in the field of mental health. What’s more, there is probably a large number of unreported cases of people who don’t even notice that they are slipping into a psychological problem. It is surely no coincidence that the allegedly planned new functions come at a time when the Covid lockdowns often pushed people into an unwanted isolation that did something to them. On social networks I see that some once level-headed contemporaries have become thin-skinned and more aggressive in tone. And who knows what all of this has done to oneself.

Is it contemptuous of people to hand them over to an AI? That is a fair and valid question. But on the other hand: is it better to simply leave people to their own devices and offer them no help at all?


From my point of view, it is important that, when it comes to mental health, a human being gets involved early enough once the system recognizes a problem. And yes, the elephant in the room is of course data protection. Health data is of a completely different quality than the question of which socks I prefer to buy. It is nice if I trust a tech company on this issue, but the expansion of health functions should also be read as a call for regulators to look very closely – and to be given laws with which they can effectively enforce that protection. That is now more important than ever.

And in the end, it should of course be up to you whether the AI coach keeps an eye on you when you use a product like the Apple Watch or the iPhone – just like with the mindfulness app. If that is guaranteed, we should first wait and see what actually arrives at the WWDC developer conference in June, and until then simply take a deep breath – which, by the way, also works great without an app!


(mki)
