Tuesday, February 17, 2026 -- Mattie died 833 weeks ago today.
Tonight's picture was taken in February of 2009. What was going on here? Well, Mattie was having a physical therapy session! In typical Mattie fashion, he wasn't going to do something we hadn't tried first. So Anna (his physical therapist) and I got down on the floor to play Twister. Mattie called out the moves, and once he observed the process and saw how we managed it, he was willing to give it a try! But he did not do it alone.
Whatever Mattie did, I always joined him. We were a team and I was committed to every aspect of his cancer journey.
Quote of the day: The only courage that matters is the kind that gets you from one moment to the next. ~ Mignon McLaughlin
Yesterday I mentioned the Lost Screen Memorial and the impact of social media on mental health. Today, I came across an article entitled, Inside OpenAI Leaders' Decision to Take Down a Beloved AI Model. Honestly, we live in a world that I do not understand most days, and I think technology is a big contributor to a great many of the problems we face as a society. Certainly technology can be beneficial, but when we feel the need to turn to technology for emotional support, a red flag goes up in my head. I don't mean turning to your phone to call, email, or text a LIVE human being. I mean turning to a chatbot for advice, input, and guidance. It almost seems like something out of a sci-fi book; however, it is our 21st-century reality. A reality that also charges you for access to these chatbot services!
There are many chatbot users who claim that these interactions have saved their lives. That, of course, is the positive side of usage; however, there is a host of victims of this technology as well. In fact, doctors claim they have seen links between chatbot usage and the development of psychotic delusions, and there are lawsuits in California involving users who killed themselves, attempted suicide, suffered mental breaks, or, in at least one case, killed another person. It is alleged that these chatbot models give priority to user engagement and prolonged interactions over safety, which is similar to the debate over social media sites, which are accused of pushing users into echo chambers of their own views and rabbit holes of disturbing content.
Never having used a chatbot, I truly can't comment on the entire experience. But I guess when it comes to emotional issues, I am not likely to turn to a device, especially when I know the device doesn't know me, doesn't know my situation, has no concept of my history, and certainly can't be impartial! So it intrigues me that someone would use this method, especially during a crisis. I see the dangers in this, as it is like seeking therapy from an unlicensed professional, with little oversight or regulation.
But what this says to me is that people are turning to these options because we as a society have stopped listening. We have stopped talking, we have stopped connecting, and we have prioritized things, tasks, and jobs over human interaction. This should make us all pause, because at the end of the day, intentional care, compassion, and support for the family and friends in our lives should be our priority, our goal.
More Valentine's Day surprises in the mail today! I am grateful for every gift and card; they are reminders that I am worthy of love, respect, and commitment.