AI can’t be all that bad. The problem I keep seeing with AI is that it’s a double-edged sword. You have corporations shoving AI into just about everything, treating it like it’s a cure for cancer, and that really rubs people the wrong way. Then, on more of a societal level, you’ve got people using AI for all sorts of things, from making art with AI and still crediting themselves as artists, to treating AI like a therapist when that isn’t advised.
However, I’ve found some benefits to AI. For example, I’ve been chatting with ChatGPT about credit cards, since it’s something I may be leaning toward getting into. It’s helping me understand them better than most people who have tried explaining them to me, simply because it gives me a more streamlined response instead of beating around the bush.


How do you know for certain?
A HIPAA violation is a death sentence to a company, along with massive fines.
There’s no incentive for them to fuck around
I can’t know for certain, as I’m not on the product side of things. But I do know that HIPAA standards are very rigorous, and if it were discovered that they were intentionally misleading therapists and clients, it would invite an insanely large class action lawsuit.
I do ask for and document my clients’ consent, though, so if anyone isn’t comfortable with it, that’s fine. I just write the note the old-fashioned way. Most are fine with it, but a few have said they’d rather not, and it’s not a big deal.
People conflate security with risk mitigation. It’s not secure in the sense that you can confirm the data has been deleted. The risk, however, is mitigated by vendor attestations reinforced by contracts.
Yep, so you can’t actually know whether the recording is destroyed; it’s just contractually required to be destroyed. Big difference in my book.
I wish this sensitive audio were processed locally and never left the therapist’s network instead.