I’ve been using generative transformers (so-called AI, such as ChatGPT) for some time now, and I’ve found them to be pretty useful. Whether I need a brief summary of an article or a quick one-sentence explanation of a blog post, they deliver exactly what I ask for.
If I need a guide on setting up a secure system for managing my SSH keys, I can get a detailed step-by-step process. Or if I want help creating a cartoonish avatar for a friend’s daughter, that’s done in a few seconds. They do an impressive job, and I genuinely appreciate the technology behind them.
However, I do have some concerns. Above all, I value my privacy. I don’t want any corporation creeping on my private conversations, nor do I want them tracking what and when I think about certain stuff. The size or structure of the corporation doesn’t matter to me—it’s my personal information, and I want control over it. That’s why I avoid sharing any sensitive data with these services. And if I do, I try my best to anonymize it.
The ChatGPT app from OpenAI states that when you delete a chat, it will be removed from their servers within a month. If they stick to this promise, they’re doing more than many other companies. Still, it doesn’t go far enough.
Real privacy and security come with free (as in freedom) software. Free software gives me the ability to modify and control the program, ensuring it works exactly how I want it to, not how a corporation dictates.
With free software, I have the power to take control of my computer. If the program compromises my personal data or violates my rights (online or offline), I can stop it. I’ll be the one in control.
Unfortunately, I haven’t yet found a so-called AI tool that is both free (as in freedom) and provides an openly accessible language model for public use. But I’m hopeful that this will change. As the free software community continues to grow (as it always has), I’m confident we’ll soon see generative transformers that respect freedom and privacy.