Justifying privacy!

Trying to justify the right to privacy is like trying to justify any other human right. They are called rights not because you need to justify them, but because we came to a consensus that they are integral parts of a working society. The right to privacy is like the right not to be harmed, your right to free speech, or your right to freedom. None of these need to be justified in order to be acquired; you have them innately, period.

You may choose not to use them, but you can never use that as grounds to deny them to others. The answer “because I have a right to privacy” is enough to answer the “I have nothing to hide” argument; no justification is needed for making use of basic human rights.

Consequences of Roe v. Wade overturn

The expected reversal of Roe v. Wade will trigger the most significant and far-reaching challenge to Big Tech Trust & Safety policies in the history of the Internet. Anti-choice states will demand access to search and location data. And that’s only the beginning.

Internet security should not be taken for granted. We already know that data collectors sell this information to whoever pays the most, and buying such data is easy even for an ordinary citizen. The fact that this data can end up in the hands of dangerous people who could hand you a death sentence is frightening.

This is a time to reconsider our safety and privacy practices, and to demand stricter regulation of privacy on the internet.

If Roe v. Wade is overturned, states should pass privacy laws that protect people from being identified as someone who had or aided an abortion. Every humane legislator should fight anti-choice laws and pass other laws that will practically render the overturn ineffective.

Privacy has always been essential to freedom and security, but now is the time to take it even more seriously. Every major player in the field should act on the matter and spread the word about what this overturn means for our privacy, and what privacy violations mean for our lives.

This is a stark example of how essential privacy is, and how much more serious it is than most people think. Privacy is a human right, and it should be respected as one of our most important rights.

Toxicity around programmer community

See the above picture? That’s what has been annoying many people, especially among programmers. We have been trolling many people into a cult and a fashion, trying to make everybody look like us. It has reached the point where whenever we see a (relatively) weird-looking person, we assume he’s a programmer. We even call ourselves nerds.

We are constantly at war over our text editors (while we know Emacs is superior), we constantly talk about our operating systems, we are constantly looking for things to compute and small programs to write, we bang on about not having a life or not being able to live a normal day, and we pressure ourselves to show everybody that we’re familiar with computers.

For the last three or four years, I’ve avoided calling myself a programmer, emphasizing instead that I just know how to code. I’ve been staying away from this cult for a long time now, and I hope I succeed.

A programmer should have a life. Instead of banging on about how fucked up we are, we should show off how our lives keep getting better. Instead of joking about how antisocial we are, we should show off the great families we’ve built.

A few days ago, Kev Quirk shared this picture (of a tweet) along with his thoughts:

This is what’s wrong with the tech industry. The expectation that one should give their free time *EVERY DAY* or you’re somehow “not passionate”.

My response to that would be fuck you and your shitty fucking culture.

My kids and family are FAR more important to me than your ridiculous expectations.

I fully agree. We should stop participating in this cult and start caring about ourselves. There’s nothing wrong with caring about ourselves. Nothing wrong with getting a good night’s sleep. Nothing wrong with taking our lives as seriously as our jobs.

It’s OK to care about politics, the environment, and other important matters. We’re not robots built to do certain jobs. Just because we have the passion, talent, and skill to create and work with programs doesn’t mean we should sacrifice everything we have for it.

Can fediverse admins read your DMs?

As the news about Elon Musk buying Twitter reaches more people, Mastodon (and others) are seeing an increase in users and active members. This is good news.

And yes, fediverse admins can read your DMs. That’s not news to anyone. Anyone with some knowledge of how the internet and web applications/services work knows that a sysadmin, or anyone else with database access, controls everything and, in this case, can read your direct messages.

Even with encrypted services, a sysadmin can disable encryption or mount various attacks to obtain your encryption keys or passwords. I posted something on Mastodon the other day about the irony of people using Gmail to sign up for Mastodon (or generally any other social network or online service) and then worrying about the privacy of their DMs.

I’m not saying they don’t have a right to privacy or that their concern is baseless. Coming from the Middle East, I truly understand it. Here’s a scenario as an example: imagine an authoritarian regime, like Russia, sets up and runs thousands of Mastodon instances and operates them to aggregate data about people. People will unknowingly send private messages, and all of those messages are now in the hands of a regime that suppresses its opposition.

Even if you trust an instance admin, you can still be in danger: Mastodon is a federated network, and the admin of the other instance you’re talking to (through DMs) can also read your messages.

However, trusting a social network for private messaging is wrong in the first place. You shouldn’t use a social network of any kind to send important messages or to communicate safely. A Mastodon DM (or any other network’s direct message) is just a post that is shown only to the people mentioned in it, nothing more.
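To make this concrete, here is a hypothetical sketch of how a DM might sit in a server’s database. The table and column names are invented for illustration (real Mastodon runs PostgreSQL with a different schema), but the point stands: a DM is an ordinary row that an admin can query like any other.

```python
import sqlite3

# A stand-in for the server's database; names are illustrative only.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE statuses (
    id INTEGER PRIMARY KEY,
    author TEXT,
    visibility TEXT,   -- 'public', 'unlisted', 'private', or 'direct'
    content TEXT
)""")
db.execute(
    "INSERT INTO statuses (author, visibility, content) VALUES (?, ?, ?)",
    ("alice", "direct", "my 'private' message to bob"),
)

# An admin with database access needs no special tooling to read DMs:
# a DM is just a status row whose visibility column says 'direct'.
dms = db.execute(
    "SELECT author, content FROM statuses WHERE visibility = 'direct'"
).fetchall()
print(dms)
```

Nothing here is encrypted; the “privacy” of the message is purely a visibility flag that the software respects and the admin can ignore.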

The best use for DMs is to ask for a private communication method or handle, like XMPP, and then contact the person there. It would be nice for Mastodon and other fediverse software to have end-to-end encrypted private messaging built in.

The other thing is that Mastodon, and every other piece of fediverse software I know, is free software. That means people can run, study, change, and share it as they wish (under the terms of its free license), and it lets people freely modify the software and run their own servers to suit their own needs for communication on the fediverse. I don’t think that can be done by normal everyday users, as they probably lack the skills, time, and money to do so, but it’s possible.

We should also teach people about their privacy rights and about how the internet and online services work. For a long time, mega-corporations have mistreated people and misled them into accepting or trusting services that violate their basic human right to privacy, and it should be our job and obligation to teach them this.

The fediverse is currently the best social network we can have. It’s decentralized, without ads, without trackers, and designed solely for socializing and creating networks and connections. We should help develop it to be more secure and trustworthy, and we should keep promoting it so more people join. Of course, that requires us to be honest, respectful, and welcoming to new people.

My favorite social network

A few days ago I found honk, my favorite social network. Taking a look at it, it’s just perfect: the way I want all social networks to be.

It federates via the ActivityPub protocol and has no likes, no faves, no polls, no stars, no claps, no counts. There’s no attention mining in it; it just works to connect people’s postings and thoughts and to create a community.

Its theme and look and feel may not be desirable or beautiful, but since it’s free software, you can surely change its user interface. I’m pretty positive the original developers would accept contributions to its user interface and/or user experience.

The honk mission is to work well with minimal setup and support costs: to spend more time using the software and less time operating it. It currently works as intended. It’s multi-user, supports many features, and is my favorite social network now. It’s exactly how I’ve wanted social networks to be for years.

The developers have a sense of humor too, as you can tell by reading their documentation and intro/README texts, but I hope the whole project is not a joke and that they are like-minded people.

Auto-update is a bad idea

So if you know me a little, you probably know that I’m all in for computer-user freedom and for having people fully in charge and in control of their computing. I’m a free software advocate, and I’m very careful about my own computers and digital devices.

One thing implemented in most of our operating systems and digital devices that I believe is a bad idea is auto-update. Automatic updates allow users to keep their software programs updated without having to check for and install available updates manually. The software automatically checks for available updates, and if any are found, they are downloaded and installed without user intervention.

So the user has no idea what is being downloaded, when it is being installed, how the update works, or what the effect of the new update will be. In a perfect world where everyone is good and all programs respect users’ interests, auto-update is a pretty awesome idea, but sadly we don’t live in such a world.
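The pattern described above can be sketched in a few lines. This is a deliberately simplified illustration, not any real updater: the “server” is a stand-in dict, and all names are invented. The point is that the decision to install happens entirely without the user.

```python
# Minimal sketch of the auto-update pattern. A real updater would
# fetch a version manifest over the network; here a dict stands in.
INSTALLED_VERSION = "1.0.0"

def check_for_update(server):
    """Return the newer version string if the server offers one, else None."""
    latest = server["latest_version"]
    return latest if latest != INSTALLED_VERSION else None

def auto_update(server):
    new = check_for_update(server)
    if new is not None:
        # The payload is downloaded and installed silently -- the user
        # never sees what changed or gets a chance to inspect it.
        return f"installed {new} without user intervention"
    return "up to date"

result = auto_update({"latest_version": "1.1.0", "payload": b"..."})
print(result)
```

Notice that nothing in the loop inspects the payload or asks for consent; whatever the vendor ships is what runs on your device.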

Automatic updates are bad for privacy and for some aspects of security. Turning on auto-update on a system puts you in the position of trusting the device manufacturer to behave well. Anything could be contained in an update, and the potential harm may not be reversible.

The update could contain a back door, and that door can open the way for anyone to sabotage your computing. A universal back door is the way to go if you wish to be colonized and dependent, and to have your device shut down when it suits the vendor.[1]

If you’re using free software, you can study your programs and monitor their changes. You’re still vulnerable, since a program being free (as in freedom) doesn’t technically prevent insecurities from being implemented on your device, but you have a better chance of reversing the changes and monitoring what is happening to your device.

I understand that many people don’t have enough time or knowledge to keep all their devices up to date or to verify every update manually, but handling updates manually often has more benefits than turning on auto-update.
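One simple manual check is to compare the downloaded update against a digest published through a channel you trust, so you at least know you got the bytes the developers intended. This is a hedged sketch: the payload and the “published” digest are both fabricated in-script for illustration, and in practice you would fetch the digest (or a signature) from the project itself.

```python
import hashlib

# Stand-ins for a real downloaded file and a digest published by the
# developers; both are invented here for illustration.
update_payload = b"pretend this is the downloaded update"
published_digest = hashlib.sha256(update_payload).hexdigest()

def verify_update(payload, expected_digest):
    """Approve installation only when the payload hashes to the expected digest."""
    actual = hashlib.sha256(payload).hexdigest()
    return actual == expected_digest

ok = verify_update(update_payload, published_digest)
tampered = verify_update(update_payload + b"!", published_digest)
print(ok, tampered)
```

A digest only proves integrity, not intent: it tells you the update wasn’t altered in transit, not that its contents respect you. But combined with release notes and, for free software, the ability to read the source, it gives the manual route a real advantage over silent installation.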

The three laws of personal devices

Law 1

Your devices must work in your interests and your interests alone.

Law 2

When a feature can be built so that algorithms and data are kept exclusively on the person’s own device, it must be built that way.

If a feature cannot be built in this manner, all data must be end-to-end encrypted and the owner of the device must be the exclusive holder of the private key.

Law 3

The hardware, software, and services must be free as in freedom.


This writing is modified and posted under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license, originally written by Aral Balkan.

EdTech malware

Tech corporations are taking over the field of education by pushing their proprietary products into educational institutions of all levels. Proprietary applications loaded with malicious functionalities such as surveillance and collection of sensitive personal data—among many others—are being imposed on schools’ staff, teachers, students, and even parents. With the rapid expansion of online teaching, these proprietary educational technologies not only spread dramatically across schools, but they went from the classroom to the home.

This is not to say that educational technology is a bad approach per se. The problem arises when the software used in EdTech is nonfree, meaning it denies students the rights that free software grants all users.

Nonfree EdTech fails to assist the learning process by forbidding students to study the programs they are required to use, thus opposing the very nature and purpose of education. It does not allow school administrators and teachers to safeguard students’ rights by forbidding them to inspect the source code of the programs they run. It does not enable parents to make sure their children are protected from surveillance, data collection, and other mistreatment by the owner of the proprietary program.

Proprietary video conferencing software, like other nonfree programs, is tethered to online dis-services that collect large amounts of personal data. The school may have to agree to the company’s unjust terms of dis-service. The school, in turn, will typically force students to create an account on the dis-service, which includes agreeing to those terms.

EdTech companies are already developing great power over the students in the schools where they operate, and it will get worse. They use their surveillance power to manipulate students by customizing learning materials in the same way they customize ads and pieces of news. This way, they direct students into tracks towards various levels of knowledge, power and prestige.

These companies also structure their terms and conditions so that they are never held responsible for the consequences.

This article argues that these companies should get licenses to operate. That wouldn’t hurt, but it doesn’t address the root of the problem. All data acquired in a school about any student, teacher, or employee, must not leave the school’s control: whatever computers store the data must belong to the school and run free software. That way the school district and/or parents can control what it does with those data.

Join us in the fight against the use of nonfree software in schools.


This writing is copyrighted to GNU Education Team and GNU.org web site, licensed under CC BY-SA.