Considerate.io

People Aren't an Edge Case

More and more of our lives depend on software, so it is increasingly important that software be built to benefit people. "Considerate Software" is a design, engineering, and management philosophy that captures this spirit by encouraging the development of safe, sustainable, and responsible software.


Why "Considerate"?

A search for what is "considered harmful" turns up far too many instances of trivial, mechanical inconveniences for software creators. In reality, software can create and escalate real harm in people's lives. We believe software creators should focus not only on the technical aspects of their craft, but also on the effects of the software they build.


Meet "Bach"

Bach is a considerate cat, and has the following traits:

  1. Bravery — Bach is brave enough to refuse tasks or jobs that she considers harmful. She raises issues with harmful features. When hiring, she takes character as well as technical experience into account. When PII or financial information is involved, she knows that safeguarding it takes time, and she makes sure the team allows for that time when planning features.
  2. Awareness — Bach considers the worst-case scenarios of the software systems she works on. She knows where the company is headed after the next round of funding. She knows how to treat sensitive data with extra care. She ensures that when people provide their data, they do so knowingly and willingly. When Bach is interviewing at a company, she looks beyond salary and perks to ensure that her values will still be reflected by the company, even after another round of funding or a sale.
  3. Compassion — Bach wouldn't build a system that she wouldn't feel comfortable recommending to friends or using herself. She goes beyond a superficial "user empathy" of what makes people "click," and is willing to imagine how people would feel if they were adversely affected by the system.
  4. Humility — Bach isn't perfect, and recognizes that her skills might not be the best ones on the team in every technical area. To help keep end users and team members safe, she readily admits areas where she's not an expert, especially when it comes to security.

Consideration for Companies

Consideration isn't just for individuals. A company with enough considerate cats can ensure that it acts in the world's best interest. Here's how that works.

Companies can be considerate by:

  • Allowing and encouraging team members to discuss, without fear of reprisal, concerns about the privacy, security, and health of the people using the software, as well as the software's effect on the world as a whole.
  • Maintaining an informed list of worst-case scenarios for the system, including security vulnerabilities, abuses of the software, exposure of personal or financial data, exposure to financial risk, environmental destruction, job losses, and negative consequences for the health and happiness of communities and individuals affected by the software.
  • Contributing code, fiscal sponsorship, or both to the open source ecosystem, and rejecting a parasitic relationship that strip-mines OSS.

... and considering these questions while hiring

  • Will this candidate increase the breadth of perspective or the diversity of the team?
  • Will this candidate be considerate of our team members and users of our software?
  • Has the candidate worked for companies that do the world a disservice, and if so, do they have any regrets about it?

... and while sprint planning

  • Are we collecting any new information from people? If so, are we allowing time to sufficiently safeguard it? Are people aware that they are giving up new information? Do we really need the information? (See the sketch after this list.)
  • Are we making any new assumptions about people based on the data we have? Do any of those assumptions have harmful worst case scenarios?
  • Will these changes make our software punish any groups of people unfairly?
  • Are we releasing to the people who use the system any new functionality that has the potential for abuse or harm?
  • Are our third parties gaining new functionality that has potential for abuse or harm?
  • Are people using the system informed about new and existing third-party access?
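
As a loose illustration of the first question above, here is a minimal sketch, in Python and with hypothetical field and function names, of treating each newly collected piece of data as a deliberate decision: record why it is needed, whether the person knowingly agreed to provide it, and how long it will be kept, then surface anything questionable during planning.

    from dataclasses import dataclass

    @dataclass
    class CollectedField:
        name: str                # e.g. "date_of_birth"
        purpose: str             # why the product actually needs it
        consent_obtained: bool   # did the person knowingly agree to provide it?
        retention_days: int      # how long before it is deleted or anonymized

    def review_new_fields(fields):
        """Return a list of concerns worth raising in sprint planning."""
        concerns = []
        for f in fields:
            if not f.purpose:
                concerns.append(f"{f.name}: no stated purpose -- do we really need it?")
            if not f.consent_obtained:
                concerns.append(f"{f.name}: collected without informed consent")
            if f.retention_days > 365:
                concerns.append(f"{f.name}: kept longer than a year -- is that justified?")
        return concerns

Nothing here is prescriptive; the point is that these questions become concrete artifacts the team reviews, rather than afterthoughts.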

... and during longer planning cycles

  • Are we scaling security efforts proportionally to the importance and sensitivity of our data?
  • Does our team have any security expertise? If not, should we consider an external audit or training for employees?
  • Is the code sufficiently easy to reason about? Is it well tested?
  • Do people using the software know what data about them is being collected?
  • Do people using the software have expectations that are aligned with company goals?
  • Do we allow people using the software to truly leave our system? (See the sketch after this list.)
  • Do people own the content they create on our system?
  • Do employees have the right to refuse working on features based on ethical grounds?
  • Have planned features been scrapped or revamped due to employee concerns?
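
As a loose illustration of what "truly leaving" might mean in practice, here is a minimal sketch in Python, assuming a hypothetical SQLite schema with users and posts tables: hand people their data in a portable format, then delete it outright rather than merely deactivating the account.

    import json
    import sqlite3

    def export_personal_data(db: sqlite3.Connection, user_id: int) -> str:
        """Gather everything we hold about a person into a portable JSON document."""
        profile = db.execute(
            "SELECT name, email, created_at FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        posts = db.execute(
            "SELECT title, body, created_at FROM posts WHERE author_id = ?", (user_id,)
        ).fetchall()
        return json.dumps({"profile": profile, "posts": posts}, default=str, indent=2)

    def delete_account(db: sqlite3.Connection, user_id: int) -> None:
        """Remove the person's data outright instead of flagging the row as inactive."""
        db.execute("DELETE FROM posts WHERE author_id = ?", (user_id,))
        db.execute("DELETE FROM users WHERE id = ?", (user_id,))
        db.commit()

A real system would also need to account for backups, third-party copies, and legal retention requirements; the sketch only captures the intent.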

Inspiration

Relevant talks