
Welcome!

Happy end of May! We're almost halfway through the year, and it really seems like it's flying by. If you haven't noticed, we have a new look and feel. I spent this month designing and developing a new website for Society x Tech. You can check out past issues, read maker interviews, and subscribe all in one place! Let me know what you think!

This issue is slightly different because we'll be talking about a book. This month I read Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech by Sara Wachter-Boettcher. I came across this book through the Frauvis monthly book club. I really enjoyed it and thought it would be great to highlight some key takeaways. Read on to find out what I have to say!

All-inclusive user personas

When beginning to design a product, one of the first steps is defining your audience. You do so by creating user personas, which are hypothetical archetypes of real users defined by their goals and frustrations. While this helps determine what to focus on when designing a product, biases creep into these personas and cause a lot of people's needs to get ignored. One point she makes that I, as a designer, really agree with is that user personas shouldn't be given a specific visualized face, because you get too caught up in what the person looks like and biases start to arise. When building a product, you shouldn't treat "outlier" personas or scenarios as something for a later iteration of the product. Inclusiveness should start at the beginning.

The subtle ways technology can make you feel excluded.

The book talks about instances where technology has excluded people in ways you may not think of. Some examples: Facebook telling someone their name doesn't appear to be real, online forms that don't allow hyphens in a name, and names getting cut off on forms for being too long. Some of you may never have experienced this, and for others these things may not seem like a big deal, but at the end of the day they are subtle jabs at people's culture and identity.
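
To make the forms example concrete, here's a minimal sketch of the kind of validation rule that quietly excludes people, next to a more permissive alternative. This isn't from the book; the pattern and function names are my own:

    import re

    # A common but overly strict rule: letters only, capped at 20 characters.
    # It rejects hyphens ("Anne-Marie"), apostrophes ("O'Brien"), spaces,
    # and anything longer than 20 characters, exactly the exclusions above.
    STRICT_NAME = re.compile(r"^[A-Za-z]{1,20}$")

    def is_valid_name_strict(name):
        return bool(STRICT_NAME.match(name))

    def is_valid_name_permissive(name):
        # Accept any non-empty name; trust people to know their own names.
        return len(name.strip()) > 0

    print(is_valid_name_strict("Anne-Marie"))      # False: hyphen rejected
    print(is_valid_name_permissive("Anne-Marie"))  # True

The permissive version isn't fancy, and that's the point: whoever wrote the strict pattern decided, probably without noticing, whose names count as "real."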

Data can't drive all your solutions.

Algorithms contain the biases of the people who build them. Because of this, the solutions those algorithms produce are not equitable. Until those biases can be eliminated, you can't justify using data to solve all of your issues. It doesn't make any sense. As a society we need to work on making our communities fairer, so that the algorithms we choose to implement will also be safe and inclusive.
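
As a hypothetical illustration (my own toy example, not one from the book), here's how a "data-driven" rule can inherit bias straight from the history it learns from:

    # Past hiring decisions that favored one group over another.
    history = [
        ("group_a", True), ("group_a", True), ("group_a", True),
        ("group_b", False), ("group_b", False), ("group_b", True),
    ]

    def learned_rule(group):
        # "Learn" the majority outcome for each group from past decisions.
        outcomes = [hired for g, hired in history if g == group]
        return sum(outcomes) > len(outcomes) / 2

    print(learned_rule("group_a"))  # True: past favoritism carried forward
    print(learned_rule("group_b"))  # False: past exclusion carried forward

Nothing in that code is malicious; the bias lives entirely in the data, which is exactly why "the data said so" isn't a justification.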

What are we doing to retain diverse talent?

We talk about the need for diversity and inclusion in tech, but are we talking about retention? If a company hires 50% more women but 65% of those women leave after a year, have we really accomplished anything? While there is clearly a pipeline problem, for those who do make it through the pipeline, how are they supposed to feel inclined to stay? Before bringing in diverse talent, make sure the environment is inclusive and geared toward helping people succeed. It's important to take care of people and build better morale across the company in order to make better tech for society.

Technology is powerful and has shaped our society and culture. If we start thinking more critically about the products we interact with every day, we can make great strides toward creating a more inclusive environment that helps everyone.

What do you think? What are some biases you see in tech? Tweet me so we can discuss.