The Power of the Post

Exploring Ethical Responsibility in Tech & Social Media



Don’t get me wrong, social media is an amazing tool. It connects people from all over the world and shares new information, perspectives, cultures, and puppy videos. It’s also a terrifyingly powerful tool in unregulated hands, and history should have taught us by now how that can end up. Today’s post is a little different from my usual topics, but it applies to anyone who uses the internet and social media, whether personally or in business.



My point in writing this is to put these ideas out there for others to consider, as I’ve slowly discovered them myself and believe they’re worth discussing. They’re important questions for our lives, communities, and businesses. My hope is that we choose to be curious and ask ourselves the hard questions in a digital age where one small voice can have a huge, unexpected impact - for better or worse.


Where does the ethical responsibility lie? How do we work together to change the course we’re on? These questions apply to every issue finally coming to light on a global stage. And they’re being discussed, analyzed, and “cancelled” via the newest threat to humanity - the technology behind social media.


How can you use this technology to bring people together, market your business, or have a positive impact on the world? We’ll all have to work together, bringing curiosity, purpose, and clarity to how we communicate and share with others. I’ll draw on insights from sources that shifted my paradigms about the shared human experience in technology and social media, and delve into what all of this could mean for you as a user, a marketer, a tech company, or just a human in today’s society.


There are only two industries that call their customers “users”: illegal drugs and software. - Edward Tufte

Be a Curious User - Ask Why


It’s all too easy to ignore a threat, especially when we’re benefiting directly from ignoring it and the threat isn’t an immediate, tangible danger to us. So what is the danger of being an avid user of technology and social media?


The new Netflix documentary The Social Dilemma traces the progression of social media and big tech companies from exciting tools for communication and information sharing to engines of unethical user manipulation, and tallies the damage we’ve already incurred from this phenomenon. Stick with me on this.



Tristan Harris, a former design ethicist at Google, appears throughout the documentary weighing in on the technical aspects of AI and algorithms, but mostly on their dangerous impact on society. Harris explains, “We’re going from a tools-based tech environment to a manipulation-based environment.” In other words, technology today isn’t simply a tool waiting to be used; it actively works to shape your behavior on behalf of whoever bids the most for your attention.


Shoshana Zuboff, a Harvard professor and social psychologist also featured in the documentary, gives a perspective on the economics driving these companies forward. Zuboff describes how big tech companies collect data and use it to build a new marketplace trading in “human futures.” Big tech companies are in a race to build the best model for predicting human actions. If it sounds a lot like the plot of HBO’s Westworld, that’s because it basically is.



So how much of your autonomy are you willing to give up, and most importantly, what will these corporations do with it? Jaron Lanier, author of Ten Arguments for Deleting Your Social Media Accounts Right Now, states that “It’s the gradual, slight, imperceptible change in your own behavior and perception, that is the product...give me a billion dollars and I will change the world by 1%.” Am I running to my phone right now to delete all my apps? No, but it’s very interesting food for thought.

And if you’re wondering, there is already plenty of evidence that this technology works very effectively, if you haven’t already experienced it yourself. Like a targeted ad showing up in your Instagram feed shortly after you talked to a friend about trying to find the perfect black jumpsuit. Or worse: as several engineers in the documentary point out, there wasn’t technically a cyber attack; Russia simply used the technology we created against us.


So how do we become responsible, autonomous users? The book 21 Lessons for the 21st Century, by Yuval Noah Harari, is a truly eye-opening view of the future of humanity based on history, human nature, and the effects of rapid technological advancement. From Harari’s perspective, “Technology isn’t bad. If you know what you want in life, technology can help you get it. But if you don’t know what you want in life, it will be all too easy for technology to shape your aims for you and take control of your life. Especially as technology gets better at understanding humans, you might increasingly find yourself serving it, instead of it serving you.”


Later in the book, Harari offers a glimmer of hope: “The first step is to tone down the prophecies of doom, and switch from panic mode to bewilderment. Panic is a form of hubris. It comes from the smug feeling that I know exactly where the world is heading – down. Bewilderment is more humble, and therefore more clear-sighted.” In other words, we’re all confused and bewildered, so stay curious and hopeful. Don’t be so afraid to say “I don’t know” to yourself and others.


What lies beneath our addiction to technology, and who is responsible for changing how we interact with it? After all, this power is driven by consumer wants and needs - supply and demand. Are we, then, to blame for wanting something so detrimental to human consciousness? At what point does innovation become harmful, and when do you stop and refocus? We’re “voting with our wallets” more than ever before. When we find out the owner of a company treats employees poorly, we stop buying from that company. Millennials especially care about where and how products are made, and about the purpose behind a company’s mission. This is the driving force behind a lot of powerful 21st-century brands.


What about when it comes to tech companies? Even while I’m writing this, my phone is next to me and I’m tempted to pause and check Facebook. So trust me, I’m a user of social media just like anyone else, but I also took the most repeated advice at the end of The Social Dilemma - turn your notifications off for anything that isn’t urgent. Truly, I’m not sure I would have ever finished this blog post if I hadn’t followed that advice. That’s more meta than the time I ordered Uber Eats while in an Uber.


As a user, I feel it’s important to stay openly curious and ask yourself why, a lot. Why am I posting this? Why am I on social media right now? Why does this post make me feel this way? Why did this company post this, and why is it showing up in my feed? Why do I get on my phone right before I go to sleep or right after I wake up? Why am I seeing this ad right now? The answer to all of those questions is the same - it was by design, and the designer usually isn’t you. Consciously choose the media you consume and how often you use it, check multiple sources before sharing, and stay curious or bewildered about different perspectives. And at the very least - read the entire article before you share it.


Be an Ethical Designer - Ask What


When advertisers are the customers, user attention becomes the product being sold.


You’ll hear a lot of the former engineers and executives in The Social Dilemma talk about the business model being broken. They claim giant tech and social media companies use an “attention extraction business model,” intentionally creating an addictive product. With AI and algorithms running this business 24/7, it is massively profitable. According to Harris, the algorithms are tuned for three goals: engagement, growth, and advertising profit. The longer they keep you on an app, the more they can charge for that ad space, and once they know your likes and dislikes, what you search for, and which videos you stop to watch, those targeted ads become extremely effective. So as a marketer or business owner, why not use that to your advantage, right? It seems harmless at first.


To make things a little more terrifying, Tim Kendall, a former executive at Facebook, describes those three parameters as so accurately tuned that “at Facebook we talked about having Mark (Zuckerberg) have those dials” - essentially, the ability to adjust these goals and drive users to a specific action. It’s called persuasive technology, and it works a lot like a slot machine: positive intermittent reinforcement keeps you coming back to see if you get a positive response - cash, likes, comments, whatever elicits a dopamine response in your brain. Techniques like this are being taught in some of the best colleges in the country to grow this industry of highly effective, addictive tech products.


Now that we have more definitive information about how these algorithms work and what the negative outcomes could be, we are finally stopping to consider that “never before in history have 50 designers made decisions that would have an impact on two billion people,” as Harris puts it. Shouldn’t we focus more on creating ethical design with an overall beneficial purpose? Instead we’re teaching future tech leaders that profit overrides purpose and consideration for their impact on humanity.


“The danger is that if we invest too much in developing AI and too little in developing human consciousness, the very sophisticated artificial intelligence of computers might only serve to empower the natural stupidity of humans,” says Harari in 21 Lessons. Something addictive will always be profitable, but it remains ultimately detrimental until society intervenes to put humanity above profit - think of early cigarette ads compared to now.


What would you ask of a social media company now that you know the potential power they have over you, your children, and society, not to mention the future of human existence? The documentary goes into the alarming, negative effects social media has on younger generations. According to the CDC, suicides among girls aged 10 to 14 have increased 151%. The documentary notes that the numbers begin to climb after 2009, right as the first generation to have social media in middle school was growing up.

So at what point does it become the company’s responsibility to protect children, instead of creating a product that intentionally manipulates and harms them? The only solution is to be, and to raise, responsible users, as we discussed earlier - and to keep talking about these issues while staying curious and informed.


Harris now runs a nonprofit called the Center for Humane Technology. The Center “advocates for regulators and technology companies to avoid social media features and profit incentives that it sees as contributing to internet addiction, political extremism, and misinformation.” The work of holding these companies accountable to ethical values has already begun, starting with questions like: What is the purpose of this technology? What possible impact will it have on society in the long term? What can I do to ensure the safest use of this product? What is my product actually being used for, and is it potentially dangerous? What problem does it solve?



Be a Purpose-Driven Marketer - Ask How


If you aren’t paying for the product, then you are the product.


Jonathan Mildenhall is one of my favorite sources for modern marketing advice. Having worked with Coca-Cola, Airbnb, and many other strong brands, he has a fresh perspective on what it takes to create a truly successful brand in the 21st century. Mildenhall advocates for purpose-driven marketing that considers a company as a whole, its customers’ values, and its impact on the world.



So how do companies develop a marketing strategy focused on authenticity and purpose? According to Yuval Noah Harari, “one can try to evade the problem by adopting a ‘morality of intentions’. What’s important is what I intend, not what I actually do or the outcome of what I do. However, in a world in which everything is interconnected, the supreme moral imperative becomes the imperative to know.” In other words, it’s not enough to talk about doing the right thing; you must put these values into action and make them part of the fiber of your company from the start. From there, you market to the “tribe” that shares your sense of purpose in the world. The same sentiment applies to being a conscious consumer, and it puts the burden on you to investigate what type of business you’re investing in.


Mildenhall believes that starting with a purpose bigger than your product, and having a social responsibility agenda aligned with your business goals, strengthens your community. That community then creates a sense of belonging, and ultimately engagement, with your audience - engagement that is organic and driven by a strong sense of purpose and responsibility rather than technology-based manipulation.


Marketers, business executives, and consumers share the responsibility of creating the interest to listen and investigate, as well as the courage to act. How can I better communicate my brand’s purpose to my audience? How can I effectively and ethically get my product into the hands of the people who need it? How can I develop a brand purpose that is bigger than my product? How will my company as a whole embody these beliefs? How do we address setbacks or bad reviews?



Be a Humanitarian - Ask Where


When it comes to our individual responsibility to the world, Harari says “the greatest crimes in modern history resulted not just from hatred and greed, but even more so from ignorance and indifference” - and we’ve just seen that you’re more profitable to corporations as a thoughtless consumer than as an independent, functioning human.


Approaching what we now know about the future of technology and social media from a human perspective, where do you want society to go? Seeking truth and clarity is our biggest asset in the age of “fake news,” and that can start with anyone. Deciding what to do with the truth once you find it takes courage. Most of the countries targeted and exploited by the darker influence of social media are democratic societies. In The Social Dilemma, Harris distills his thoughts into this powerful statement:

“If technology has the ability to pull the worst out of society, then the worst part of society is the existential threat. Society is incapable of healing itself.”

We are currently faced with a seemingly infinite number of societal problems that require a collective effort to overcome, yet we are more divided than ever. In a recent article titled The World After Coronavirus, Yuval Noah Harari states, “to achieve such a level of compliance and co-operation, you need trust. People need to trust science, to trust public authorities, and to trust the media.” Whatever social media is doing right now, it certainly isn’t helping us build that trust, and yet we keep placing ourselves in the most effective position to continue this narrative - as the user, the engineer, the marketer, the unassuming human. So now it’s time to ask yourself where. Where did this information come from? Where did I find it? Where will this argument lead? Where are we currently headed? Where do we want to be headed? Where is this post going, and who will see it?


This last part, being a humanitarian, is required of everyone in order to be effective in the digital age. We all must ask what our personal and professional impact will be. Social media and technology give each of us more power over humanity than ever before, and it’s up to all of us to decide what to do with that power.






#thesocialdilemma #socialmediamarketing #ethicsintechnology #socialmedia #digitalethics #humanetechnology #ai
