I will push back here though. You pointed out above it isn't Zuckerberg alone who is responsible. It's a systemic effect with many involved.
I wrote my original post off the cuff. I might edit it to say that Zuckerberg, as the CEO, sets the tone, and hires the people who build the system and set the culture that either incentivizes/allows/demands these practices (dark patterns, etc.).
Individual engineers can take a stand, but without their managers, and those managers' managers, backing them, it's a Sisyphean task.
The point being, Zuckerberg is ultimately responsible.
My issue is with the word "ultimately" - to me it is used to throw the burden of making ethical choices off the front line employee and onto a black hole somewhere up the chain of command.
Yes, the responsibility is ULTIMATELY on Zuck, but for it to reach him, every single person in the chain has to shrug their shoulders and do the wrong thing because it's more profitable to do so.
In that sense, I agree with your push back. However, I lament that we've given tech employees this kind of morality and ethics-free environment to play in.
I get that. I really do. I appreciate our conversation here so let me add a bit more color.
I'm a big fan of systems thinking. My friends get on me for it, but I wholly believe "if you aren't thinking about the system, you aren't thinking" or some aphorism to that effect. Donella Meadows, Russell Ackoff, W. Edwards Deming all inform my thoughts on this.
When an individual contributor like the one you mention above is in a system, there are four options they can take when they see something off:
1) Do nothing
2) Distort the system (often seen as a band-aid fix)
3) Distort the data (when Alan Mulally took over Ford, all of their dashboards were "green", meaning good, even as the company was slated to lose billions of dollars that quarter)
4) Change the system
From Deming I learned this: who is responsible for the system? Management. Who is the top manager at FB? Mark Zuckerberg. So that's my perspective on using the word "ultimately." I also noticed that your wording "...but for it to reach him" gives the sense that these behaviors percolate from the bottom up. My sense is the opposite perspective: Zuckerberg down. He has set the tone, and built the system and culture, that allows (demands?) this behavior.
Another Deming quote that I have seen play out all too viscerally in my 17 years in the professional world is this:
"A bad system beats a good person every time."
I don't excuse unethical or illegal behavior of ICs or managers, but I can look at the system they are a part of and understand it.
As someone who has quit a software engineering job, partially over ethical objections to how my software was being used, I really appreciate this thread and this topic. Even if your role is just "3rd engineer grunt from the left," you share the ethical blame and consequences with the other actors in the entire system.
Great discussion. This seemed an appropriate place to interject an idea from another systems thinker[0]:
"A loss of X dollars is always the responsibility of an executive whose financial responsibility exceeds X dollars." - Gerald Weinberg's 'First Principle of Financial Management' and 'Second Rule of Failure Prevention' [1]
By this model, I'd say engineers have some responsibility, but managers and, especially, executives shoulder the primary responsibility. By the same model, engineers' choices also have to be mediated by their own responsibilities: paying the bills, taking care of their families, and so on.
[0] An Introduction to General Systems Thinking, Weinberg
That heuristic from Weinberg is excellent. I think it illustrates the point we were discussing above much more clearly.
Thanks for bringing him to my attention. I was completely ignorant of him before your comment and look forward to digging into the sources you graciously added to the discussion.
Edit: Went to add the books you cited to my Goodreads list and lo and behold, the first one was already there. I am not going to live long enough to read all of the books I want to!
> I am not going to live long enough to read all of the books I want to!
The day that thought first came to me was a dark day! Another of Weinberg's heuristics [0]: Read books only after three people have recommended them. It's a quality filter, tuned to the kind of people you associate with, and to reducing the number of pages you'll have to digest.
[0] From a message on the SHAPE (Software as a Human Activity Practiced Effectively) online forum, which may be long gone.
It's a lot easier to liquidate one's stock in a company like Facebook than it is to find another position in the valley. I suspect that a lot of people are leaving Facebook over its conduct and will continue to do so, but some are going to stay. What Facebook is doing isn't any worse than what the tobacco companies did, and the tobacco companies are all still here. I don't know anyone who brags about working for them, though.
IMO quitting in a huff over issues like this does damage to one's career that one really ought to avoid. I left my previous gig over cultural issues and nearly got labeled a job hopper for doing so. I'd suggest taking the long view and seeing what happens with all the tech companies that are now doing morally questionable things.
It feels like there's been a phase transition in mood in the valley in the past 2 years.
I don't see how this is a helpful way of looking at it. If one engineer objects, the work gets handed to someone else while the manager prepares to fire that engineer. If all the engineers object, the team gets disbanded and the project handed to another one. It ultimately falls on all shoulders, especially leadership's, and singling out engineers just because they make the commit is, I think, reductive and ignorant of the systemic forces driving people to take part in evil.
Fair point, but I stand by my statement that it is unhelpful to single out one party. It is the personal and collective responsibility of each person involved.