Mark Zuckerberg Has His Eye on the Global Community as Facebook Enters Its Adulthood

The 13th birthday marks the beginning of adulthood in the Jewish religion, as well as the minimum age to have a profile on Facebook. And for Facebook, which celebrated its 13th birthday Feb. 4, it also marks a passage into the next stage of life.

Facebook co-founder and CEO Mark Zuckerberg penned a lengthy note to the social network’s users, delivering the message that after 13 years of focusing on connecting friends and family, Facebook is taking the next step, with plans to develop the social infrastructure for a connected global community. Highlights follow:

Zuckerberg said in his introduction:

This is a time when many of us around the world are reflecting on how we can have the most positive impact. I am reminded of my favorite saying about technology: “We always overestimate what we can do in two years, and we underestimate what we can do in 10 years.” We may not have the power to create the world we want immediately, but we can all start working on the long term today. In times like these, the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us.

For the past decade, Facebook has focused on connecting friends and families. With that foundation, our next focus will be developing the social infrastructure for community—for supporting us, for keeping us safe, for informing us, for civic engagement and for inclusion of all.

He added that there are five important questions Facebook must answer:

  • How do we help people build supportive communities that strengthen traditional institutions in a world where membership in these institutions is declining?
  • How do we help people build a safe community that prevents harm, helps during crises and rebuilds afterwards in a world where anyone across the world can affect us?
  • How do we help people build an informed community that exposes us to new ideas and builds common understanding in a world where every person has a voice?
  • How do we help people build a civically engaged community in a world where participation in voting sometimes includes less than one-half of our population?
  • How do we help people build an inclusive community that reflects our collective values and common humanity from local to global levels, spanning cultures, nations and regions in a world with few examples of global communities?

On supportive communities, he shined the spotlight on Facebook’s groups feature, writing:

Online communities are a bright spot, and we can strengthen existing physical communities by helping people come together online as well as offline. In the same way connecting with friends online strengthens real relationships, developing this infrastructure will strengthen these communities, as well as enable completely new ones to form.

We recently found that more than 100 million people on Facebook are members of what we call “very meaningful” groups. These are groups that upon joining quickly become the most important part of our social network experience and an important part of our physical support structure. For example, many new parents tell us that joining a parenting group after having a child fits this purpose.

There is a real opportunity to connect more of us with groups that will be meaningful social infrastructure in our lives. More than 1 billion people are active members of Facebook groups, but most don’t seek out groups on their own—friends send invites or Facebook suggests them. If we can improve our suggestions and help connect 1 billion people with meaningful communities, that can strengthen our social fabric.

Going forward, we will measure Facebook’s progress with groups based on meaningful groups, not groups overall. This will require not only helping people connect with existing meaningful groups, but also enabling community leaders to create more meaningful groups for people to connect with.

Zuckerberg discussed Facebook’s Safety Check and Community Help features in his section on safe community:

To prevent harm, we can build social infrastructure to help our community identify problems before they happen. When someone is thinking of committing suicide or hurting themselves, we’ve built infrastructure to give their friends and community tools that could save their life. When a child goes missing, we’ve built infrastructure to show AMBER Alerts, and multiple children have been rescued without harm. And we’ve built infrastructure to work with public safety organizations around the world when we become aware of these issues. Going forward, there are even more cases where our community should be able to identify risks related to mental health, disease or crime.

To help during a crisis, we’ve built infrastructure like Safety Check so we can all let our friends know we’re safe and check on friends who might be affected by an attack or natural disaster. Safety Check has been activated almost 500 times in two years and has already notified people that their families and friends are safe more than 1 billion times. When there is a disaster, governments often call us to make sure Safety Check has been activated in their countries. But there is more to build. We recently added tools to find and offer shelter, food and other resources during emergencies. Over time, our community should be able to help during wars and ongoing issues that are not limited to a single event.

To rebuild after a crisis, we’ve built the world’s largest social infrastructure for collective action. A few years ago, after an earthquake in Nepal, the Facebook community raised $15 million to help people recover and rebuild, which was the largest crowdfunded relief effort in history. We saw a similar effort after the shooting at the Pulse nightclub in Orlando, Fla., when people across the country organized blood donations to help victims they had never met. Similarly, we built tools so that millions of people could commit to becoming organ donors to save others after accidents, and registries reported larger boosts in sign-ups than ever before.

Looking ahead, one of our greatest opportunities to keep people safe is building artificial intelligence to understand more quickly and accurately what is happening across our community.

There are billions of posts, comments and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events—like suicides, some livestreamed—that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day that our team must be alerted to before we can help out. These stories show we must find a way to do more.

In the informed community section, Zuckerberg wrote:

The two most discussed concerns this past year were about diversity of viewpoints we see (filter bubbles) and accuracy of information (fake news). I worry about these and we have studied them extensively, but I also worry that there are even more powerful effects we must mitigate around sensationalism and polarization leading to a loss of common understanding.

Accuracy of information is very important. We know there is misinformation and even outright hoax content on Facebook, and we take this very seriously. We’ve made progress fighting hoaxes the way we fight spam, but we have more work to do. We are proceeding carefully because there is not always a clear line between hoaxes, satire and opinion. In a free society, it’s important that people have the power to share their opinion, even if others think they’re wrong. Our approach will focus less on banning misinformation and more on surfacing additional perspectives and information, including that fact-checkers dispute an item’s accuracy.

We noticed that some people share stories based on sensational headlines without ever reading the story. In general, if you become less likely to share a story after reading it, that’s a good sign the headline was sensational. If you’re more likely to share a story after reading it, that’s often a sign of good in-depth content. We recently started reducing sensationalism in News Feed by taking this into account for pieces of content, and going forward, signals like this will identify sensational publishers, as well. There are many steps like this we have taken and will keep taking to reduce sensationalism and help build a more informed community.

He touted Facebook’s voter-registration efforts in the civically engaged community section of his note:

The starting point for civic engagement in the existing political process is to support voting across the world. It is striking that only about one-half of Americans eligible to vote participate in elections. This is low compared to other countries, but democracy is receding in many countries and there is a large opportunity across the world to encourage civic participation.

In the U.S. election last year, we helped more than 2 million people register to vote and then go vote. This was among the largest voter turnout efforts in history, and larger than those of both major parties combined. In every election around the world, we keep improving our tools to help more people register and vote, and we hope to eventually enable hundreds of millions more people to vote in elections than do today, in every democratic country around the world.

In India, Prime Minister Narendra Modi has asked his ministers to share their meetings and information on Facebook so they can hear direct feedback from citizens. In Kenya, whole villages are in WhatsApp groups together, including their representatives. In recent campaigns around the world—from India and Indonesia across Europe to the U.S.—we’ve seen that the candidate with the largest and most engaged following on Facebook usually wins. Just as TV became the primary medium for civic communication in the 1960s, social media is becoming this in the 21st century.

And in the inclusive community section, Zuckerberg wrote:

In the past year, the complexity of the issues we’ve seen has outstripped our existing processes for governing the community. We saw this in errors taking down newsworthy videos related to Black Lives Matter and police violence, and in removing the historical Terror of War photo from Vietnam. We’ve seen this in misclassifying hate speech in political debates in both directions—taking down accounts and content that should be left up and leaving up content that was hateful and should be taken down. Both the number of issues and their cultural importance have increased recently.

This has been painful for me because I often agree with those criticizing us that we’re making mistakes. These mistakes are almost never because we hold ideological positions at odds with the community, but instead are operational scaling issues. Our guiding philosophy for the Community Standards is to try to reflect the cultural norms of our community. When in doubt, we always favor giving people the power to share more.

We’re operating at such a large scale that even a small percent of errors causes a large number of bad experiences. We review more than 100 million pieces of content every month, and even if our reviewers get 99 percent of the calls right, that’s still millions of errors over time. Any system will always have some mistakes, but I believe we can do better than we are today.

With a broader range of controls, content will only be taken down if it is more objectionable than the most permissive options allow. Within that range, content should simply not be shown to anyone whose personal controls suggest they would not want to see it, or at least they should see a warning first. Although we will still block content based on standards and local laws, our hope is that this system of personal controls and democratic referenda should minimize restrictions on what we can share.

(Photo caption: Zuckerberg and Facebook chief product officer Chris Cox at Facebook’s internal quarterly all-company meeting.)