
Net Neutrality – A Day of Action 

Wed, 07/12/2017 - 20:40

Today, people and companies across the country are participating in a day of action to fight for net neutrality. Facebook is proud to be a part of it.

Net neutrality means a free and open internet for everyone. It ensures that internet service providers are not allowed to block or throttle internet traffic or discriminate against certain content.

The FCC has existing rules in place to protect net neutrality and ensure that anyone with an internet connection has a fair shot at turning an idea into something that can change the world. That would change if internet providers were allowed to decide what content their customers could access, or to charge customers more to access the websites and services of their choice.

The FCC’s current rules help prevent this from happening. We strongly support those rules, but the FCC’s new proposal could undo those protections. That’s why Facebook supports strong net neutrality rules that will keep the internet free and open.

We’re open to working with anyone, including members of Congress, on a solution that will preserve strong net neutrality protections. We hope you join us in this fight. To learn more about the day of action, visit iadayofaction.org.


Live From Facebook Spaces: A New Way to Share VR With Friends

Wed, 07/12/2017 - 18:00

By Mike Booth, Head of Product Management, Facebook Spaces

Thanks to the immersive power of virtual reality, Facebook Spaces lets you feel like you’re spending time with your friends in person — no matter where they are. Starting today, you can share live video on Facebook from Facebook Spaces to give the people you care about a window into your VR world. Bring more friends along for the ride!

Facebook Live is already one of the most immediate and interactive ways to share moments with friends. By going live from Facebook Spaces, you can share a whole new kind of moment with friends and family directly from VR. Whether you’re touring exotic locations across the globe in 360, collaborating on a virtual marker masterpiece, or riffing on a viral video, the people who matter most to you can now follow along in real time on Facebook.

When you go live from Facebook Spaces, you’ll have a virtual camera that you can position anywhere in your space to capture the action. Friends on Facebook can comment on your broadcast and ask you questions to participate in the moment with you — and you can even see their reactions in VR. You’ll see a stream of friends’ comments and can pull out your favorites as physical objects that everyone in the space can interact with — a great way to highlight compelling questions and clever one-liners from your friends.

Live from Facebook Spaces opens up the fun of VR and lets everyone join the experience. Along with Messenger video calling and selfies, it’s an easy way to share your VR experiences and create lasting memories with everyone you care about. We’re excited to see how people go live from Spaces to interact with friends in new ways — and this is only the beginning, as we’ll continue to add new features to the experience.

Check out Facebook Spaces in beta on the Oculus Store, and try your first live broadcast today. If you don’t have a Rift yet, now is the perfect time to jump in! Thanks to the Oculus Summer of Rift sale, Rift + Touch are now available at $399 for a limited time.


Investing in Menlo Park and the Community

Fri, 07/07/2017 - 17:00

By John Tenanes, VP Global Facilities and Real Estate

We found a home when we moved to Menlo Park in 2011. We are part of this community, and being here makes it possible for us to work on our mission to bring the world closer together.

That’s why we plan to keep investing in this community. When we first expanded beyond our original campus, we looked no further than across the street. Frank Gehry helped us design that building, which we call MPK20. Our presence has expanded further since then, and we are now planning to redevelop the former Menlo Science & Technology Park, which we intend to call Willow Campus.

Working with the community, our goal for the Willow Campus is to create an integrated, mixed-use village that will provide much-needed services, housing and transit solutions as well as office space. Part of our vision is to create a neighborhood center that provides long-needed community services. We plan to build 125,000 square feet of new retail space, including a grocery store, pharmacy and additional community-facing retail.

The first official step will be the filing of our plan with Menlo Park in July 2017. We will begin more formal conversations with local government officials and community organizations over the course of the review process, which we expect to last approximately two years. We envision construction following in phases: the first, including the grocery store, retail, housing and office space, should be completed in early 2021, with subsequent phases taking about two years each.

Housing is also critically important to these efforts. We hope to contribute significantly to the housing supply by building 1,500 units of housing on the campus, 15% of which will be offered at below-market rates. This added on-site housing should also mitigate traffic impacts from growth. These efforts complement our ongoing work on the issue, including the Catalyst Housing Fund, which we established in partnership with community groups to fund affordable housing in our local area. The fund was initiated last year with an initial investment of $18.5 million that we hope will grow.

The region has failed to invest in transportation infrastructure alongside its growth, leading to congestion and delays. Willow Campus will be an opportunity to catalyze regional transit investment by providing planned density sufficient to support new east-west connections and a future transit center. We’re investing tens of millions of dollars to improve US 101.

Construction will generate an array of jobs, and we’re planning to help local workers access those opportunities. The site will be developed in two phases designed to bring office, housing and retail online in tandem.

Our hope is to create a physical space that supports our community and builds on our existing programs. We’ve hosted tens of thousands of community members at farmers’ markets and events, and partnered with nonprofits like Rebuilding Together Peninsula to rehabilitate local homes. We’ve also enrolled local high school students from East Palo Alto, Belle Haven and Redwood City in our six-week summer internship program.

This is only the beginning. Going forward, we plan to continue to work closely with local leaders and community members to ensure Facebook’s presence is a benefit to the community. It’s one we’re lucky to call home.

Our design partner in imagining the campus is OMA New York. We have worked with them to prepare a video describing our vision and hope for integrating more closely with our community.


News Feed FYI: Showing More Informative Links in News Feed

Fri, 06/30/2017 - 20:00

By Adam Mosseri, VP, News Feed

Today we are making an update to help reduce low-quality links in News Feed. We are always working to improve people’s experience in News Feed by showing more stories that we think people will find informative and entertaining.

Our research shows that there is a tiny group of people on Facebook who routinely share vast numbers of public posts per day, effectively spamming people’s feeds. Our research further shows that the links they share tend to include low-quality content such as clickbait, sensationalism and misinformation. As a result, we want to reduce the influence of these spammers and deprioritize the links they share far more frequently than regular sharers do. Of course, this is only one of many signals that may affect how such a post is ranked. This update applies only to links, such as an individual article, not to domains, Pages, videos, photos, check-ins or status updates.
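As a rough illustration of how a demotion signal like this could feed into ranking, here is a minimal sketch; the threshold, names and multiplier are illustrative assumptions, not Facebook’s actual ranking code:

    # Hypothetical "hyperactive sharer" signal; all values are illustrative.
    SHARES_PER_DAY_THRESHOLD = 50.0  # assumed cutoff for "vast numbers of public posts"

    def spam_sharer_penalty(public_posts_per_day: float) -> float:
        """Return a multiplier (< 1.0 demotes) for links shared by hyperactive accounts."""
        return 0.5 if public_posts_per_day > SHARES_PER_DAY_THRESHOLD else 1.0

    def rank_score(base_score: float, story_type: str, sharer_posts_per_day: float) -> float:
        # The update applies only to individual links; other story types
        # (domains, Pages, videos, photos, check-ins, status updates) are untouched.
        if story_type == "link":
            return base_score * spam_sharer_penalty(sharer_posts_per_day)
        return base_score

In this sketch the penalty is just one multiplicative factor in the score, consistent with the post’s point that this is only one signal among many.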

One of our core News Feed values is that News Feed should be informative. By taking steps like this to improve News Feed, we’re able to surface more stories that people find informative and reduce the spread of problematic links such as clickbait, sensationalism and misinformation.

Will This Impact My Page?
Most publishers won’t see any significant changes to their distribution in News Feed. Publishers that get meaningful distribution from people who routinely share vast numbers of public posts per day may see a reduction in the distribution of those specific links. As always, publishers should keep these basic guideposts in mind to reach their audience on Facebook, and continue to post stories that are relevant to their audiences and that readers find informative.


Expanding Find Wi-Fi Globally

Fri, 06/30/2017 - 18:40

By Alex Himel, Engineering Director

Today we’re beginning to roll out Find Wi-Fi everywhere in the world on iPhone and Android. We launched Find Wi-Fi in a handful of countries last year and found it’s not only helpful for people who are traveling or on the go, but especially useful in areas where cellular data is scarce.

Find Wi-Fi helps you locate nearby Wi-Fi hotspots that businesses have shared with Facebook from their Pages. So wherever you are, you can easily map the closest connections when your data connection is weak.

To find Wi-Fi hotspots, open your Facebook app, tap the “More” tab and then “Find Wi-Fi.” Once in the “Find Wi-Fi” tab, you may need to turn the feature on. You can then browse the closest available hotspots on a map and learn more about the businesses hosting them.


Metrics Release: New Ways to Better Understand Interactions

Thu, 06/29/2017 - 18:00

Great campaigns are powered by great insights. As people interact with businesses in new ways, marketers need to better understand the actions driven by their online presence. We’ve heard feedback from businesses that they want more transparency and understanding around their Facebook performance. As part of our commitment to measurement, roughly every month we’ll release new metrics so that businesses have better ways to measure outcomes, all in one place. We’ll begin the series of metrics updates with features that capture new kinds of interactions with your ads or Page.

More Visibility On Ad Interactions

Getting visitors to your website or app greatly expands the potential for new customers. But slow-loading mobile sites or poor connections quickly cause many people to lose interest while waiting for a page to load after clicking on an ad. To give you a better sense of the number of visitors who arrive at your website after clicking a link in an ad, we’re beginning to roll out a new metric called landing page views. This new metric will help businesses see the importance of optimizing for a better mobile web experience. Businesses will be able to choose to optimize for landing page views when they use the traffic objective, finding more people who will actually arrive on their landing page after clicking on their ad.

We’ve also heard that businesses want more clarity around whether someone who clicks on an ad is a new or returning customer. The pre-impression activity breakdown is a new metric we’re introducing over the coming weeks that shows the number of people who have previously engaged with an advertiser’s website or app versus new visitors. We make this determination based on whether, in recent weeks, a site fired a pixel or an app triggered an event associated with the business. The pre-impression breakdown is particularly helpful for businesses running dynamic ads for broad audiences, where the audiences expand beyond their own customers and ad creatives are generated dynamically based on associated product recommendations.
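A minimal sketch of that new-versus-returning check, assuming a four-week lookback window and per-person event timestamps (both assumptions for illustration, not the actual measurement pipeline):

    from datetime import datetime, timedelta

    LOOKBACK = timedelta(weeks=4)  # assumed "recent weeks" window

    def is_returning(business_events: list[datetime], impression_time: datetime) -> bool:
        """A viewer counts as returning if any pixel fire or app event tied to
        the business happened inside the lookback window before the impression."""
        return any(impression_time - LOOKBACK <= t < impression_time
                   for t in business_events)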

New Reporting On Page Interactions

We’re introducing three new reporting metrics to give Page owners a more complete understanding of how people learn about and interact with their businesses. These metrics will roll out to Pages over the coming weeks and can be viewed within the “overview” tab of Page Insights on desktop.

  • Follows: rather than only showing the total number of follows a business has, we’re now breaking out the number of follows a Page gains or loses over time, insights into where follows happen, follower demographics, and a breakdown of organic and paid follows. You can visit the Help Center for a refresher on understanding follows and likes.
  • Previews: people may come across your Page’s information without actually clicking on the Page. We’ll now show you the number of people who saw your Page’s information when they hovered over your Page’s name on desktop.
  • Recommendations: people are already using recommendations as a way to get advice from friends, family, and local Groups on Facebook. We’ll now start showing the number of times a Page has been included in a recommendation from friends and family.

We’ll continue to surface more metrics over the coming months as we get feedback and uncover new ways to provide actionable insights.


Discover Bots and Businesses in Messenger

Wed, 06/28/2017 - 18:00

by Yingming Chen, Engineer, Messenger

We first announced our new Discover section at F8 to help people find the amazing experiences developers have built, and the businesses people can interact with, on Messenger. Since that announcement, we’ve been working to make it more intuitive and relevant for you. Today we’re excited to announce v1.1 of Discover, which enables people to browse and find bots and businesses in Messenger, starting to roll out today for people in the U.S.

Here’s how Discover works: when you tap the Discover icon in the lower right-hand corner of the Messenger home screen, you can browse by category, recently visited businesses and featured experiences. Discover makes it even easier to get things done right in Messenger, from reading the latest articles to booking your next vacation to catching the latest sports highlights. In addition to this full rollout to U.S. consumers, we’ve also updated the units that appear in Discover, showcasing the many ways you can interact with businesses, get your questions answered and find the information you want.

Here’s what you’ll find in Discover:

Recently Used: Shows you the bots and businesses you recently interacted with.

Featured: A representation of the full range of experiences available in Messenger. Helps people find bots and businesses to explore.

Categories: Bots and businesses organized by topic. Refreshed frequently so you can find new experiences.

Our goal with Discover is to ensure that experiences in Messenger are compelling, high quality and easy to find. This latest update makes it even more intuitive for people to find what they care about most. And be sure to keep coming back – new experiences are always added!

For developers and businesses interested in getting their experiences added to the Discover section, please go here.


Two Billion People Coming Together on Facebook

Tue, 06/27/2017 - 19:08

By Mike Nowak, Product Director, and Guillermo Spiller, Product Manager

As Mark Zuckerberg announced today, we reached a new milestone: there are now 2 billion people connecting and building communities on Facebook every month.

This wouldn’t have happened without the millions of smaller communities and individuals who are sharing and making meaningful contributions every day. Each day, more than 175 million people share a Love reaction, and on average, over 800 million people like something on Facebook. More than 1 billion people use Groups every month.

To show our appreciation for the many ways people support one another on Facebook, we will share several personalized experiences over the coming days.

Good Adds Up Video

We are launching a personalized video to celebrate bringing the world closer together. You may see your video in your News Feed or by visiting facebook.com/goodaddsup.

Celebrating the Good People Do

After someone reacts to a friend’s post with Love, wishes someone happy birthday or creates a group, they will see a message in News Feed thanking them.

Sharing Community Stories and Impact

On facebook.com/goodaddsup, we are featuring fun facts about how people are contributing to the community. In the US, we are also sharing stories of people who inspire us. Every day, people connect with one another, contribute to their local communities and help make the world a better place.

We want to help do our part as well. As Mark mentioned last week at the Facebook Communities Summit, our mission is to bring the world closer together. Reaching this milestone is just one small step toward that goal. We are excited to continue to build products that allow people to connect with one another, regardless of where they live or what language they speak.

Thank you for being part of our global community!


Hard Questions: Hate Speech

Tue, 06/27/2017 - 14:00

Who should decide what is hate speech in an online global community?
By Richard Allan, VP EMEA Public Policy

As more and more communication takes place in digital form, the full range of public conversations is moving online — in groups and broadcasts, in text and video, even with emoji. These discussions reflect the diversity of human experience: some are enlightening and informative, others are humorous and entertaining, and others still are political or religious. Some can also be hateful and ugly. Most responsible communications platforms and systems are now working hard to restrict this kind of hateful content.

Facebook is no exception. We are an open platform for all ideas, a place where we want to encourage self-expression, connection and sharing. At the same time, when people come to Facebook, we always want them to feel welcome and safe. That’s why we have rules against bullying, harassing and threatening someone.

But what happens when someone expresses a hateful idea online without naming a specific person? A post that calls all people of a certain race “violent animals” or describes people of a certain sexual orientation as “disgusting” can feel very personal and, depending on someone’s experiences, could even feel dangerous. In many countries around the world, those kinds of attacks are known as hate speech. We are opposed to hate speech in all its forms, and don’t allow it on our platform.

In this post we want to explain how we define hate speech and approach removing it — as well as some of the complexities that arise when it comes to setting limits on speech at a global scale, in dozens of languages, across many cultures. Our approach, like those of other platforms, has evolved over time and continues to change as we learn from our community, from experts in the field, and as technology provides us new tools to operate more quickly, more accurately and precisely at scale.

Defining Hate Speech

The first challenge in stopping hate speech is defining its boundaries.

People come to Facebook to share their experiences and opinions, and topics like gender, nationality, ethnicity and other personal characteristics are often a part of that discussion. People might disagree about the wisdom of a country’s foreign policy or the morality of certain religious teachings, and we want them to be able to debate those issues on Facebook. But when does something cross the line into hate speech?

Our current definition of hate speech is anything that directly attacks people based on what are known as their “protected characteristics” — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease.

There is no universally accepted answer for when something crosses the line. Although a number of countries have laws against hate speech, their definitions of it vary significantly.

In Germany, for example, laws forbid incitement to hatred; you could find yourself the subject of a police raid if you post such content online. In the US, on the other hand, even the most vile kinds of speech are legally protected under the US Constitution.

People who live in the same country — or next door — often have different levels of tolerance for speech about protected characteristics. To some, crude humor about a religious leader can be considered both blasphemy and hate speech against all followers of that faith. To others, a battle of gender-based insults may be a mutually enjoyable way of sharing a laugh. Is it OK for a person to post negative things about people of a certain nationality as long as they share that same nationality? What if a young person who refers to an ethnic group using a racial slur is quoting from lyrics of a song?

There is very important academic work in this area that we follow closely. Timothy Garton Ash, for example, has created the Free Speech Debate to look at these issues on a cross-cultural basis. Susan Benesch established the Dangerous Speech Project, which investigates the connection between speech and violence. These projects show how much work is left to be done in defining the boundaries of speech online, which is why we’ll keep participating in this work to help inform our policies at Facebook.

Enforcement

We’re committed to removing hate speech any time we become aware of it. Over the last two months, on average, we deleted around 66,000 posts reported as hate speech per week — that’s around 288,000 posts a month globally. (This includes posts that may have been reported for hate speech but deleted for other reasons, although it doesn’t include posts reported for other reasons but deleted for hate speech.*)
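The weekly and monthly figures are consistent once converted, since an average month is about 52/12 ≈ 4.33 weeks:

    weekly_deletions = 66_000
    weeks_per_month = 52 / 12                         # about 4.33
    print(round(weekly_deletions * weeks_per_month))  # 286000, in line with the rounded ~288,000 cited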

But it’s clear we’re not perfect when it comes to enforcing our policy. Often there are close calls — and too often we get it wrong.

Sometimes, it’s obvious that something is hate speech and should be removed – because it includes the direct incitement of violence against protected characteristics, or degrades or dehumanizes people. If we identify credible threats of imminent violence against anyone, including threats based on a protected characteristic, we also escalate that to local law enforcement.

But sometimes, there isn’t a clear consensus — because the words themselves are ambiguous, the intent behind them is unknown or the context around them is unclear. Language also continues to evolve, and a word that was not a slur yesterday may become one today.

Here are some of the things we take into consideration when deciding what to leave on the site and what to remove.

Context

What does the statement “burn flags not fags” mean? While this is clearly a provocative statement on its face, should it be considered hate speech? For example, is it an attack on gay people, or an attempt to “reclaim” the slur? Is it an incitement of political protest through flag burning? Or, if the speaker or audience is British, is it an effort to discourage people from smoking cigarettes (fag being a common British term for cigarette)? To know whether it’s a hate speech violation, more context is needed.

Often the most difficult edge cases involve language that seems designed to provoke strong feelings, making the discussion even more heated — and a dispassionate look at the context (like country of speaker or audience) more important. Regional and linguistic context is often critical, as is the need to take geopolitical events into account. In Myanmar, for example, the word “kalar” has benign historic roots, and is still used innocuously across many related Burmese words. The term can however also be used as an inflammatory slur, including as an attack by Buddhist nationalists against Muslims. We looked at the way the word’s use was evolving, and decided our policy should be to remove it as hate speech when used to attack a person or group, but not in the other harmless use cases. We’ve had trouble enforcing this policy correctly recently, mainly due to the challenges of understanding the context; after further examination, we’ve been able to get it right. But we expect this to be a long-term challenge.

In Russia and Ukraine, we faced a similar issue around the use of slang words the two groups have long used to describe each other. Ukrainians call Russians “moskal,” literally “Muscovites,” and Russians call Ukrainians “khokhol,” literally “topknot.” After conflict started in the region in 2014, people in both countries started to report the words used by the other side as hate speech. We did an internal review and concluded that they were right. We began taking both terms down, a decision that was initially unpopular on both sides because it seemed restrictive, but in the context of the conflict felt important to us.

Often a policy debate becomes a debate over hate speech, as two sides adopt inflammatory language. This is often the case with the immigration debate, whether it’s about the Rohingya in South East Asia, the refugee influx in Europe or immigration in the US. This presents a unique dilemma: on the one hand, we don’t want to stifle important policy conversations about how countries decide who can and can’t cross their borders. At the same time, we know that the discussion is often hurtful and insulting.

When the influx of migrants arriving in Germany increased in recent years, we received feedback that some posts on Facebook were directly threatening refugees or migrants. We investigated how this material appeared globally and decided to develop new guidelines to remove calls for violence against migrants or dehumanizing references to them — such as comparisons to animals, to filth or to trash. But we have left in place the ability for people to express their views on immigration itself. And we are deeply committed to making sure Facebook remains a place for legitimate debate.

Intent

People’s posts on Facebook exist in the larger context of their social relationships with friends. When a post is flagged for violating our policies on hate speech, we don’t have that context, so we can only judge it based on the specific text or images shared. But the context can indicate a person’s intent, which can come into play when something is reported as hate speech.

There are times someone might share something that would otherwise be considered hate speech but for non-hateful reasons, such as making a self-deprecating joke or quoting lyrics from a song. People often use satire and comedy to make a point about hate speech.

Or they speak out against hatred by condemning someone else’s use of offensive language, which requires repeating the original offense. This is something we allow, even though it might seem questionable since it means some people may encounter material disturbing to them. But it also gives our community the chance to speak out against hateful ideas. We revised our Community Standards to encourage people to make it clear when they’re sharing something to condemn it, but sometimes their intent isn’t clear, and anti-hatred posts get removed in error.

On other occasions, people may reclaim offensive terms that were used to attack them. When someone uses an offensive term in a self-referential way, it can feel very different from when the same term is used to attack them. For example, the use of the word “dyke” may be considered hate speech when directed as an attack on someone on the basis of the fact that they are gay. However, if someone posted a photo of themselves with #dyke, it would be allowed. Another example is the word “faggot.” This word could be considered hate speech when directed at a person, but, in Italy, among other places, “frocio” (“faggot”) is used by LGBT activists to denounce homophobia and reclaim the word. In these cases, removing the content would mean restricting someone’s ability to express themselves on Facebook.

Mistakes

If we fail to remove content that you report because you think it is hate speech, it feels like we’re not living up to the values in our Community Standards. When we remove something you posted and believe is a reasonable political view, it can feel like censorship. We know how strongly people feel when we make such mistakes, and we’re constantly working to improve our processes and explain things more fully.

Our mistakes have caused a great deal of concern in a number of communities, including among groups who feel we act — or fail to act — out of bias. We are deeply committed to addressing and confronting bias anywhere it may exist. At the same time, we work to fix our mistakes quickly when they happen.

Last year, Shaun King, a prominent African-American activist, posted hate mail he had received that included vulgar slurs. We took down Mr. King’s post in error — not recognizing at first that it was shared to condemn the attack. When we were alerted to the mistake, we restored the post and apologized. Still, we know that these kinds of mistakes are deeply upsetting for the people involved and cut against the grain of everything we are trying to achieve at Facebook.

Continuing To Improve

People often ask: can’t artificial intelligence solve this? Technology will continue to be an important part of how we try to improve. We are, for example, experimenting with ways to filter the most obviously toxic language in comments so that it is hidden from posts. But while we’re continuing to invest in these promising advances, we’re a long way from being able to rely on machine learning and AI to handle the complexity involved in assessing hate speech.

That’s why we rely so heavily on our community to identify and report potential hate speech. With billions of posts on our platform — and with the need for context in order to assess the meaning and intent of reported posts — there’s not yet a perfect tool or system that can reliably find and distinguish posts that cross the line from expressive opinion into unacceptable hate speech. Our model builds on the eyes and ears of everyone on the platform — the people who vigilantly report millions of posts to us each week for all sorts of potential violations. We then have our teams of reviewers, who have broad language expertise and work 24 hours a day across time zones, to apply our hate speech policies.

We’re building up these teams that deal with reported content: over the next year, we’ll add 3,000 people to our community operations team around the world, on top of the 4,500 we have today. We’ll keep learning more about local context and changing language. And, because measurement and reporting are an important part of our response to hate speech, we’re working on better ways to capture and share meaningful data with the public.

Managing a global community in this manner has never been done before, and we know we have a lot more work to do. We are committed to improving — not just when it comes to individual posts, but in how we discuss and explain our choices and policies as a whole.

Read more about our new blog series Hard Questions. We want your input on what other topics we should address — and what we could be doing better. Please send suggestions to hardquestions@fb.com.

*What’s in the numbers:

  • These numbers represent an average from April and May 2017.
  • These numbers reflect content that was reported for hate speech and subsequently deleted, whatever the reason.
  • The numbers are specific to reports on individual posts on Facebook.
    • These numbers do not include hate speech deleted from Instagram.
    • These numbers do not include hate speech that was deleted because an entire page, group or profile was taken down or disabled. This means we could be drastically undercounting because a hateful group may contain many individual items of hate speech.
    • These numbers do not include hate speech that was reported for other reasons.
      • For example, outrageous statements can be used to get people to click on spam links; under our current definitions, if such a post is reported for spam, we do not track it as hate speech.
      • For example, if a post was reported for nudity or bullying, but deleted for hate speech, it would not be counted in these numbers.
    • These numbers might include content that was reported for hate, but deleted for other reasons.
      • For example, if a post was reported for hate speech, but deleted for nudity or bullying, it would be counted in these numbers.
    • These numbers also contain instances when we may have taken down content mistakenly.
  • The numbers vary dramatically over time due to offline events (like the aftermath of a terror attack) or online events (like a spam attack).
  • We are exploring a better process by which to log our reports and removals, for more meaningful and accurate data.

Facebook, Microsoft, Twitter and YouTube Announce Formation of the Global Internet Forum to Counter Terrorism

Mon, 06/26/2017 - 19:30

Today, Facebook, Microsoft, Twitter and YouTube are announcing the formation of the Global Internet Forum to Counter Terrorism, which will help us continue to make our hosted consumer services hostile to terrorists and violent extremists.

The spread of terrorism and violent extremism is a pressing global problem and a critical challenge for us all. We take these issues very seriously, and each of our companies has developed policies and removal practices that enable us to take a hard line against terrorist or violent extremist content on our hosted consumer services. We believe that by working together and sharing the best technological and operational elements of our individual efforts, we can have a greater impact on the threat of terrorist content online.

The new forum builds on initiatives including the EU Internet Forum and the Shared Industry Hash Database; discussions with the UK and other governments; and the conclusions of the recent G7 and European Council meetings. It will formalize and structure existing and future areas of collaboration between our companies and foster cooperation with smaller tech companies, civil society groups and academics, governments and supra-national bodies such as the EU and the UN.  

The scope of our work will evolve over time as we respond to ever-evolving terrorist and extremist tactics. Initially, however, our work will focus on:

  1. Technological solutions: Our companies will work together to refine and improve existing joint technical work, such as the Shared Industry Hash Database; exchange best practices as we develop and implement new content detection and classification techniques using machine learning; and define standard transparency reporting methods for terrorist content removals.
  2. Research: We will commission research to inform our counter-speech efforts and guide future technical and policy decisions around the removal of terrorist content.
  3. Knowledge-sharing: We will work with counter-terrorism experts including governments, civil society groups, academics and other companies to engage in shared learning about terrorism. And through a joint partnership with the UN Security Council Counter-Terrorism Executive Directorate (UN CTED) and the ICT4Peace Initiative, we are establishing a broad knowledge-sharing network to:
    • Engage with smaller companies: We will help them develop the technology and processes necessary to tackle terrorist and extremist content online.
    • Develop best practices: We already partner with organizations such as the Center for Strategic and International Studies, Anti-Defamation League and Global Network Initiative to identify how best to counter extremism and online hate while respecting freedom of expression and privacy. We can socialize these best practices and develop additional shared learnings on topics such as community guideline development and policy enforcement.
    • Counter-speech: Each of us already has robust counter-speech initiatives in place (e.g., YouTube’s Creators for Change, Jigsaw’s Redirect Method, Facebook’s P2P and OCCI, Microsoft’s partnership with the Institute for Strategic Dialogue for counter-narratives on Bing, Twitter’s global NGO training program). The forum we have established allows us to learn from and contribute to one another’s counter-speech efforts, discuss how to further empower and train civil society organizations and individuals engaged in similar work, and support ongoing efforts such as the Civil Society Empowerment Project (CSEP).

We will be hosting a series of learning workshops in partnership with UN CTED/ICT4Peace in Silicon Valley and around the world to drive these areas of collaboration.

Further information on all of the above initiatives will be shared in due course.

 


Our First Communities Summit and New Tools For Group Admins

Thu, 06/22/2017 - 18:12

By Kang-Xing Jin, VP, Engineering

Today we hosted our first-ever Facebook Communities Summit in Chicago with hundreds of group admins, where we announced new features to support their communities on Facebook.

Mark Zuckerberg kicked off by celebrating the role Groups play in the Facebook community and thanking the group admins who lead them. He also announced a new mission for Facebook that will guide our work over the next decade: Give people the power to build community and bring the world closer together.

An important part of delivering on our new mission is supporting group admins, who are real community leaders on Facebook. We’re adding several new features to help them grow and manage their groups:

  • Group Insights: group admins have told us consistently that a better understanding of what’s going on in their groups would help them decide how best to support their members. Now, with Group Insights, they’ll be able to see real-time metrics around growth, engagement and membership — such as the number of posts, and the times of day when members are most engaged.
  • Membership request filtering: we also hear from admins that admitting new members is one of the most time-consuming things they do. So we’ve added a way for them to sort and filter membership requests by common categories like gender and location, and then accept or decline them all at once.
  • Removed member clean-up: to help keep communities safe from bad actors, group admins can now remove a person in one step, along with the content they’ve created within the group, including posts, comments and other people they added to the group.
  • Scheduled posts: group admins and moderators can create posts and conveniently schedule them for a specific day and time.
  • Group to group linking: we’re beginning to test group-to-group linking, which allows group admins to recommend similar or related groups to their members. This is just the beginning of ways that we’re helping bring communities and sub-communities closer together.

More than 1 billion people around the world use Groups, and more than 100 million people are members of “meaningful groups.” These are groups that quickly become the most important part of someone’s experience on Facebook. Today we’re setting a goal to help 1 billion people join meaningful communities like these.

In Chicago, we celebrated some of these groups built around local neighborhoods, shared passions and life experiences. For example, the groups and admins that attended included:

  • Terri Hendricks, who started Lady Bikers of California so that women who ride motorcycles could connect with each other, meet in real life through group rides, and offer each other both motorcycle-related and personal support. Terri says that when she started riding motorcycles it was rare to see other women who rode and that across the group, there is “nothing that these ladies wouldn’t do for each other.”
  • Matthew Mendoza, who started Affected by Addiction Support Group. The group is a safe space for people who are experiencing or recovering from drug and alcohol addiction, as well as their friends and family, to offer support and share stories.
  • Kenneth Goodwin, minister of Bethel Church in Decatur, Georgia, who uses the Bethel Original Free Will Baptist Church group to post announcements to the local community about everything happening at Bethel. He and the other admins will often share information about events, meeting times for their small group ministries, and live videos of sermons so people who cannot attend can watch from their homes.

We’re inspired by these stories and the hundreds of others we’ve heard from people attending today’s event. We’re planning more events to bring together group admins outside the US and look forward to sharing more details soon.


Giving People More Control Over Their Facebook Profile Picture

Thu, 06/22/2017 - 04:00

By Aarati Soman, Product Manager

Part of our goal in building global community is understanding the needs of people who use Facebook in specific countries and how we can better serve them. In India, we’ve heard that people want more control over their profile pictures, and we’ve been working over the past year to understand how we can help.

Today, we are piloting new tools that give people in India more control over who can download and share their profile pictures. In addition, we’re exploring ways people can more easily add designs to profile pictures, which our research has shown to be helpful in deterring misuse. Based on what we learn from our experience in India, we hope to expand these tools to other countries soon.

Profile pictures are an important part of building community on Facebook because they help people find friends and create meaningful connections. But not everyone feels safe adding a profile picture. In our research with people and safety organizations in India, we’ve heard that some women choose not to share profile pictures that include their faces anywhere on the internet because they’re concerned about what may happen to their photos.

These tools, developed in partnership with Indian safety organizations like Centre for Social Research, Learning Links Foundation, Breakthrough and Youth Ki Awaaz, are designed to give people more control over their experience and help keep them safe online.

New Controls

People in India will start seeing a step-by-step guide to add an optional profile picture guard. When you add this guard:

  • Other people will no longer be able to download, share or send your profile picture in a message on Facebook
  • People you’re not friends with on Facebook won’t be able to tag anyone, including themselves, in your profile picture
  • Where possible, we’ll prevent others from taking a screenshot of your profile picture on Facebook; this protection is currently available only on Android devices
  • We’ll display a blue border and shield around your profile picture as a visual cue of protection

Deterring Misuse

Based on preliminary tests, we’ve learned that when someone adds an extra design layer to their profile picture, other people are at least 75% less likely to copy that picture.

We partnered with Jessica Singh, an illustrator who took inspiration from traditional Indian textile designs such as bandhani and kantha, to create designs for people to add to their profile picture.

If someone suspects that a picture marked with one of these designs is being misused, they can report it to Facebook and we will use the design to help determine whether it should be removed from our community.


Hard Questions: How We Counter Terrorism

Thu, 06/15/2017 - 19:00

By Monika Bickert, Director of Global Policy Management, and Brian Fishman, Counterterrorism Policy Manager

In the wake of recent terror attacks, people have questioned the role of tech companies in fighting terrorism online. We want to answer those questions head on. We agree with those who say that social media should not be a place where terrorists have a voice. We want to be very clear how seriously we take this — keeping our community safe on Facebook is critical to our mission.

In this post, we’ll walk through some of our behind-the-scenes work, including how we use artificial intelligence to keep terrorist content off Facebook, something we have not talked about publicly before. We will also discuss the people who work on counterterrorism, some of whom have spent their entire careers combating terrorism, and the ways we collaborate with partners outside our company.

Our stance is simple: There’s no place on Facebook for terrorism. We remove terrorists and posts that support terrorism whenever we become aware of them. When we receive reports of potential terrorism posts, we review those reports urgently and with scrutiny. And in the rare cases when we uncover evidence of imminent harm, we promptly inform authorities. Although academic research finds that the radicalization of members of groups like ISIS and Al Qaeda primarily occurs offline, we know that the internet does play a role — and we don’t want Facebook to be used for any terrorist activity whatsoever.

We believe technology, and Facebook, can be part of the solution.

We’ve been cautious, in part because we don’t want to suggest there is any easy technical fix. It is an enormous challenge to keep people safe on a platform used by nearly 2 billion people every month, posting and commenting in more than 80 languages in every corner of the globe. And there is much more for us to do. But we do want to share what we are working on and hear your feedback so we can do better.

Artificial Intelligence

We want to find terrorist content immediately, before people in our community have seen it. Already, the majority of accounts we remove for terrorism are ones we find ourselves. But we know we can do better at using technology — and specifically artificial intelligence — to stop the spread of terrorist content on Facebook. Although our use of AI against terrorism is fairly recent, it’s already changing the ways we keep potential terrorist propaganda and accounts off Facebook. We are currently focusing our most cutting-edge techniques on combating terrorist content about ISIS, Al Qaeda and their affiliates, and we expect to expand to other terrorist organizations in due course. We are constantly updating our technical solutions, but here are some of our current efforts.

  • Image matching: When someone tries to upload a terrorist photo or video, our systems look for whether the image matches a known terrorism photo or video (see the sketch after this list). This means that if we previously removed a propaganda video from ISIS, we can work to prevent other accounts from uploading the same video to our site. In many cases, this means that terrorist content intended for upload to Facebook simply never reaches the platform.
  • Language understanding: We have also recently started to experiment with using AI to understand text that might be advocating for terrorism. We’re currently experimenting with analyzing text that we’ve already removed for praising or supporting terrorist organizations such as ISIS and Al Qaeda so we can develop text-based signals that such content may be terrorist propaganda. That analysis goes into an algorithm that is in the early stages of learning how to detect similar posts. The machine learning algorithms work on a feedback loop and get better over time.
  • Removing terrorist clusters: We know from studies of terrorists that they tend to radicalize and operate in clusters. This offline trend is reflected online as well. So when we identify Pages, groups, posts or profiles as supporting terrorism, we also use algorithms to “fan out” to try to identify related material that may also support terrorism. We use signals like whether an account is friends with a high number of accounts that have been disabled for terrorism, or whether an account shares the same attributes as a disabled account.
  • Recidivism: We’ve also gotten much faster at detecting new fake accounts created by repeat offenders. Through this work, we’ve been able to dramatically reduce the time period that terrorist recidivist accounts are on Facebook. This work is never finished because it is adversarial, and the terrorists are continuously evolving their methods too. We’re constantly identifying new ways that terrorist actors try to circumvent our systems — and we update our tactics accordingly.
  • Cross-platform collaboration: Because we don’t want terrorists to have a place anywhere in the family of Facebook apps, we have begun work on systems to enable us to take action against terrorist accounts across all our platforms, including WhatsApp and Instagram. Given the limited data some of our apps collect as part of their service, the ability to share data across the whole family is indispensable to our efforts to keep all our platforms safe.
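As a rough sketch of the image-matching idea above, the flow can be thought of as a lookup against a store of fingerprints of previously removed images. Facebook hasn’t published the details of its matcher; a production system would use perceptual hashes that survive re-encoding and cropping, so the exact-match hash below is a simplifying assumption:

    import hashlib

    # Fingerprints of images previously removed as terrorist propaganda.
    known_terrorism_hashes: set[str] = set()

    def fingerprint(image_bytes: bytes) -> str:
        # Simplifying assumption: an exact cryptographic hash stands in for
        # a perceptual hash that tolerates small visual changes.
        return hashlib.sha256(image_bytes).hexdigest()

    def register_removed_image(image_bytes: bytes) -> None:
        """Record a removed image so that re-uploads can be matched later."""
        known_terrorism_hashes.add(fingerprint(image_bytes))

    def should_block_upload(image_bytes: bytes) -> bool:
        """Check an upload against known material before it reaches the platform."""
        return fingerprint(image_bytes) in known_terrorism_hashes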

Human Expertise

AI can’t catch everything. Figuring out what supports terrorism and what does not isn’t always straightforward, and algorithms are not yet as good as people when it comes to understanding this kind of context. A photo of an armed man waving an ISIS flag might be propaganda or recruiting material, but could be an image in a news story. Some of the most effective criticisms of brutal groups like ISIS utilize the group’s own propaganda against it. To understand more nuanced cases, we need human expertise.

  • Reports and reviews: Our community — that’s the people on Facebook — helps us by reporting accounts or content that may violate our policies — including the small fraction that may be related to terrorism. Our Community Operations teams around the world — which we are growing by 3,000 people over the next year — work 24 hours a day and in dozens of languages to review these reports and determine the context. This can be incredibly difficult work, and we support these reviewers with onsite counseling and resiliency training.
  • Terrorism and safety specialists: In the past year we’ve also significantly grown our team of counterterrorism specialists. At Facebook, more than 150 people are exclusively or primarily focused on countering terrorism as their core responsibility. This includes academic experts on counterterrorism, former prosecutors, former law enforcement agents and analysts, and engineers. Within this specialist team alone, we speak nearly 30 languages.
  • Real-world threats: We increasingly use AI to identify and remove terrorist content, but computers are not very good at identifying what constitutes a credible threat that merits escalation to law enforcement. We also have a global team that responds within minutes to emergency requests from law enforcement.

Partnering with Others

Working to keep terrorism off Facebook isn’t enough because terrorists can jump from platform to platform. That’s why partnerships with others — including other companies, civil society, researchers and governments — are so crucial.

  • Industry cooperation: In order to more quickly identify and slow the spread of terrorist content online, we joined with Microsoft, Twitter and YouTube six months ago to announce a shared industry database of “hashes” — unique digital fingerprints for photos and videos — for content produced by or in support of terrorist organizations. This collaboration has already proved fruitful, and we hope to add more partners in the future. We are grateful to our partner companies for helping keep Facebook a safe place.
  • Governments: Governments and inter-governmental agencies also have a key role to play in convening and providing expertise that is impossible for companies to develop independently. We have learned much through briefings from agencies in different countries about ISIS and Al Qaeda propaganda mechanisms. We have also participated in and benefited from efforts to support industry collaboration by organizations such as the EU Internet Forum, the Global Coalition Against Daesh, and the UK Home Office.
  • Encryption: We know that terrorists sometimes use encrypted messaging to communicate. Encryption technology has many legitimate uses – from protecting our online banking to keeping our photos safe. It’s also essential for journalists, NGO workers, human rights campaigners and others who need to know their messages will remain secure. Because of the way end-to-end encryption works, we can’t read the contents of individual encrypted messages — but we do provide the information we can in response to valid law enforcement requests, consistent with applicable law and our policies.
  • Counterspeech training: We also believe challenging extremist narratives online is a valuable part of the response to real world extremism. Counterspeech comes in many forms, but at its core these are efforts to prevent people from pursuing a hate-filled, violent life or convincing them to abandon such a life. But counterspeech is only effective if it comes from credible speakers. So we’ve partnered with NGOs and community groups to empower the voices that matter most.
  • Partner programs: We support several major counterspeech programs. For example, last year we worked with the Institute for Strategic Dialogue to launch the Online Civil Courage Initiative, a project that has engaged with more than 100 anti-hate and anti-extremism organizations across Europe. We’ve also worked with Affinis Labs to host hackathons in places like Manila, Dhaka and Jakarta, where community leaders joined forces with tech entrepreneurs to develop innovative solutions to push back against extremism and hate online. And finally, the program we’ve supported with the widest global reach is a student competition organized through the P2P: Facebook Global Digital Challenge. In less than two years, P2P has reached more than 56 million people worldwide through more than 500 anti-hate and extremism campaigns created by more than 5,500 university students in 68 countries.

Our Commitment

We want Facebook to be a hostile place for terrorists. The challenge for online communities is the same as it is for real-world communities: to get better at spotting the early signals before it’s too late. We are absolutely committed to keeping terrorism off our platform, and we’ll continue to share more about this work as it develops.

Read more about our new blog series Hard Questions. We want your input on what other topics we should address — and what we could be doing better. Please send suggestions to hardquestions@fb.com.


Hard Questions

Thu, 06/15/2017 - 14:00

By Elliot Schrage, Vice President for Public Policy and Communications

Today we’re starting something new.

Facebook is where people post pictures with their friends, get their news, form support groups and hold politicians to account. What started out as a way for college students in the United States to stay in touch is now used by nearly 2 billion people around the world. The decisions we make at Facebook affect the way people find out about the world and communicate with their loved ones.

It goes far beyond us. As more and more of our lives extend online, and digital technologies transform how we live, we all face challenging new questions — everything from how best to safeguard personal privacy online to the meaning of free expression to the future of journalism worldwide.

We debate these questions fiercely and freely inside Facebook every day — and with experts from around the world whom we consult for guidance. We take seriously our responsibility — and accountability — for our impact and influence.

We want to broaden that conversation. So today, we’re starting a new effort to talk more openly about some complex subjects. We hope this will be a place not only to explain some of our choices but also explore hard questions, such as:

  • How should platforms approach keeping terrorists from spreading propaganda online?
  • After a person dies, what should happen to their online identity?
  • How aggressively should social media companies monitor and remove controversial posts and images from their platforms? Who gets to decide what’s controversial, especially in a global community with a multitude of cultural norms?
  • Who gets to define what’s false news — and what’s simply controversial political speech?
  • Is social media good for democracy?
  • How can we use data for everyone’s benefit, without undermining people’s trust?
  • How should young internet users be introduced to new ways to express themselves in a safe environment?

As we proceed, we certainly don’t expect everyone to agree with all the choices we make. We don’t always agree internally. We’re also learning over time, and sometimes we get it wrong. But even when you’re skeptical of our choices, we hope these posts give a better sense of how we approach them, and how seriously we take them. And we believe that by becoming more open and accountable, we should be able to make fewer mistakes and correct them faster.

Our first substantive post, later today, will be about responding to the spread of terrorism online — including the ways we’re working with others and using new technology.

We want your input on what other topics we should address — and what we could be doing better. Please send suggestions to hardquestions@fb.com.


Celebrating 30 Years of the GIF

Thu, 06/15/2017 - 05:59

On June 15, we’re celebrating the 30th anniversary of the GIF, which has made communicating on the internet more joyful, more visual and, let’s face it, a whole lot funnier! To mark the big 3-0, we’re:

  • Taking an inside look at GIF popularity on Messenger
  • Announcing that GIFs in comments are now available to everyone on Facebook (yay!)
  • Introducing some new and exclusive GIFs we’ve created featuring some of the internet’s biggest stars
  • Asking you to help us answer the age-old debate of how to pronounce the word “GIF”

An Inside Look at GIFs in Messenger

With this milestone approaching, we took a look at how GIFs have transformed the way people communicate since we introduced them in Messenger in 2015:

  • People on Messenger sent nearly 13 billion GIFs in the last year, or nearly 25,000 GIFs every minute (a quick arithmetic check follows this list)
  • GIF sends on Messenger have tripled in the past year
  • New Year’s Day 2017 was the most popular day ever for GIF sends on Messenger, with more than 400 million GIF sends
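
That per-minute figure follows directly from the annual total. Here is a quick sanity check, in Python and purely illustrative:

    gifs_per_year = 13_000_000_000                  # "nearly 13 billion"
    minutes_per_year = 365 * 24 * 60                # 525,600 minutes in a year
    print(round(gifs_per_year / minutes_per_year))  # 24734, i.e. "nearly 25,000" per minute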

GIFs in Facebook Comments are Finally Here!

We know people love communicating with GIFs on Messenger, and we’re also making it easier to use GIFs on Facebook. Today we’re introducing the ability to add GIFs in comments for everyone on Facebook, globally.

Just tap the GIF button when you go to make a comment, type in what you’re looking to say, and add the GIF that really nails it!

The GIF Party

We’re also celebrating the 30th anniversary the best way we know how: with a GIF party featuring some of your favorite stars.

GIPHY Studios created 20 GIFs featuring some of the internet’s most recognizable faces: DNCE, Logan Paul, Amanda Cerny, DREEZY, Patrick Starr, Violet Benson, Wuz Good, Brandi Marie, and Landon Moss.

Each GIF is a unique, shareable morsel of human expression. The GIFs are available by searching #GIFparty when sharing a GIF on Facebook or Messenger, or by visiting GIPHY.com/Facebook.

[GIFs featured: Logan Paul, Violet Benson, Amanda Cerny, Landon Moss]

Ending an Age-old Debate: How Do You Pronounce GIF?

Finally, we’re looking to settle the debate over how the word GIF is pronounced once and for all. Over the next few days, if you live in the US, you might see a poll on Facebook asking you to cast your vote. You can also vote by visiting Facebook’s official Page on your mobile phone; to find the Page, search for “Facebook” in the main Facebook app.

We’ll report back here on whether the “hard g” or “soft g” pronunciation reigns supreme.


Announcing Updates to Safety Check

Wed, 06/14/2017 - 15:00

By Naomi Gleit, VP Social Good

As part of our ongoing commitment to build a safe community, today we’re announcing several updates to Safety Check:

  • Introducing Fundraisers in Safety Check: people in the US will have the option to start a fundraiser from within Safety Check
  • Expanding Community Help: Community Help will be available on desktop and for all crisis types where Safety Check is activated
  • Adding more context with a personal note: now people can share a personal note in their Safety Check News Feed story with friends and loved ones
  • Introducing crisis descriptions: get more information about a crisis from NC4, our trusted third-party global crisis reporting agency, within the Safety Check tool

Introducing Fundraisers in Safety Check
Following a crisis, one way people give and request help is through fundraising. To make this easier, we are introducing Fundraisers in Safety Check: within the tool, people will be able to create or donate to a fundraiser for charitable and personal causes to help those in need. Fundraising also gives people outside the crisis area a way to offer help. Fundraisers in Safety Check will start to roll out in the coming weeks in the US.

Expanding Community Help
Since we launched Community Help earlier this year on iOS and Android, we have been inspired by the offers and requests for help generated by the community and want to make sure that those in need are able to access Community Help through any platform. Community Help will be available in the upcoming weeks on desktop, giving people another way to access the tool. Additionally, Community Help is now available for all crises where Safety Check is activated.

Adding more context with a personal note
After marking themselves safe, people often share additional information to reassure friends and to provide more context about the crisis. To make this easier, people can now add a personal note telling their friends more about what’s happening from within the Safety Check tool. This note will appear in the News Feed story that is automatically generated when people mark themselves safe.

Introducing crisis descriptions
When people receive Safety Check notifications, they may have limited information about the crisis. To provide additional context and make sure people have the information they need, we have started adding crisis descriptions from NC4, our trusted third-party global crisis reporting agency.

Safety Check has been activated more than 600 times in the past two years and has notified people that their families and friends are safe more than a billion times. Keeping the community safe means everything to us at Facebook, and we hope these updates help Safety Check continue to do just that.


Using Data to Help Communities Recover and Rebuild

Wed, 06/07/2017 - 18:00

By Molly Jackman, Public Policy Research Manager

After a flood, fire, earthquake or other natural disaster, response organizations need accurate information, and every minute counts in saving lives. Yet traditional communication channels are often offline, and it can take significant time and resources to understand where help is most desperately needed.

Facebook can help response organizations paint a more complete picture of where affected people are located so they can determine where resources — like food, water and medical supplies — are needed and where people are out of harm’s way.

Today, we are introducing disaster maps that use aggregated, de-identified Facebook data to help organizations close the critical information gap they often face when responding to natural disasters. Many of these organizations worked with us to identify what data would be most helpful and how it could be put into action in the moments following a disaster.

This initiative is the product of close work with UNICEF, the International Federation of the Red Cross and Red Crescent Societies, the World Food Programme, and other organizations. It is an example of how technology can help keep people safe, one of our five areas of focus as we help build a global community.

Based on these organizations’ feedback, we are providing multiple types of maps during disaster response efforts, which will include aggregated location information people have chosen to share with Facebook.

Location density maps show where people are located before, during and after a disaster. We can compare this information to historical records, like population estimates based on satellite images. Comparing these data sets can help response organizations understand areas impacted by a natural disaster.

Movement maps illustrate patterns of movement between different neighborhoods or cities over a period of several hours. By understanding these patterns, response organizations can better predict where resources will be needed, gain insight into patterns of evacuation, or predict where traffic will be most congested.

Safety Check maps are based on where our community uses Safety Check to notify their friends and family that they are safe during a disaster. We are using this de-identified data in aggregate to show where more or fewer people check in safe, which may help organizations understand where people are most vulnerable and where help is needed.

This type of information can help response organizations understand which neighborhoods suffered the most damage following an earthquake and where people might be in need of help as they evacuate their homes and eventually return.
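
To make the approach concrete, here is a minimal sketch of how aggregated, de-identified location data can become a density map. The grid size, privacy threshold, and sample coordinates are our illustrative assumptions, not Facebook’s actual pipeline:

    from collections import Counter

    GRID_DEG = 0.01   # assumed cell size, roughly 1 km; illustrative only
    MIN_COUNT = 10    # report only well-populated cells so no individual stands out

    def density_map(pings):
        """Aggregate de-identified (lat, lon) pings into grid-cell counts."""
        cells = Counter((round(lat / GRID_DEG), round(lon / GRID_DEG))
                        for lat, lon in pings)
        # De-identification safeguard: suppress sparsely populated cells.
        return {cell: n for cell, n in cells.items() if n >= MIN_COUNT}

    # Hypothetical before/after samples; real inputs would be vastly larger.
    before = density_map([(37.484, -122.148)] * 40 + [(37.452, -122.182)] * 25)
    after = density_map([(37.484, -122.148)] * 12 + [(37.452, -122.182)] * 30)

    # Cells that empty out suggest evacuation; cells that fill up suggest shelter.
    change = {cell: after.get(cell, 0) - n for cell, n in before.items()}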

We are sharing this information with trusted organizations that have the capacity to act on the data and that respect our privacy standards, starting with UNICEF, the International Federation of the Red Cross and Red Crescent Societies, and the World Food Programme. We are working with these organizations to establish formal processes for responsibly sharing the datasets with others.

Over time, we intend to make it possible for additional organizations and governments to participate in this program. All applications will be reviewed carefully by people at Facebook, including those with local expertise.

We believe that our platform is a valuable source of information that can help response organizations serve people more efficiently and effectively. Ultimately, we hope this data helps communities have the information they need to recover and rebuild if disaster strikes.


Making Facebook Live More Accessible With Closed Captions

Tue, 06/06/2017 - 18:45

By Supratik Lahiri, Product Manager, and Jeffrey Wieland, Director of Accessibility

Making Facebook accessible to everyone is a key part of building global community. Today we’re allowing publishers to include closed captions in Facebook Live, helping people who are deaf or hard of hearing to experience live videos. Now, if your captioning settings are turned on, you’ll automatically see closed captions on Live broadcasts when they’re available.

Over the past year, daily watch time for Facebook Live broadcasts has grown by more than 4x, and 1 in 5 Facebook videos is a Live broadcast. By enabling publishers to include closed captions with their Live broadcasts, we hope more people can now participate in the exciting moments that unfold on Live.

Today’s milestone represents the next step in our efforts to make content on Facebook accessible to more people. It’s already possible to add captions to non-live videos when uploading them to Facebook Pages, and publishers can use our speech recognition service to automatically generate captions for videos on their Pages.

More information is available on adding closed captions to Facebook Live broadcasts, on Facebook’s accessibility features and settings, and on news and updates from the Facebook Accessibility team.


Facebook Celebrates Pride Month

Mon, 06/05/2017 - 15:00

By Alex Schultz, VP & Executive Sponsor of pride@facebook

As Pride celebrations begin around the world, Facebook is proud to support our diverse community, including those who have identified themselves on Facebook as gay, lesbian, bisexual, transgender or gender non-conforming. In fact, this year over 12 million people across the globe are part of one of the 76,000 Facebook Groups supporting the LGBTQ community, and more than 1.5 million people plan to participate in one of the more than 7,500 Pride events on Facebook.

This year, we’re excited to unveil more ways than ever before for people to show their pride and support for the LGBTQ community on Facebook:

Update Your Profile Pic with a Rainbow Frame
Throughout the month of June, you might see a message from Facebook in your News Feed wishing you a Happy Pride and inviting you to add a colorful, Pride-themed profile frame. You might also see a special animation on top of your News Feed if you react to that message.

React with Pride
You may see a colorful, limited-edition Pride reaction during Pride Month. Choosing this temporary rainbow reaction is a way to express your Pride in response to a post.


Brighten Up Your Photos
In Facebook Camera, you can find new, colorful Pride-themed masks and frames. Swipe left from News Feed to open the camera, then tap the magic wand to bring up camera effects; you’ll find the Pride effects in the mask and frame categories.


Support an LGBTQ Cause
In the US, start a Facebook Fundraiser or donate to your favorite LGBTQ cause. On Facebook, you can raise money for a nonprofit or for people, whether for yourself, a friend, or someone or something not on Facebook.

Facebook isn’t the only place to celebrate the cause. Across our entire family of apps, you’ll have the opportunity to show your support:

Join the #KindComments Movement on Instagram
The photo-sharing app is committed to fostering a safer and kinder community, and this June it will turn walls in major US cities into colorful beacons of LGBTQ support. You can join the movement by leaving kind comments on posts, and you can also celebrate Pride and get creative with stickers and a rainbow brush.


Frame Up with Pride on Messenger
During Pride Month, you can add some love to your conversations with friends and family using Pride-themed stickers, frames, and effects in the Messenger Camera.

Our Commitment and Participation
Facebook has long been a supporter of LGBTQ rights through our products, our policies and our benefits to employees. Not only will we be part of Pride activities in more than 20 cities around the world, including San Francisco, where we first marched in 2011, but we will also celebrate with our employees by hosting events and discussions and by draping the Facebook monument outside the Menlo Park headquarters in the rainbow flag, as the company has done each year since 2012.

Our commitment to and support of the LGBTQ community have been unwavering. From our support of marriage equality and bullying prevention to the many product experiences we’ve brought to life, we are proud of our attention to the LGBTQ experience on Facebook, thanks in large part to the many LGBTQ people and allies who work here.

Last year, for the first time, we began publicly sharing self-reported data about our LGBTQ community at Facebook. In a recent voluntary survey of our US employees about sexual orientation and gender identity, to which 67% responded, 7% self-identified as lesbian, gay, bisexual, queer, transgender or asexual. We are proud to support the LGBTQ community, and while more work remains, we are eager to be active partners going forward.

Happy Pride!


Update on Trending

Wed, 05/24/2017 - 19:00

By Ali Ahmadi, Product Manager, and John Angelo, Product Designer

Redesigned Trending Results Page

Starting today, we’re introducing a redesigned Trending results page, which is the page you see when you click on a Trending topic to learn more about it.

You’ve always been able to click on a topic to see related posts and stories, but we’ve redesigned the page to make it easier to discover other publications that are covering the story, as well as what your friends and public figures are saying about it.

You’ll be able to see the new results page on iPhone in the US, and we plan to make it available on Android and desktop soon.

Now, when you click on a Trending topic, you’ll see a carousel of stories from other publications about that topic, which you can swipe through. By making it easier to see what other news outlets are saying about each topic, we hope people will feel more informed about the news in their region.

The stories that appear in this section are some of the most popular stories about that topic on Facebook. They are determined the same way as the featured headline: using a combination of factors, including engagement around the article on Facebook, engagement around the publisher overall, and whether other articles link to it.
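
Facebook has not published the exact formula, but a weighted blend of those named factors might look like the sketch below; the field names, weights, and normalization are our assumptions, not the production ranking code:

    from dataclasses import dataclass

    @dataclass
    class Story:
        article_engagement: float    # normalized engagement around the article
        publisher_engagement: float  # normalized engagement around the publisher
        inbound_links: float         # normalized count of articles linking to it

    # Illustrative weights only; the real combination is not public.
    W_ARTICLE, W_PUBLISHER, W_LINKS = 0.6, 0.25, 0.15

    def trending_score(s):
        """Blend the three publicly named factors into a single ranking score."""
        return (W_ARTICLE * s.article_engagement
                + W_PUBLISHER * s.publisher_engagement
                + W_LINKS * s.inbound_links)

    # The highest-scoring stories would lead the carousel.
    stories = [Story(0.9, 0.4, 0.2), Story(0.5, 0.95, 0.7)]
    ranked = sorted(stories, key=trending_score, reverse=True)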

There is no predetermined list of publications eligible to appear in Trending, and this update does not affect how Trending topics are identified, a process we announced earlier this year.

Making Trending Easier to Discover On Mobile

One of the things we regularly hear from people who use Trending is that it can be difficult to find in the Facebook mobile app. We’re soon beginning a test in News Feed that will show people the top three Trending stories, which they can click on to see the full list of Trending topics and explore what people are discussing on Facebook.

While most people will not see Trending in their News Feed as part of this small test, we hope it will help us learn how to make Trending as useful and informative as possible. If you do see the Trending unit in your News Feed, you can remove it via the drop-down menu, which will prevent it from being shown to you in the future.

As before, we continue to listen to feedback about Trending and will keep making improvements to provide a valuable experience.

