What happens when you report something on Facebook
It's easy to click the "report" button on Facebook, but what exactly happens once you do? The social network's safety team has a handy-dandy breakdown.
In a post on the Facebook Safety page, the company explains that it has multiple teams dedicated to handling reports made by users, 24/7:
Hundreds of Facebook employees are in offices throughout the world to ensure that a team of Facebookers are handling reports at all times. For instance, when the User Operations team in Menlo Park is finishing up for the day, their counterparts in Hyderabad are just beginning their work keeping our site and users safe.
There are four types of teams which review reports — the Safety team, the Hate and Harassment team, the Access team, and the Abusive Content team. The cited reason for a report determines which of the teams will see it. "For example, if you are reporting content that you believe contains graphic violence, the Safety Team will review and assess the report," the blog post offers.
If one of the teams' members finds that reported content violates Facebook's policies, then he or she can remove the content, warn the user who posted it, revoke a user's ability to share particular types of content, disable certain features for a user, completely disable a Facebook account or escalate an issue to law enforcement.
Alternatively, if content does not violate Facebook's policies, the social network offers ways for users to directly communicate "to better resolve their issues beyond simply blocking or unfriending another user."
You can take a peek at the image at the beginning of this post to see exactly how Facebook directs issues reported by users — if the image is making you squint, you can take a closer look at Facebook's reporting guide here.
Want more tech news, silly puns, or amusing links? You'll get plenty of all three if you keep up with Rosa Golijan, the writer of this post, by following her on Twitter, subscribing to her Facebook posts, or circling her on Google+.
What Happens When You Report Someone on Facebook?
Understand the possible actions Facebook might take in response to reported account violations.
Facebook provides a platform for people to connect and exchange ideas and experiences. The platform has also implemented community standards to ensure a secure and respectful environment for every user, and if someone violates these standards, you can report them. This article will discuss the process of reporting, the reasons behind it, and what happens when you report someone on Facebook.
Reporting individuals on Facebook is crucial for maintaining a safe and respectful online community. To contribute to this effort, it helps to understand both the reasons for reporting and its consequences. This article walks through both so you can be a responsible social media user.
For What Reasons Can You Report Someone on Facebook?
Facebook has community standards that outline the types of content that are not allowed on the platform. If you come across any content that violates these standards, you can report it to Facebook. Here are some reasons why you may want to report someone on Facebook:
- Harassment and Bullying
- Hate Speech
- Nudity and Sexual Content
- Violence and Graphic Content
Before getting to the consequences, you first need to know how to file a report. After that, we will explore what happens when you report someone on Facebook.
How to Report Someone on Facebook?
Reporting someone on Facebook is a straightforward process. Here’s how to do it:
1. Go to the desired Facebook profile that violates Facebook’s community standards.
2. Click on the three-dot icon at the top right corner of the profile page.
3. Select Find Support or report.
4. Choose the desired reason for reporting the content from the options provided.
Note : You can also add additional details to help Facebook better understand the issue.
5. Click on the Report profile option to submit it for review.
6. Lastly, click on Submit .
Once you submit your report, Facebook’s team will review it and take appropriate action based on their community standards.
How to Report Someone on Facebook Mobile
To report an account on Facebook from your phone, follow the steps below:
Method 1: Reporting a Profile
If you wish to report a profile or post for violating Facebook's community guidelines, follow these steps:
1. Open Facebook and go to the profile of the person you want to report.
2. Tap on the three-dot icon next to the message bar.
3. Scroll down and tap on Report profile.
4. Select the reason for reporting from the given options.
5. Click on Submit.
What Happens After You Report Someone on Facebook?
Once you report someone on Facebook, the platform’s team will review your report and take appropriate action. Here are some possible outcomes:
- No Action Taken : Facebook’s team may review the content and determine that it does not violate their community standards. In this case, they will not take any action.
- Content is Removed : If Facebook’s team determines that the content violates their community standards, they will remove it from the platform. The person who posted the content may also receive a warning or be temporarily or permanently banned from the platform, depending on the severity of the violation.
- Account is Disabled : If someone repeatedly violates Facebook’s community standards, their FB account may get disabled. This means they will no longer be able to access their account or use Facebook’s services.
- Legal Action : In some cases, the violation may be severe enough to warrant legal action. Facebook may work with law enforcement agencies to identify the person responsible for the breach.
It’s important to note that Facebook’s team does not disclose the outcome of a report to the person who reported the content. This protects the privacy of both the reporter and the person who posted the content.
Through this guide, we hope you have learned what happens when you report someone on Facebook. By reporting inappropriate content or other violations on this platform, you contribute to its efforts to enforce community standards and ensure user safety. Share your remarks in the comments section, and don’t forget to come back for more interesting articles!
About The Author
Pete Mitchell
How to report a problem on Facebook
You need to get bad stuff offline.
By Barbara Krasnoff , a reviews editor who manages how-tos. She’s worked as an editor and writer for almost 40 years. Previously, she was a senior reviews editor for Computerworld.
Although you may not want to add to the trauma of the moderation staff who spend their days monitoring the worst of Facebook, disturbing and inappropriate content gets posted to the social network each and every day, and the company’s automated software is woefully ill-equipped to find it and even worse at properly evaluating it. So how do you, as a user, report a post, message, photo, or other item on Facebook that you believe should be pulled?
Well, the first place to go is Facebook’s own page discussing how to report a violation, rather vaguely titled “Report Something.” The page begins, “We’re sorry you’re having a bad experience on Facebook, and we want to help.” The guide has some broad suggestions on how to handle various situations and tools to get the process started, such as how to let the company know that someone is using Facebook to harass, spread hate speech, or otherwise violate its myriad rules and content policies.
Report on hate speech or other problem posts
To report something that you feel goes against Facebook’s Community Standards , the process is relatively simple:
- Click on the three dots to the right of the post.
- Choose “Give feedback on this post.”
- Select the reason you’re reporting the post. These include nudity, violence, harassment, terrorism, and hate speech, among others. Click Send. (You may be asked for more details in the next screen.)
You should get a response within a reasonable amount of time, either just thanking you for your feedback (in which case, probably nothing was done) or stating the action that was taken. It can be effective: I once reported a post for hate speech (along with several other Facebook users) and received a notice the next day saying that the post had been removed.
If you haven’t heard anything, you can check the status of your reports by going to the Support Inbox.
Report on an impersonation
Someone who wants to get at you for political, personal, or other reasons might use Facebook to impersonate you by creating an account with a name similar to yours and trying to make people believe it’s authentic. If somebody is masquerading as you on the platform, you can report that specifically to Facebook:
- Go to the profile page of the person who is impersonating you.
- Click on the three dots near the top right of the profile page and select “Give feedback or report this profile.”
- Choose the reason you are reporting the profile. Choices include “Pretending to be Someone” and “Fake Name,” among others.
There are, of course, times when the situation is too serious to simply send a message to Facebook and hope it handles it. Whenever you send a report to Facebook, a message at the bottom of the report box asks you to contact law enforcement — for example, if someone is in immediate danger, or if the image is of a child being abused.
Report an ad
This is actually something I’ve taken advantage of several times — not because an ad was egregiously offensive, but because it was highly irritating or had repeated on my timeline so many times I couldn’t bear to look at it again.
If you don’t want to leave the service entirely, your best option is, according to Facebook, to report it:

- Click on the three dots at the top right of the ad.
- Click on “Hide ad” if you just don’t want to see it again or “Report ad” if it’s seriously problematic. The former will immediately get rid of the ad; the latter will ask you to say if the ad is offensive, a scam, or from a political candidate (who presumably you don’t want to hear from).
If you see something, say something
There are a number of other things you can do if you’re bothered by an obnoxious Facebook post or user. If you click on the three dots to the top right of a post, you can hide that post, hide the person from your timeline (or just snooze them for 30 days), or unfollow them. If you click on the three dots on a person’s profile, you can block them. And if things just get too annoying to deal with, you can always simply delete Facebook .
But under the presumption that you don’t want to leave Facebook entirely, your best option if confronted with something nasty is, according to Facebook, to report it. While you won’t be making the moderators’ lives any easier, you might prevent other Facebook users from having their days ruined.
How to report someone on Facebook
Published on November 7, 2023
Facebook is an excellent way to keep in touch with friends and family, get involved with local communities, and share common likes and interests. Unfortunately, your Facebook timeline can also show quite the opposite, with plenty of hateful posts on the platform. If you've come across offensive posts, we'll guide you through the steps to report the person who posted them. If you're considering reporting someone, you might also want to block them to ensure they no longer appear on your Facebook feed.
QUICK ANSWER
To report someone on Facebook, go to their profile page and tap on the three horizontal dots button towards the top. Select Find support or report, select your issue, and confirm your report.
JUMP TO KEY SECTIONS
- Who can you report on Facebook?
- How to report someone on Facebook web
- How to report someone on the Facebook app
You can report anyone on Facebook by going to their profile page. The person doesn’t have to be on your friend list. You can also report individual posts, photos, videos, messages, pages, groups, events, comments, and Facebook ads. Facebook might remove the reported issue if it doesn’t match the company’s community standards.
Here are some steps you can follow to report someone on the Facebook website:
- Go to their profile page.
- Click on the three horizontal dots icon on the right of the page near the top.
- Select Find support or report.
You can select from a list of different issues, including Hate speech, Nudity or Sexual Content, Violence, Harassment, Unauthorized sales, and Pretending to be something. You can also select Something else if your reason isn't on the list.
To report someone using the mobile app:
- Go to their profile.
- Tap on the three horizontal dots icon next to the Message button.
- Select Report profile.
- Choose from the available list of issues or pick Something else .
How do I report a fake account?

Click on the three horizontal dots icon on the right of their profile page. Go to Find support or report and select Pretending to be something.

How do I report harassment?

Click on the three horizontal dots icon on the right of their profile page. Go to Report profile and select Harassment.

Will the other person know I reported them?

No, the other person will not know that you have reported them unless it's related to copyright infringement.

What if Facebook doesn't remove the profile or post?

Facebook might not remove a person's profile or post after you report them. If you still see the person's profile, you have the option to block or unfriend them. If you don't want to see their posts, tap on the horizontal dots icon to the right of the post and click on Snooze (person) for 30 days.

Can I cancel a report?

Yes, if you accidentally report a person, you can cancel it. Click your profile picture at the top right corner and go to Help and support > Support Inbox. You will see a list of your reports here. Click on Cancel report. Note that Facebook will only allow you to cancel if it has not yet been reviewed.

Can I report a friend?

Yes, you can go on their profile, tap on the three horizontal dots icon, then choose Report profile and select the problem you want to report. Alternatively, you can also unfriend or block the person if you no longer want to be connected.

How do I report a page?

Go to the page, tap or click on the three horizontal dots, and then select Report profile.

Can I report impersonation if I don't have a Facebook account?

If you suspect that someone is impersonating you or someone you are responsible for on Facebook and you do not have an account, you can report them using this form. You will be asked to upload your ID and provide the imposter's profile information.
How to report a comment on Facebook
Don't worry if you see a concerning Facebook comment. Here's how to report a bad comment on Facebook.
Social media usage is not without its challenges. One common issue users face is dealing with inappropriate and offensive comments on their own posts or those of others. While Meta, Facebook's parent company, pledges to keep its platforms (Facebook, Instagram, Threads) clean of hate and harmful content, it can't individually address every single issue, given the billions of users on those platforms. That's where the Facebook reporting system comes in.
Facebook, being one of the most popular social platforms, provides users with the ability to report comments that violate its community standards. In this guide, we’ll walk you through the process of how to report a comment on Facebook.
When to report a comment on Facebook
When you come across a comment that is offensive, spammy or violates Facebook’s community standards, it’s important to identify it clearly. Ensure that you understand the context and nature of the comment before taking any action.
- Look for comments that contain hate speech, harassment, or graphic content.
- Consider the intent of the comment—is it meant to incite violence or harm?
- Verify if the comment violates Facebook’s policies by referring to their community standards.
A step-by-step guide to reporting a comment on Facebook:
Follow these simple steps to report a comment on Facebook:
1. Open Facebook and identify the comment you want to report
Find the comment you want to report. It may be on your post or someone else’s.
2. Tap the three dots and a menu will pop up
Click on the three dots (…) located to the right of the comment. This will open a dropdown menu.
3. Find and tap the report comment option in the menu
From the dropdown menu, select “Find support or report comment.” By following these steps, you’ll initiate the reporting process.
4. Choose an appropriate reason before submitting report
Facebook offers a range of reporting reasons to choose from. Select the one that best fits the situation. Common reporting reasons include:
1. Bullying or harassment
2. Hate speech or hate symbols
3. Graphic violence
4. Spam or misleading content

Choose the reason that accurately reflects the violation to ensure appropriate action is taken.
5. Provide additional details if necessary
To enhance the effectiveness of your report, Facebook allows you to provide additional context and information about the comment. This step is crucial, as it helps Facebook’s moderation team understand the situation better. In a brief sentence or two, explain why the comment is offensive or inappropriate. If possible, attach screenshots that highlight the comment and its context. The more detailed your report, the easier it will be for Facebook to assess and take action.
6. Submit your report to Facebook
Once you’ve selected the appropriate reason and provided additional details, it’s time to submit your report. Click on the “Next” or “Submit” button, depending on the interface.
7. Wait for a confirmation message from Facebook
You may receive a confirmation message that your report has been submitted. Facebook keeps your identity confidential during the reporting process. Congratulations, you’ve successfully reported the inappropriate comment!
These steps are for the web version of Facebook. The steps to report a comment from your phone (Android or iPhone) are similar: long-press a comment to see the pop-up menu options and proceed from there.
Facebook reporting options:
Since the inception of these reporting options, Facebook has evolved and over the years added more options to it. These are the reporting options currently available on Facebook under which you can report a certain Facebook comment.
- Eating disorder
- Race or ethnicity
- National origin
- Religious affiliation
- Social caste
- Sexual orientation
- Sex or gender identity
- Disability or disease
- Something else
- Suicide or self-injury
- Child abuse
- False information
- Unauthorized sales
- Hate speech
- Something Else
Reviewing Facebook’s response
After you’ve submitted the report, Facebook’s moderation team will review the comment in question. They will assess whether it violates their community standards. You can expect one of the following outcomes:
- Comment Removal: If the comment violates guidelines, Facebook will remove it.
- No Violation Found: If the comment is deemed within the community standards, no action will be taken. Be patient as the moderation team takes the necessary steps.
To be honest, most of the time it just doesn’t work. But if done right, you can expect a positive response to your report. It’s important to report a particular comment in the right category to increase your chances of being heard.
Also, if the comment you’re reporting has your name tagged in it, you can remove yourself from the tagged comment on Facebook.
How long does it take for Facebook to respond to a reported comment?
Response times can vary, but Facebook generally responds within 24 to 48 hours. However, responses might take longer during peak periods.
Can I report a comment anonymously?
Absolutely. Facebook keeps your identity confidential throughout the reporting process.
What if I mistakenly reported a comment?
Don’t worry. If you realize that your report was in error, you can contact Facebook’s support to rectify the situation.
How do I report multiple comments at once?
Unfortunately, Facebook doesn’t offer a feature to report multiple comments simultaneously. You’ll need to report each comment individually.
Will the person who posted the comment know that I reported it?
No, Facebook maintains your anonymity when you report a comment.
What if Facebook doesn’t take action against the reported comment?
If you believe that Facebook’s decision was incorrect, you can appeal their decision or provide additional information for reconsideration.
Can’t report comments on Facebook?
If you can’t report a comment on Facebook, please check your internet connection. Or it may be because you’re reporting too much and Facebook has limited this feature for your account.
Can I unreport a comment on Facebook?
Yes, but the undo button is only available briefly once you’ve reported the comment. After that, the pop-up goes away and you won’t be able to take your report back.
So, this is how you can report a comment on Facebook. If someone’s abusing, harassing, or even threatening you in the comments, that’s how you can get them what they deserve: a ban from Facebook. You can learn more about Facebook comments on noobspace.
Muhammad Abdullah
Abdullah, aka "abdugeek," graduated in computer science and is a certified Growth Hacker. He loves writing about technology, such as gadgets, apps, social media, Mac, Android, and Windows. Abdullah has 8 years of experience with technology gadgets, apps, software, hardware, and information technology. He loves playing with gadgets, especially Mac, iPhone, Android, Windows, drones, radios, smartphone accessories, and any other device he gets his hands on. Abdullah also has versatile experience in the digital marketing world and an entrepreneurial mindset. He founded multiple digital startups in the technology sector and constantly seeks new opportunities. Abdullah is also into movies, especially the Marvel Cinematic Universe and DC franchises. You can follow him on social media to learn more about what he shares with the world.
The Facebook Community Standards outline what is and isn't allowed on Facebook.
Introduction
Every day, people use Facebook to share their experiences, connect with friends and family, and build communities. It’s a service for more than 2 billion people to freely express themselves across countries and cultures and in dozens of languages.
Meta recognizes how important it is for Facebook to be a place where people feel empowered to communicate, and we take our role seriously in keeping abuse off the service. That’s why we developed standards for what is and isn’t allowed on Facebook.
These standards are based on feedback from people and the advice of experts in fields like technology, public safety and human rights. To ensure everyone’s voice is valued, we take great care to create standards that include different views and beliefs, especially from people and communities that might otherwise be overlooked or marginalized.
Please note that the US English version of the Community Standards reflects the most up to date set of the policies and should be used as the primary document.
Our commitment to voice
The goal of our Community Standards is to create a place for expression and give people a voice. Meta wants people to be able to talk openly about the issues that matter to them, whether through written comments, photos, music, or other artistic mediums, even if some may disagree or find them objectionable. In some cases, we allow content—which would otherwise go against our standards—if it’s newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments. In other cases, we may remove content that uses ambiguous or implicit language when additional context allows us to reasonably understand that the content goes against our standards.
Our commitment to expression is paramount, but we recognize the internet creates new and increased opportunities for abuse. For these reasons, when we limit expression, we do it in service of one or more of the following values:
AUTHENTICITY
We want to make sure the content people see on Facebook is authentic. We believe that authenticity creates a better environment for sharing, and that’s why we don’t want people using Facebook to misrepresent who they are or what they’re doing.
SAFETY

We’re committed to making Facebook a safe place. We remove content that could contribute to a risk of harm to the physical security of persons. Content that threatens people has the potential to intimidate, exclude or silence others and isn’t allowed on Facebook.

PRIVACY

We’re committed to protecting personal privacy and information. Privacy gives people the freedom to be themselves, choose how and when to share on Facebook and connect more easily.

DIGNITY

We believe that all people are equal in dignity and rights. We expect that people will respect the dignity of others and not harass or degrade others.
Community Standards
Our Community Standards apply to everyone, all around the world, and to all types of content, including AI-generated content.
Each section of our Community Standards starts with a “Policy Rationale” that sets out the aims of the policy followed by specific policy lines that outline:
- Content that's not allowed; and
- Content that requires additional information or context to enforce, content that is allowed with a warning screen, or content that is allowed but can only be viewed by adults aged 18 and older.
The standards are grouped into sections covering violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests and decisions, along with details on enforcement.
4 Reasons Why Managers Fail
By Swagatam Basu, Atrijit Das, Vitorio Bretas, and Jonah Shepp
Nearly half of all managers report buckling under the stress of their role and struggling to deliver.
Gartner research has found that managers today are accountable for 51% more responsibilities than they can effectively manage, and they’re starting to buckle under the pressure: 54% are suffering from work-induced stress and fatigue, and 44% are struggling to provide personalized support to their direct reports. Ultimately, one in five managers said that, given the choice, they would prefer not to be people managers. Further analysis found that 48% of managers are at risk of failure based on two criteria: 1) inconsistency in current performance and 2) lack of confidence in their ability to lead the team to future success. This article identifies four predictors of manager failure and offers suggestions for how organizations can address them.
The job of the manager has become unmanageable. Organizations are becoming flatter every year. The average manager’s number of direct reports has increased by 2.8 times over the last six years, according to Gartner research. In the past few years alone, many managers have had to make a series of pivots — from moving to remote work to overseeing hybrid teams to implementing return-to-office mandates.
- Swagatam Basu is senior director of research in the Gartner HR practice and has spent nearly a decade researching leader and manager effectiveness. His work spans additional HR topics including learning and development, employee experience and recruiting. Swagatam specializes in research involving extensive quantitative analysis, structured and unstructured data mining and predictive modeling.
- Atrijit Das is a senior specialist, quantitative analytics and data science, in the Gartner HR practice. He drives data-based research that produces actionable insights on core HR topics including performance management, learning and development, and change management.
- Vitorio Bretas is a director in the Gartner HR practice, supporting HR executives in the execution of their most critical business strategies. He focuses primarily on leader and manager effectiveness and recruiting. Vitorio helps organizations get the most from their talent acquisition and leader effectiveness initiatives.
- Jonah Shepp is a senior principal, research in the Gartner HR practice. He edits the Gartner HR Leaders Monthly journal, covering HR best practices on topics ranging from talent acquisition and leadership to total rewards and the future of work. An accomplished writer and editor, his work has appeared in numerous publications, including New York Magazine, Politico Magazine, GQ, and Slate.
No, Dubai’s Floods Weren’t Caused by Cloud Seeding
By Amit Katwala
Dubai is underwater. Heavy storms have caused flash flooding across the United Arab Emirates, leading to shocking scenes circulating on social media: cars abandoned by the roadside, planes sloshing through flooded runways. Hundreds of flights have been canceled at Dubai’s busy international airport, and at least 18 people have died in neighboring Oman.
News reports and social media posts were quick to point the blame at cloud seeding. The UAE has a long-running program for trying to squeeze more rain out of the clouds that pass over the normally arid region—it has a team of pilots who spray salt particles into passing storms to encourage more water to form. Some positioned the floods as a cautionary tale: here’s what happens when you mess with nature. Even Bloomberg reported that cloud seeding had worsened the flooding.
The truth is more complicated. I’ve spent the past few months reporting on cloud seeding in the UAE for an upcoming WIRED feature, and while it’s true that the UAE has been running cloud seeding missions this week—it performs more than 300 a year—it’s a stretch to say that it was responsible for the floods. (In fact, as we were preparing this story for publication on Wednesday morning, the UAE’s National Center for Meteorology told CNBC it had not seeded any clouds before the storm struck on Tuesday.)
TikTok content
This content can also be viewed on the site it originates from.
There are a few reasons for this. First: Even the most optimistic assessments of cloud seeding say that it can increase rainfall by a maximum of 25 percent annually. In other words, it would have rained anyway, and if cloud seeding did have an impact, it would have been to only slightly increase the amount of precipitation that fell. The jury is still out on the effectiveness of cloud seeding in warm climates, and even if it does work, cloud seeding can’t produce rain out of thin air; it can only enhance what’s already in the sky.
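To see what that 25 percent ceiling implies, here is a back-of-envelope sketch (not a meteorological model; the storm total is a hypothetical number chosen for illustration):

```python
# Even at the optimistic 25% ceiling, seeding only scales rain that
# would have fallen anyway -- it cannot create rain from nothing.
def max_seeded_rainfall(natural_mm: float, boost: float = 0.25) -> float:
    """Upper-bound rainfall (mm) if seeding adds at most `boost` extra."""
    return natural_mm * (1 + boost)

natural = 120.0  # hypothetical storm total in mm
print(max_seeded_rainfall(natural))  # prints 150.0 -- still mostly natural rain
```

The point of the arithmetic: even under the most generous assumption, four-fifths of the water that fell would have fallen with no seeding at all.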
Second, seeding operations tend to take place in the east of the country, far from more populated areas like Dubai. This is largely because of restrictions on air traffic, and it means any seeding particles were unlikely to still be active by the time the storms reached Dubai. Most of the scientists I’ve spoken to say cloud seeding has a very small, localized effect and is unlikely to cause flooding in other areas. But perhaps the best evidence that cloud seeding wasn’t involved in these floods is the fact that it rained all over the region. Oman didn’t do any cloud seeding, but it was even more badly affected by flooding, with a number of casualties.
It’s exciting to point the finger at a scary technology, but the real cause of the flooding is likely more banal: Dubai is comically ill-equipped to deal with rainfall. The city has expanded rapidly over the past few decades, with little attention paid to infrastructure like storm drains that could help it deal with a sudden influx of water. It’s largely concrete and glass, and there’s very little green space to soak up rainfall. The result is chaos whenever it rains—though to be fair, most cities would struggle to deal with a year’s worth of rain falling in 12 hours.
However, climate change may also be playing a role. As the planet heats up, the complex weather dynamics of the region are shifting and changing in ways that may bring more violent storms. City planners around the world are trying to make their cities “spongier” to help deal with flash flooding and save more water for drier parts of the year. Instead of using cloud seeding to turn the sky into a sponge, Dubai would be better off trying to turn the city into one.
What caused Dubai floods? Experts cite climate change, not cloud seeding
Reporting by Alexander Cornwell; editing by Maha El Dahan and Alexandra Hudson
To get the link for a Facebook profile, Page, group or event: Enter the name of the profile, Page, group or event in the search box at the top of any page on Facebook, and click the search icon. Click on the name of the profile, Page, group or event you're trying to report. Copy the link (URL) found in your browser's address bar.
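If you are collecting copied links in bulk (say, for a moderation spreadsheet), a quick sanity check that an address-bar URL actually points at facebook.com can save bad paste jobs. A minimal sketch; the helper name and hostname check are our own, not a Facebook tool:

```python
from urllib.parse import urlparse

def is_facebook_link(url: str) -> bool:
    """Rough check that a copied address-bar URL points at facebook.com."""
    host = urlparse(url).netloc.lower()
    return host == "facebook.com" or host.endswith(".facebook.com")

print(is_facebook_link("https://www.facebook.com/somegroup"))  # True
print(is_facebook_link("https://example.com/facebook.com"))    # False
```

Note that matching on the parsed hostname, rather than searching the whole string, avoids false positives like the second example, where "facebook.com" appears only in the path.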
Learn more about how to report something that goes against Facebook's Community Standards (for example: nudity, hate speech, threats) in the Facebook Help Center.
Use the links below to give us feedback about how a Facebook feature works or to let us know how we can improve the Help Center: Report a problem to let us know if something isn't working. Feedback for the Help Center. Feedback from the people who use Facebook has helped us redesign our products, improve our policies and fix technical problems.
See when we take action on your report and the decision we made. Request Review of our decision. For some types of content, you can't request a review, but we're adding more options.
The best way to report abusive content or spam on Facebook is by using the Report link near the content itself. Below are some examples of how you can report content to us. Learn more about reporting abuse. If you don't have an account or can't see the content that you'd like to report (e.g. someone blocked you), learn what you can do.
How to Report Things. The best way to report abusive content or spam on Instagram is by using the Report link near the content itself. You can also report a post or profile on Instagram. Below are examples of how you can report content to us. Learn more about how to report abuse.
You can cancel or check the status of a report you've made to Facebook.
Here's how to do it:
1. Go to the Facebook profile that violates Facebook's Community Standards.
2. Click the three-dot icon at the top right of the profile.
3. Select Find support or report.
4. Choose the reason for reporting the content from the options provided.
Go to the profile page of the person who is impersonating you. Click on the three dots near the top right of the profile page and select "Give feedback or report this profile." Choose the ...
To report a public page or group, click the ellipsis icon next to the Share icon and select Report Group. At the next window, choose the specific reason for reporting the page, such as Hate Speech ...
To report someone using the mobile app: Go to their profile. Tap on the three horizontal dots icon next to the Message button. Select Report profile. Choose from the available list of issues or ...
See less. Your account should represent you, and only you should have access to your account. If someone gains access to your account, or creates an account to pretend to be you or someone else, we want to help. We also encourage you to let us know about accounts that represent fake or fictional people, pets, celebrities or organizations.
For mobile users using the Facebook app:
1. Open the Facebook app and navigate to the Page you wish to report.
2. Tap the three-dot menu icon below the cover photo.
3. Select Find Support or Report Page.
4. Select the reason that best fits your complaint.
5. Tap Next at the bottom of your screen once you are done.
A step-by-step guide to reporting a comment on Facebook:
1. Open Facebook and find the comment you want to report.
2. Tap the three dots; a menu will pop up.
3. Tap the Report comment option in the menu.
4. Choose an appropriate reason before submitting the report.
Learn why your content may have been removed from Facebook.
If you think your account was suspended by mistake, you may be able to appeal the decision by logging in and following the on-screen instructions.