By Bear Howard
From time to time, I read the online community forum Nextdoor. Out of curiosity, I even took the time to review all of its guidelines—both for ordinary members and for the moderators who have the power to judge posts and comments. I have noticed that some familiar voices on the platform have suddenly disappeared. As it turns out, they had been kicked off. I’ve also seen posts with animated and colorful debates removed. That led me to dig deeper. I wanted to learn more about Nextdoor—where it originated, how it operates, and where it might be headed. Here’s what I found. (And don’t miss the comment at the end of this article on changes coming to the platform this summer.)
Nextdoor – Going Behind the Curtain

In an era when the social contract increasingly plays out through screens, the idea of a “digital front porch” seems inviting—a space to meet your neighbors, borrow a ladder, post a lost dog, or sound the alarm on a break-in. It’s also a modern-day “soapbox.” Nextdoor, founded in 2008, built itself around that promise. It isn’t just another social network; it presents itself as a community utility. Unlike Facebook or Reddit, it stakes its legitimacy on trust, requiring verified real names and addresses.
This identity model and its hyperlocal design set it apart as a civic-oriented platform. But over time, cracks have appeared—not only in the structure, but in the foundation. These cracks have raised important and increasingly urgent questions: Is Nextdoor truly a platform for fair and constructive community discourse? Or is it a digital mirror reflecting the same biases, power imbalances, and opacity that plague broader society?
This essay examines that question by blending platform analysis, user experience, and the findings of professional journalists who have taken a deeper, more disciplined look at Nextdoor. The goal is not to vilify, nor to absolve—but to engage in an honest reckoning, one that any platform aspiring to mediate the public square must eventually face.
Nextdoor’s core model is built on trust—real people, verified homes, and neighborhoods defined by geography rather than ideology. On paper, this makes it a dream platform for civic engagement. Need a plumber? Post. Lost dog? Alert the neighbors. Worried about a suspicious car? Share the photo. Want to organize a block party, alert residents to a city council meeting, or raise money for a local nonprofit? It’s all possible.
The platform has implemented tools to minimize toxicity. Google’s Perspective API and built-in “Kindness Reminders” prompt users before posting something potentially harmful. The moderation system includes nearly 300,000 volunteer moderators as of 2024. According to Nextdoor’s transparency reports, about 90 percent of reported content is reviewed by a human within six hours. These numbers suggest an impressive infrastructure that supports both automation and human judgment at scale.
But statistics don’t always reflect lived experience. A closer look reveals growing discontent over how moderation decisions are made—and who is making them.
At the heart of Nextdoor’s fairness debate is its moderation model. Unlike other social platforms that hire trained professionals, Nextdoor relies on local residents—so-called “Leads” and “Review Team” volunteers—to moderate their neighbors’ posts. These individuals are not trained journalists or civic officials. They are simply your neighbors. That sounds democratic, but in practice, it means that decisions about what is appropriate, or offensive, are deeply subjective.
The platform gives these neighborhood moderators broad powers but little oversight. There’s no mechanism for residents to vote moderators in or out. In some cases, users can’t even tell who the moderators are. And unlike traditional public institutions, there’s no clear appeals process, no ombudsman, and no outside review body.
Stories of arbitrary censorship abound. I read about a resident in Santa Clara, California, who reported being banned in 2020 after posting support for Black Lives Matter. That same year, journalists revealed that while posts expressing support for BLM were removed, racially charged content warning about “suspicious” people—usually people of color—was often allowed to remain. The company later acknowledged mishandling these cases and promised reforms. But the episode highlighted a deeper truth: Nextdoor was reflecting, and in some cases amplifying, societal biases under the guise of neighborly safety.
Journalists have not ignored this tension. In a 2025 report, the Associated Press described Nextdoor as having once been a “magnet for racists and cranks.” The platform had to remove its “Forward to Police” feature because it was disproportionately used to report people of color for trivial behaviors like walking a dog or sitting in a car. The AP noted that Nextdoor’s recent redesign—adding over 3,500 local news providers into the platform—was a clear effort to shift away from reactive moderation toward a more proactive, civically grounded content model.
That shift matters. Professional journalism operates under a distinct set of norms: verification, context, and fairness. It doesn’t just repeat what people say—it interrogates it. In bringing journalists into the fold, Nextdoor seems to be acknowledging that self-policing by volunteers, even well-meaning ones, is insufficient for true fairness. The platform now relies not just on neighbors, but on professionals, to stabilize the digital public square it created.
Academic research has backed up these observations. A 2024 study in APA Open found that user perceptions of fairness were critical to long-term engagement. When users believe they’ve been treated unfairly—regardless of whether rules were technically followed—they are more likely to disengage, retaliate, or abandon the platform altogether. In other words, a moderation policy is only as good as its public credibility.
Online forums like Reddit and neighborhood blogs offer further insight. Many users describe Nextdoor’s moderation as capricious, ideologically motivated, or just plain confusing. Complaints range from the trivial—posts about traffic tickets being deleted—to the serious—systematic suppression of minority viewpoints. The inability to form alternative “versions” of the same neighborhood or select different moderators means that unhappy users often have no recourse but to leave. This means the Sedona Nextdoor you’ve got is the Sedona Nextdoor you’ll get.
Meanwhile, Nextdoor continues to frame itself as a force for good, citing low rates of harmful content and quick removal times. But as any seasoned journalist or civic advocate will tell you, absence of visible harm doesn’t mean harm hasn’t occurred. Bias can live in silence just as easily as in speech.
Nextdoor was never designed to be a neutral space. It was designed to be a local space. But localism brings with it the full weight of local bias, parochialism, and fear. What one neighborhood sees as vigilance, another might see as profiling. What one moderator calls inappropriate, another might consider civic activism. In this ambiguity, power thrives—quietly, invisibly, and often without accountability.
There is a bitter irony in all this. The very elements that make Nextdoor unique—its reliance on real identities, its local focus, its neighbor-led governance—also make it fragile. Without formal structures for fairness, transparency, and appeal, the platform risks becoming a digital HOA: full of rules, devoid of justice.
None of this means that Nextdoor is without value. The platform has enabled disaster response, revived lost pets, connected volunteers with the elderly, and informed residents of civic meetings and local initiatives. These are real and meaningful contributions. But if Nextdoor wants to be more than just a glorified bulletin board, it must invest in democratic infrastructure.
This means allowing users to vote for or review moderators. It means publishing data not just about what gets removed, but about who makes the decision and why. It means creating independent review boards—like editorial ombudsmen or citizen panels—to resolve disputes. It means embracing journalism not just as a content source, but as an ethical framework for truth and fairness.
The goal of any civic platform should not be perfection, but accountability. Nextdoor, like any institution claiming to support democratic life, must earn and re-earn the public’s trust. That requires more than kindness tips and automated filters. It requires structure, honesty, and above all, transparency.
Fairness online is not a feature—it’s a practice. And unless that practice is made visible, participatory, and reviewable, the promise of digital community will remain just that: a promise, never fulfilled.
After reviewing the multiple sources analyzing Nextdoor that I used in this article, I wanted to better understand the changes currently being implemented under the leadership of CEO Nirav Tolia, who returned to lead the company earlier this year. Tolia, who founded Nextdoor in 2008, appears focused on addressing both the platform’s long-standing profitability challenges and the persistent criticisms surrounding its community dynamics. The redesign now reportedly underway is aimed at creating daily value for users and fostering consistent engagement—much like Facebook or Instagram. However, I believe Nextdoor’s transformation goes beyond simply boosting ad revenue. It’s also about restoring user trust, curbing the chaos of inconsistent moderation, and improving the platform’s image as a fair and reliable digital space for neighborhoods.
What follows is a summary of the changes my research uncovered. I did not write this description myself; I’m simply passing along the most coherent explanation I found that outlines what’s changing and why these changes matter.
The future of Nextdoor is shifting from a chaotic neighborhood message board into a structured, daily-use civic tool—more like a local utility than a free-for-all forum. Instead of relying on self-appointed moderators and unpredictable neighbor commentary, the redesigned platform centers around curated news, safety alerts, and AI-powered local recommendations. These features are designed to make Nextdoor a habit—something residents check daily for reliable updates, trustworthy tips, and useful services—while minimizing the drama and dysfunction that have plagued the platform since its early days.
By reshaping how content flows and reducing the influence of rogue moderators or agenda-driven users, Nextdoor aims to rebuild trust and become a true companion to everyday neighborhood life. The shift is subtle but profound: less confrontation, more coordination; less personal judgment, more platform-guided clarity. It’s a move from passive posting and reactive moderation toward a proactive, civically minded design that invites users back for what matters—connection, not conflict.
Nirav Tolia (CEO) has stated that the goal is to turn Nextdoor into the “essential companion for neighborhood life,” not a place people visit just to complain or argue.
If you, or someone you know, were banned from any local Nextdoor, including Sedona Nextdoor, you can comment after my article about what you think caused the “ban.”
As part of my research, I am very interested in hearing from the public on their experience with Nextdoor.
If a post of yours was taken down for a reason you disagreed with, comment as well. If you have opinions on the Nextdoor platform itself, comment. That way, my report can be educational while also giving those who believe they have been “banned” and treated unfairly a place to express their opinion. This request is open to anyone, not only to people who have engaged with Sedona’s Nextdoor.
Editor’s Note: The opinions expressed above are not those of Sedona.biz or its editors and publisher. They are solely the opinions of the author. Sedona.biz welcomes community input and is a cyber platform for voices in our community that are otherwise not heard.
12 Comments
I was banned almost 4 years ago for supporting Home Rule. They just blocked me for life with no explanation. I was able to get on several weeks ago and received a report from Nextdoor that I had received over 3,000 comments and views. Good job—within an hour, I was cut off again. Nextdoor is hyper-nationalism at its worst. The group decides what their nation or area looks like, or should look like, demographically and economically, and that becomes their new nation. They push back against anything that might change it. Progressive statements supporting change are something you’ll see very rarely on Nextdoor. It’s basically the land of the NIMBYs—not in my backyard.
Fantastic work, Bear! Nextdoor is the epitome of hypocrisy and biased censorship! They block and ban all anti-MAGA and anti-Trump postings but sure as hell allow Biden and Obama bashing, anti-migrant and homeless hate and hysteria, suggested animal cruelty, and much more. It’s a great place for MAGA sycophants who come on here and whine, bitch, and cry about people using their First Amendment rights, while also whining and crying that they’re not permitted to use violent rhetoric against people with whom they disagree. I’ve been banned from ND for simply correcting a pro-MAGA, pro-brown-people-deportation cultist on the Federal statutes governing immigration and the Posse Comitatus Act. I didn’t name-call or use any language other than what the US Code states for each. I am curious how many other people have been blocked and banned for similar excuses made up by the local monitors. I’d love to initiate a class-action suit against those monitors for violating their own policies, and especially for their biased censorship! So if you have experienced similar censorship, let me know, and if we have the numbers I will seek out a pro bono attorney who has successfully sued ND for the same or similar nonsense.
MAGA wants rights that protect them and their POV while stripping everyone else of the same rights and more. ND is no different than Fox News’ news parody or any other Russian propaganda network. They only want one narrative, one opinion, and anyone who disagrees gets blocked and banned.
Both sides of the aisle have a place and a voice on here, thanks to Tommy’s adherence to the Constitution and dedication to the First Amendment. And I personally welcome the counterpoints, no matter how factually insufficient or how harmful their propaganda may be. I have countered and will counter each and every lie with fact and truth; they despise this and feel only their POV should be permitted. Fortunately, Tommy disagrees and allows both sides to have a platform. That’s the way the First Amendment works! Not by blocking and banning because the truth hurts fragile sensibilities!
I’m a moderator on Nextdoor. To use a word used in this article…yours is a parochial view. Do all the MAGA you want in a group. You probably tried to post to the main feed, or placed a comment on a post that was in the main feed. People want the farthest reach possible and aren’t willing to relegate their comments to a group with a small number of people in it. So you violate the guidelines, your stuff gets taken down, and you get all mad about it. That’s not what Nextdoor is for. It’s not a public platform. You have to play by the rules or go back to Facebook.
Your rules are NOT evenly enforced! Don’t even attempt to say they are, because that’s absolute nonsense and you know it. Like you say, you can MAGA it up in a group! What you cannot do is comment negatively about anything MAGA whatsoever! The same is NOT true for those who comment negatively on anything and anyone NOT MAGA-affiliated.
Yes, Steve, I lasted all of 24 hours, as I was banned for explaining what Home Rule does and means. It is a completely useless platform that is more about spreading lies, being deceitful, and being disrespectful—all while I was just trying to educate people!
Off the top of my head, speed bumps in our roundabouts, a tunnel through Thunder Mountain. I know there were 8 or 9.
I’m a moderator in Sedona. There are issues I’ve seen both with moderation and with the mistakes people make when interpreting moderator actions.
First, the guidelines are fairly clear. Some stuff is subjective, as this article correctly identifies. These are things both we moderators and the platform should work on together. But I make the case below that it’s the humans using the tech who need to change first.
Second, some basic info: It almost always takes more than a one-vote margin to remove something (e.g., 5 remove and 4 keep results in the post being kept). It usually takes 4+ moderators voting on something; if only 3 vote, then after enough time those 3 moderators might cause something to be removed. These are my best guesses based on experience—I have no empirical data. So if there is bias, that means at least 3 moderators had the same view, justified or not. That’s something I’d be more inclined to call groupthink.
— Not A Public Platform – Free Speech Standards Don’t Apply —
1) Nextdoor is a private platform, not a public square. While speech on Nextdoor is largely unrestricted, people are not free to communicate in a tone that is mean, uncaring of the individual, slanderous, or violent, or that violates other guidelines. You should have read the rules when you signed up. If you don’t like it, Facebook is probably where you should be posting; it has fewer rules and no goal of civility (as evidenced by a lack of enforcement in most cases).
— Strict Adherence To Local-Only Topics *In Main Feed* —
2) There is very strict adherence to guidelines. Take, for example, non-local topics in the main feed. Politics, and even religion, can be discussed in groups, which people can opt into but don’t otherwise see. This is healthy. It keeps the main feed from exploding with contentious topics. People can’t help themselves because they want the farthest reach possible for their messages. That causes, I’d guess, 80% of the disgruntlement of those complaining about Nextdoor.
But for much of what violates this rule, people errantly conclude the removal is moderator bias. With few exceptions in the grey area, moderators are just following the guidelines; we don’t have the option of voting contrary to a guideline. I’ll admit some posts get away with it, but I don’t see this being rampant in my experience.
— No National/World References —
3) Both automated and human reports flag the use of national issues or persons on the national stage (e.g., candidates) as examples to make a point about a local topic. I find that if you reference the behavior of a political figure, a federal agency, or an entire federal administration, it often gets killed, UNLESS IT IS IN A GROUP. I don’t agree with this at all, but if you feel your content is being taken down, do an honest assessment. Many legitimate non-local references get taken down.
For example, I was discussing with someone how lawyers by nature will exploit vague language to further aims (regarding the initiative to protect the Western Gateway property as a park) and cited as an example how we had to listen to arguments by Bill Clinton about what the definition of “is” is. That comment got killed. I see that repeatedly when referencing national political figures, or the policies of US administrations to make a point (again, within the context of a local topic). I can’t really call that bias but rather an overt sensitivity to keywords representative of nation or world level topics.
— Bias Of Violations In The Grey Area —
4) The first bias I notice in the voting patterns of moderators is to vote down posts and comments that don’t violate the guidelines but are in a grey area where the tone is just off enough (uncivil or unkind) to justify voting it down. I call this the “my ears are burning” vote. Here the point being made is valid, but the tone is snarky or recriminating but does not directly name anyone. I call this kind of written behavior “being addicted to the juice of invective”. Those that do this just blindly blow past the “friendliness reminder” Nextdoor displays when it detects unkindness. I have to admit to this fault from time to time. I’m trying to get into a habit of doing a final scrub for tone before I hit send. I’d bet doing that consistently would resolve almost all of the hits under this category.
Some moderators, I believe, are keen to take out anything smelling of invective. But sometimes this can be taken too far, especially when a politician or candidate is the subject of discussion. Candidates should always be open to higher scrutiny, if done respectfully. This bias is often seen during run-ups to government votes—city, county, state—with federal issues most of the time getting removed.
This tendency was writ large two Sedona city council elections ago, when a populist candidate was running for mayor. I couldn’t believe the number of takedowns of things I felt didn’t technically violate the guidelines. This led many to call out bias, which I felt was justified at the time. Fortunately, the last election wasn’t nearly as contentious; not much was removed.
There should be, in my opinion, more clarification about communications that could be taken as mean-spirited but represent a valid point in a discussion. The solution, again, is a scrub for tone before hitting send. We should all check ourselves, and encourage/nudge a select few to be nicer to each other.
— Sedona Is Almost Brutal To “Racists and Cranks” —
5) Regarding the removal of the “Forward to Police” feature, and racists and cranks in general, I think Sedona is doing it well, maybe to a fault. I’ve seen very vociferous comments to someone exhibiting racism or fear mongering when reporting suspicious activity. You’d better bring your A Game to Sedona Nextdoor if you’re going to suggest someone is doing something untoward, like approaching a house in an odd way.
— Minority Viewpoints Do Get Ganged Up On By Moderators —
6) This is, from what I’ve seen as a moderator, something that happens. Fortunately, it is somewhat rare and not persistent. Again, tone is everything here. I have found there is always a way to say something, if you deeply consider how your tone is perceived by others.
RE: Voting for or reviewing moderators: the role is fairly simple if applied fairly. If a moderator falls under the eagle eye of Nextdoor staff, a review of votes occurs. I’ve had this happen to me—a few votes Nextdoor thought should have gone the other way were highlighted. That’s about as good as it’s going to get. Community members are not going to moderate FOR FREE under the level of scrutiny being proposed here. They’re just not. It would become a high-pressure job that no one would sign up for. The combination of no pay and little actual power makes this idea untenable.
Fairness online is indeed a practice. It has to start with the people making posts and comments. The moderators are there to cover the cases that slip through the cracks. If people are complaining about the moderators, that means WAY TOO MANY THINGS are coming to the moderators. Issues with moderation are just the tip of the iceberg. What’s below the water is the human nature that creates things requiring moderation—no system is going to get around that.
So we keep the baby and keep swapping out the bathwater as often as we can. We need to eliminate myopia (parochialism was the word used here). We can nip away at margins using technology, but we need to open our hearts more to really move the needle.
Final Note: Discussing moderation is not allowed under Nextdoor’s guidelines. We can continue the conversation here, or in the group I created (Sedona members) on Nextdoor:
Nextdoor Moderation Policies and Algorithms
https://nextdoor.com/g/6zau43vuj/?is=groups_section_rhr
That’s a great theory, but not how Sedona ND moderators function. It is very clear from what is permitted in the way of negative, hateful rhetoric toward migrants and non-MAGA voters—and what is not permitted in the way of any comments to the contrary.
What’s preventing a handful of pro MAGA moderators from conspiring against non MAGA neighbors? Your rag is full of knee jerk reactions made by racist elder folks who think the Sinaloa Cartels are turning around in their driveways or trying to sell them solar.
People on ND openly say negative crap about Harris, Biden, Clinton, Obama, blah blah blah! But the moment anyone counters their little propaganda machine—either by asking the poster to prove their statement true or by posting facts that disprove their claims—that comment disappears, when in reality anyone with the skill set to perform a Google search would see the garbage being permitted to be posted. You’re the equivalent of a local version of the National Enquirer, where nonsense thrives and truth goes to die!
Four white, male, Boomer keyboard warriors moaning that a social media site moderated/policed by racists, N.I.M.B.Y.s, “Karens” and other Boomer keyboard warriors kicked them off?
Between this post and the three comments, the title should be a paraphrase of that Grandpa Simpson meme: “Four Old Men Yell at Cloud.”
And a snarky comment such as this would easily be reported and removed from Nextdoor.
And therein lies the problem with Nextdoor. They dish it out but cannot take it! A bunch of selectively enforced First Amendment hypocrites is what they are.
I’m hardly a Boomer (wrong generation), and though I am considered white, I’m not a hate-filled racist buffoon who hates others because people tell me to do so. That’s how racism and hatred spread, until ignited like tinder in a powder room. Local Nextdoor posters openly support the mass deportations of people based upon the color of their skin, under the very thinly veiled lie that they are all violent criminals—which is utterly ridiculous considering the numbers and the numerous reports of wrongful, unconstitutional violations of the rights of those deported in cases of misidentification. There’s an entire ICE/DHS cheer section that cheers families separated, lives destroyed, and the blitzkrieg-like raids by masked, nameless, badge-numberless supposed Federal Agents. Anything said on behalf of the people who end up being victims gets removed—like those people who were allegedly illegally hired and mistreated by the Colt Grill owners (two of whom are white). Those people were “collateral arrests” and have been deported. Think about that: “collateral arrests!” So now, technically, Federal, State, and Local officers can arrest anyone within proximity to a suspected crime or criminal suspect.
This is no different from how the Stasi and KGB operated with impunity in Eastern Europe during the Cold War, where censorship was the norm and hateful revenge reporting of neighbors and co-workers for made-up hysterical nonsense was routine.
At least one can post whatever they wish on here, so long as they’re willing to tolerate the wrath of truth and fact-finding.
Being a biased racist bigot is not generation-specific, although there are far more elderly racists than younger racists. But racist hatred is taught and learned, family member after family member and racist acquaintance after racist acquaintance. People grow up hearing racially charged fascist nonsense. Some grow up and realize the ignorance and futility of a life of hatred, while others embrace it and spread its lies and ugliness, just like our current POTUS enjoys doing. Those who embrace it usually hide it away like a teenager confused about their sexuality, unable to explain it in logical terms. And those who embrace a life of racial and ethnic hatred of others are some of the biggest cowards on the planet. They either act in groups, ambush, or simply motivate others to do their dirty work for them because they are physically and mentally deficient.