
Let’s Get You Up to Speed.
By the end of this UG article, you will understand:
- How evolutionary psychology and fast thinking shape behaviour in plant groups.
- Why gardeners form tribes and how these tribes influence beliefs.
- How social proof replaces scientific reasoning in community spaces.
- Why confident misinformation spreads faster than accurate information.
- How mob dynamics, bullying, and dogpiling appear in gardening conversations.
- How identity politics spill into plant discussions and shut down learning.
- How cult-like behaviour forms around communities or influencers, even without intent.
- Why these forces create the illusion of independent choice in plant purchases.
Got Things to Do? This is For You!
Gardeners often think they choose products based on logic, but social proof and tribal behaviour guide most decisions long before research begins. Online groups reward confidence, simplicity, and repetition, not accuracy. Emotional belonging and fast thinking make gardeners vulnerable to misinformation, mob behaviour, and cult-like group dynamics. Understanding these forces helps you cut through noise and make informed choices.
Let's Dig In...
Most gardeners think their choices come from research, experience, or intuition. People like to believe they freely pick their grow lights, fertilizers, soil amendments, and plant care habits based on what makes sense. The truth is more complicated. The gardening world, especially the online version of it, is shaped by influence patterns that guide decisions long before anyone walks into a garden centre or taps “Add to Cart” on a product.
Plant communities are emotional spaces. They are places of belonging, identity, pride, frustration, and sometimes disappointment. This emotional backdrop activates something older in us. Evolutionary psychology shows that when people feel uncertain or overwhelmed, the fast, automatic part of the brain, sometimes called the lizard brain, takes over. It looks for shortcuts. It seeks safety in numbers. It trusts confidence over complexity. This is why gardeners become unusually vulnerable to social proof, misinformation, and persuasive group dynamics.
When plant owners do not have a clear scientific framework for evaluating technical topics like lighting, grow mix porosity, or fertilizers, they rely on the next most powerful thing. The tribe. The vibe. The confident answer. The repeated claim that feels familiar simply because they have heard it before. Influence researchers like Robert Cialdini, Maria Konnikova, and Kevin Hogan have shown how quickly people default to group beliefs when their emotional brain wants reassurance more than accuracy.
This UG article is the foundation for the series. It explores why gardeners trust the wrong information, why tribal behaviour is so strong in plant spaces, and how a community or influencer can accidentally or intentionally mislead its own members. Along the way, we will examine how cult-like patterns develop in groups that begin as friendly communities but slowly shift into closed belief systems. Understanding these forces will make you a more informed consumer and a more resilient gardener as we move into Parts 2 and 3, where we investigate the marketing strategies behind grow lights and fertilizers. The series may expand, so keep an eye out for its evolution.
The goal is not to shame communities, specific manufacturers, or any specific individuals. The goal is clarity. By understanding the psychology that shapes what gardeners and plant parents believe, you can approach plant care from a place of knowledge instead of social pressure or marketing spin.
Why Gardening Communities Become Tribes
People join plant groups like the PHA because they want connection, inspiration, guidance, and reassurance. Unlike many hobbies, gardening has an emotional core. Plants die. Success feels personal. Failure feels personal. There is identity wrapped up in skill when living things are involved, and belonging wrapped up in shared experience. This emotional landscape creates fertile ground for tribal behaviour, and it reflects something deeper in our biology.
Evolutionary psychology teaches that humans bond in groups for safety. The brain evolved to treat belonging as survival. Disagreement inside the tribe feels dangerous. Novel ideas feel threatening. The older, faster limbic or emotional system that some call the "lizard brain" triggers automatic reactions long before the rational mind has time to evaluate anything.
“Belonging is a fundamental human motivation.” - Baumeister and Leary
Groups form easily when people feel uncertain. Gardening is full of uncertainty. Light levels, moisture, nutrients, pests, and environmental variables are unpredictable. In uncertainty, people do not search for evidence first. They search for reassurance.
Tribes provide reassurance by giving gardeners a sense of shared understanding. When people feel supported, they stop worrying about every decision. But tribes also create blind spots. The desire to maintain harmony makes it harder to question popular beliefs, and this sets the stage for misinformation to become accepted as truth.
How Social Proof Shapes Gardening Beliefs
Social proof is one of the strongest influence mechanisms documented by behavioural researchers like Robert Cialdini. When people are unsure what to do, they look to others for behavioural cues.
In gardening communities, social proof appears everywhere. A fertilizer becomes “good” simply because enough people repeat the claim. A specific grow light becomes “trusted” because it is mentioned often, not because anyone has measured its actual photon delivery (PPFD in µmol/m²/s) or tested its performance at a specific distance from a plant. When popular online personalities (influencers) endorse a product, their followers amplify the message without questioning whether the claim aligns with plant science.
This is how misinformation gains authority. A group repeats the message until it becomes part of the community’s identity. At that point, challenging it feels like an attack, not a correction.
“We view a behaviour as more correct in a given situation to the degree that we see others performing it.” - Robert Cialdini
Social proof shortcuts thinking. The brain chooses the familiar answer instead of the accurate one because familiarity signals safety. Behavioural researchers like Robert Cialdini have shown that people instinctively use the behaviour of others as a guide when they feel uncertain. From an evolutionary standpoint, following the group increased survival, while disagreeing reduced it.
Repetition becomes truth. Familiarity becomes fact. As Maria Konnikova notes, people trust what feels familiar long before they trust what is true. The more often a claim is repeated in a gardening community, the more the emotional brain treats it as reality.
Pro Tip: When almost everyone in a group agrees instantly on a plant care claim, treat it as a signal to investigate, not a sign of accuracy.
The Confidence Problem: Why Uninformed Opinions Spread Faster Than Facts
Gardening groups move quickly. People ask questions because they are stressed or worried about their plants. They want fast answers, simple explanations, and emotional relief.
The brain prefers low effort thinking. As Daniel Kahneman describes, the fast emotional system acts automatically, while the slow logical system requires time and cognitive energy. In plant groups, the fast system takes over.
Confident answers appear knowledgeable, while simple answers appear correct, and reassuring answers appear trustworthy. This creates a perfect environment for misinformation. The group elevates the answers that feel good, not the ones grounded in scientific evidence.
Meanwhile, accurate information is slower. It needs nuance and explanation. It will often contradict a tribe’s favourite beliefs. People interpret this as threatening, so they resist it long before they consider whether it might be true.
“Confidence is often mistaken for competence, but the two rarely travel together.” - Maria Konnikova
This is why misinformation spreads faster across gardening communities than evidence-based guidance. Confident members unintentionally outrun those trying to share scientific explanations.
Group members reward emotional relief more than accuracy.
When New Members Challenge a Science-Based Tribe
A science-based gardening community like the PHA develops its own internal culture. Over time, members learn to value evidence, measurements, consistency, and real-world outcomes. They reward nuance instead of simplicity, test results instead of repetition, and clarity instead of comforting myths. This is a rare kind of tribe because it depends on slow thinking, accountability, and shared standards rather than speed, emotional reassurance, or popularity. From the first day the PHA was conceived, this has been the foundation.
But every thriving group attracts new members. Many arrive from spaces where myth is rewarded over method, where chaotic advice spreads unhindered, and where influence is built through charisma rather than accuracy. These newcomers often bring inherited family myths, social media shortcuts, plant folklore, TikTok tricks, YouTube miracle potions, and emotionally comforting claims that do not hold up under scrutiny.
This creates tension because the competing value systems collide.
Why New Myths Feel Threatening in a Science-Based Group
When a new member introduces a disproven claim, the tribe experiences the same evolutionary trigger described earlier. The fast-thinking system interprets disruption as danger. But in this case, the danger is not the unknown. It is the possibility of regression. A science-based community has invested time and energy in building a stable, evidence-first culture. Anything that threatens that progress activates a protective instinct.
Some members respond with education. Some respond with correction. Some respond with frustration. Some respond with defensiveness. None of these reactions is hostility for its own sake. They are biological responses to perceived group instability.
Influence researchers like Maria Konnikova, Robert Cialdini, and John V. Petrocelli note that when a group has developed shared beliefs and norms, members become guardians of those norms. Any claim that destabilises the shared framework is treated as a threat to cohesion, not simply an intellectual disagreement.
A myth introduced into a scientific space feels like a crack in the foundation.
The group reacts to seal the crack, not to shame the person.
The emotional brain does not distinguish between a social threat and a factual error. To the old neural circuitry, they are the same thing.
This is why correcting misinformation can feel sharp, even when delivered gently. It is why newcomers sometimes interpret correction as hostility. And it is why some respond by resisting, withdrawing, or seeking out others who validate the familiar myths they brought with them.
This dynamic does not mean the community is unwelcoming. It means the community is protecting its core identity, which is built around evidence, clarity, and accountability.
The Lizard Brain Meets the Laboratory Brain
A science-leaning group uses its analytical, slow-thinking system as its identity. When myth reappears, the emotional brain reacts first, because myths are sticky. They are simple, familiar, comforting, and widely repeated.
This creates a psychological collision:
- The newcomer brings familiarity, which the lizard brain likes.
- The group brings accuracy, which requires cognitive effort.
To someone untrained in the practice of logical science-based thinking, the tribe can appear harsh even when it is simply being precise. To the tribe, the newcomer appears careless even when they are simply repeating what they have always heard.
This misunderstanding is where fractures can begin.
Why Correcting Myths Often Backfires
Influence researchers like Petrocelli and Konnikova have shown that confronting false beliefs directly can strengthen them. When a myth is meaningful to the newcomer’s identity or self-worth, correction feels like rejection.
Correcting a myth is not just correcting information; it is challenging the story someone uses to feel competent.
“Beliefs hold people. People do not simply hold beliefs.” - John V. Petrocelli
For newcomers used to communities that reward myth with affirmation and acceptance, a science-based group like the PHA feels unfamiliar and unsafe. The emotional brain often interprets scientific correction as hostility, even when the tone is neutral or even kind.
Why Splinter Groups Form
When enough newcomers or less engaged members feel this discomfort, they often cluster together. They create sub-identities inside the main group, or they leave to find or form entirely new spaces that feel more familiar and emotionally safe.
This is the birth of a splinter group, and what largely drove me to start Plant Hoarders Anonymous.
From an evolutionary standpoint, this behaviour is predictable. People form micro-tribes when the dominant tribe’s norms conflict with their internal beliefs. Simplicity-based tribes and science-leaning tribes rarely blend peacefully for long because they reward opposite things.
A science-leaning tribe rewards accuracy because communities oriented around empirical outcomes tend to self-correct. Studies on scientific communication (Petrocelli, 2021) show that groups with evidence norms naturally filter out unsupported claims.
It rewards nuance because plant biology, lighting physics, nutrient chemistry, and microbial ecology are inherently complex. Oversimplification leads to failure; nuance leads to understanding.
It rewards measurement because PPFD, DLI, EC, pH, and mix porosity cannot be guessed or intuited. Measurement creates shared reality, reducing the influence of personality or social dominance.
It rewards slow thinking (Kahneman’s System 2), which emphasises deliberate reasoning. Groups that rely on slow thinking are more resistant to misinformation and less swayed by charismatic but inaccurate voices.
It rewards humility, a recognised marker of scientific literacy. Research by Tenney et al. (2019) shows that intellectual humility increases accuracy and reduces overconfidence, stabilising group norms.
A myth-based tribe, however, rewards confidence because the brain treats certainty as a proxy for truth. Even incorrect claims become persuasive when delivered with conviction, something Cialdini and Konnikova both document extensively.
It rewards simplicity because the cognitive load is low. Simple rules feel actionable, safe, and true, even when they contradict plant physiology.
It rewards emotional reassurance because humans prioritise relief over correctness when anxious or uncertain, especially in hobbies where failure feels personal, like gardening.
It rewards repetition because repeated claims bypass critical evaluation through the mere-exposure effect. Familiarity becomes a substitute for evidence.
And it rewards fast thinking (Kahneman’s System 1), which favours intuition, narrative, and social cohesion. Fast thinking makes myths feel right and scientific explanations feel abrasive or unnecessary.
For a deeper look at how the mind jumps to conclusions and struggles with careful reasoning, Thinking, Fast and Slow by Daniel Kahneman is a great read. It breaks down the mental habits that shape how we judge information without even realising it.
These reward systems are incompatible. Once separated, each tribe evolves more strongly in its chosen direction. One becomes more scientific, while the other becomes more myth-forward and emotionally driven.
Once a non-science splinter group forms, the smaller group often believes it is the “better” community. They perceive themselves as kinder, more open, more supportive, or more intuitive. Meanwhile, the larger science-based group continues focusing on accuracy, consistency, and measurable outcomes. Both groups feel validated. Both feel unified. Both see themselves as the reasonable one.
This is a classic pattern in group psychology. When people cluster around similar beliefs, those beliefs gain emotional weight. The splinter group reinforces its myths because myth feels familiar and comforting. The evidence-based group reinforces its standards because accuracy feels stabilising. Each group develops its own identity, and identity drives certainty more powerfully than facts.
“People like stories that make them feel certain, even when the truth is uncertain.” - Ryan Holiday
This is how new plant and gardening myth communities form. This is how misinformation recycles, and this is how influencers with myth-friendly messaging gain momentum and popularity. A splinter group built on simple, entertaining narratives becomes fertile ground for confident personalities to rise quickly, because the group rewards emotional reassurance over scientific clarity.
But this splitting is not failure. It is simply how human groups behave when values collide. When tribes diverge in their reward systems, they naturally sort into separate communities that reflect their preferred worldview. From an evolutionary perspective, this is predictable. From a psychological perspective, it is inevitable.
When Gardening Tribes Turn Hostile: Mob Dynamics and Cult-Like Behaviour
Even generally supportive plant and gardening communities like the PHA can develop negative patterns. Under the right conditions, entire tribes or smaller factions within them can shift from helpful to hostile. This happens not because people intend harm, but because ancient social instincts activate in modern environments.
How Neutral Topics Become Identity Flashpoints
Lighting, fertilizer, soil mixes, and pest control should be relatively neutral topics, since the underlying science is well established. Yet in many groups, these discussions can escalate quickly.
If a member challenges a popular 'best-of-breed' product or process, the group as a whole or smaller groups within it can interpret this as a challenge to collective identity. The lizard brain reacts with defensiveness. People push back to protect the tribe, not because the new ideas are necessarily wrong, but often simply because they are different.
Bullying, Dogpiling, and the Punishment of Dissent
Mob behaviour emerges when people feel safer attacking as a group than thinking individually. This kind of disruption was recently on display in the PHA. It reflects both the online disinhibition effect and deep evolutionary threat responses that push people to protect the tribe first and ask questions later.
The Online Disinhibition Effect
The online disinhibition effect describes the way people behave more aggressively, more confidently, or more impulsively online than they would in person. Psychologists like John Suler have shown that when people interact behind screens, several factors combine to lower natural social restraints.
Online, people feel:
- less accountable
- less visible
- less responsible for emotional impact
- more anonymous, even when their name is shown
- more protected by distance
- more aligned with their tribe than their own behaviour
This creates a psychological environment where the emotional brain becomes louder and the reflective brain goes quiet. The lizard brain reacts instantly without the softening influence of in-person social cues like tone, facial expression, body language, or shared space.
Why It Matters in Plant and Gardening Communities
The online disinhibition effect becomes even stronger in plant groups that form around a popular influencer. These communities often develop an emotional culture shaped by the influencer’s tone, confidence, and worldview. Members adopt the influencer’s language, repeat their advice, and defend their authority because the influencer becomes a central part of the group’s identity.
When a fact-based contributor enters a feelings-based collective like this, they are not just challenging information. They are challenging the influencer’s role in the tribe. This creates a volatile psychological environment.
This dynamic explains why:
- corrections escalate faster
- arguments intensify
- myths are defended more fiercely
- newcomers or outsiders are judged more harshly
- dogpiling happens in defence of the influencer
- silence from members is interpreted as agreement with the dominant narrative
- the group treats disagreement as a personal threat to its identity
In influencer-led groups, accuracy matters less than loyalty. The primary social currency becomes emotional alignment with the influencer’s tone and beliefs. So when someone enters with evidence, nuance, or scientific clarity, it disrupts the emotional contract the group has formed.
A fact-based correction is experienced as disloyalty, not information.
When a science-oriented member posts a correction, their tone often appears sharper online than it would in person. Not because they intend hostility, but because online environments remove the behavioural cues that soften disagreement. Meanwhile, members conditioned by the influencer’s confident, simplified advice interpret any deviation as disrespect.
The mismatch is predictable:
The fact-based contributor believes they are helping, while the influencer’s followers believe the contributor is attacking the tribe’s leader.
This is why the response is often disproportionate. It is not about the claim. It becomes about protecting the influencer’s authority.
At the same time, newcomers who rely more on feelings than facts may perceive even gentle correction as harsh. The disinhibition effect distorts tone, while the group’s emotional norms amplify feelings of threat. A factual explanation is interpreted as arrogance. A reference to evidence is read as condescension. A myth correction is framed as a personal insult to the influencer.
This is how tension escalates quickly and why influencer-centric communities often become closed systems. They defend the influencer’s worldview rather than the truth. Correction feels like betrayal. Evidence feels like disruption. Defending myths becomes a form of group loyalty.
(For more research on the online disinhibition effect, see this recent paper: https://pmc.ncbi.nlm.nih.gov/articles/PMC11612148/)
Why the Brain Behaves This Way
Humans evolved to navigate conversations face to face, using micro signals to regulate behaviour. A raised eyebrow, a softer tone, a shift in posture, or a change in proximity all help us read intention and adjust our reactions.
Online, all of those cues vanish. Without that feedback, the brain relies on instinct instead of empathy. People anchor their behaviour to the tribe rather than the individual. Impulsiveness rises. Group loyalty strengthens. Judgement becomes faster. Defensiveness increases. Extreme opinions begin to sound normal simply because others repeat them.
How the Online Disinhibition Effect Fuels Splinter Groups
Disinhibition accelerates the tribal divide. Tension that might have resolved naturally in person becomes amplified online. A newcomer posts a myth. A long-term member corrects it bluntly. The newcomer interprets the correction as hostility. Others jump in. Threads split into factions. Emotional narratives form. Eventually, a subgroup leaves to form a new space where their preferred beliefs face less friction.
This is not failure. It is simply how online tribes behave when their reward systems collide.
How This Connects to Influence Psychology
The online disinhibition effect amplifies every influence principle described by researchers like Robert Cialdini, Maria Konnikova, Kevin Hogan, and John V. Petrocelli. When emotional cues disappear and social restraint drops, the brain becomes more susceptible to fast, instinctive influence triggers. Authority feels stronger. Social proof feels more compelling. Repetition feels more truthful. Familiar claims feel safer than accurate ones.
In this environment, confidence reliably masquerades as competence. A person who speaks with absolute certainty is believed more than someone who presents nuance, context, or scientific explanation. The lizard brain prefers clarity over complexity, even when the clarity is false.
This is the psychological terrain that makes influencer-led groups uniquely fragile. The influencer’s tone sets the emotional climate. Their worldview becomes the default lens through which members interpret information. Followers begin repeating their phrasing, defending their claims, and adopting their logic, even when the influencer’s expertise is thin or inconsistent. Loyalty becomes a currency, and accuracy becomes optional.
Once this pattern takes root, the influencer no longer needs to enforce credibility.
The tribe enforces it for them.
Correcting misinformation is reframed as arrogance.
Nuance is interpreted as negativity.
Evidence is treated as provocation.
The group protects the belief system surrounding the influencer, not the truth itself.
Over time, emotional loyalty replaces critical thinking. Members become more certain of their claims precisely because those claims are repeated within the group, not because they are true. The longer this dynamic persists, the more correction feels like threat and the more dissent feels like betrayal.
This is where ordinary online communities drift into something much more structured and much more psychological.
The Psychological Doorway to Cult-Like Behaviour
Cult-like behaviour does not require manipulation or malicious intent. It emerges naturally when three forces converge, and these forces are even more intense in today’s short-form video landscape.
A confident, central authority figure
Short-form platforms like TikTok reward charisma, certainty, and simplicity. A creator who speaks quickly, confidently, and with strong visual presence is elevated by the algorithm. Their delivery style creates a sense of entertaining expertise long before the audience has time to assess whether the information is accurate. Confidence becomes a proxy for competence.
A group that relies on emotional reassurance over empirical evidence
Video influencers excel at creating parasocial bonds. Their followers feel connected to them, trust their tone, and internalise their explanations because video feels personal. When topics are complex, like grow light physics or fertilizer chemistry, audiences often default to the emotionally comforting narrative rather than the scientifically demanding one. The influencer becomes the source of reassurance, not the science.
An online environment that strips away empathy cues and amplifies instinctive reactions
Short-form platforms intensify fast thinking because their design rewards immediacy rather than reflection. With only a few seconds to process information, the brain defaults to quick, instinctive judgments instead of slow, analytical reasoning. Comment threads favour speed over nuance, and platform features like Duets and Stitches amplify reactive behaviour by encouraging users to respond instantly, publicly, and emotionally. These mechanics reduce cognitive load by rewarding the fastest possible interpretation, not the most accurate one.
At the same time, video-based influencers trigger a form of parasocial bonding. Viewers see their face, hear their voice, and experience repeated micro-interactions that mimic real social contact. Evolutionary psychology shows that the brain treats these cues as markers of trust and familiarity, even when the relationship is entirely one-sided. This makes followers more likely to defend the influencer reflexively because the emotional brain interprets criticism of the influencer as criticism of someone “inside the tribe.”
Without the grounding cues of real conversation such as tone regulation, eye contact, pauses, or shared physical context, the brain operates on instinct. Tribal loyalty strengthens, nuance collapses, and followers begin defending the influencer as though defending the tribe itself. This shift sets the stage for a deeper transformation.
When these elements align, a video-driven community begins functioning less like a learning space and more like a belief system. Members defend the influencer as though the criticism were personal. Group narratives eclipse what can actually be measured or tested, and beliefs become emotionally fortified, resisting outside scrutiny. As this dynamic deepens, loyalty quietly transforms into the gateway for remaining inside the tribe.
Where This Leads: The Predictable Outcomes
When a fact-based contributor enters an influencer-driven, emotionally regulated group, the collision is inevitable. The science-based gardener brings measured, nuanced, evidence-backed information. They speak from slow thinking, accuracy, and clarity. The group, however, receives that information through loyalty, instinct, and emotional alignment. What the contributor experiences as help, the influencer’s followers experience as threat.
From this mismatch, predictable outcomes emerge. The group reacts defensively because identity is activated. Conflict escalates due to the online disinhibition effect, which lowers behavioural restraint and amplifies instinctive responses. Members increase their protection of community myths because defending them functions as a signal of loyalty. Nuance is rejected because nuance feels destabilising inside a belief-driven environment. Dissent is punished because dissent disrupts cohesion. As these tensions build, splinter groups form when differences can no longer coexist inside the same psychological space.
Over time, communities sort themselves according to worldview rather than accuracy. Evidence-based groups tighten their standards, prioritise clarity, and cultivate accountability. Emotion-based influencer groups double down on reassurance, familiarity, and myth retention. Neither trajectory is surprising, and neither represents failure. Both simply reflect how human psychology behaves in digital environments where identity, emotion, and social influence collide.
Identity and Belonging: How Tribal Attachment Shapes Consumer Behaviour
All of these group dynamics do more than shape the conversation. They also shape what people believe they are choosing freely.
Tribes offer safety, but they also demand conformity. When people identify strongly with a group, its beliefs become part of their identity. Disagreement feels personal.
Influence researcher Mario Moussa puts it plainly:
“Groups shape the beliefs of individuals far more than individuals shape the beliefs of groups.” - Mario Moussa
Tribal attachment means gardeners defend products they have never measured, practices they have never tested, and claims they do not understand. The goal shifts from learning to belonging.
Belonging is powerful. But it also creates predictability. Companies count on that predictability.
Emotional Vulnerability and the Need for Certainty
Gardening is emotional. People love their plants and do not want to harm them, and that protective instinct shapes far more of their decision making than they realise. When a plant struggles, or when a gardener feels uncertain about what to do, the lizard brain activates and searches for safety through simplicity. In these moments, the fast-thinking system takes over, prioritising emotional reassurance over analytical evaluation. This is why gardeners become especially vulnerable to:
- overly confident advice that promises certainty where none exists
- simplified rules that feel comforting because they remove complexity
- miracle claims that tap into hope and bypass critical thinking
- “must buy” products that exploit fear of making the wrong choice
- influencers who appear nurturing or knowledgeable, even when their expertise is shallow
When the emotional brain wants relief, confidence feels like competence and repetition feels like truth. The desire to protect a beloved plant merges with the psychological need for certainty, creating a powerful pull toward anything that feels soothing rather than anything that is scientifically accurate. This is one of the core vulnerabilities companies and influencers rely on when shaping gardener behaviour.
"People make decisions based on emotion and justify them with logic after the fact.” - Kevin Hogan
When emotion takes precedence over science, people make choices they believe are rational, but those decisions are actually guided by psychological shortcuts. The brain fills gaps in knowledge with familiar stories, confident voices, and whatever the tribe repeats most often. These shortcuts feel like logic because they reduce anxiety and offer instant clarity, but they lead gardeners toward choices shaped more by instinct and influence than by measurable reality.
Why Group Dynamics Distort Technical Understanding
Grow lights, fertilizers, and grow mix porosity are three of the most complex consumer topics in the gardening world. They require understanding:
- PPFD (Photosynthetic Photon Flux Density): The actual usable light that reaches the leaves, which determines whether a plant grows, stalls, or declines.
- Spectral quality: The distribution of wavelengths that influence leaf expansion, internode spacing, coloration, and photosynthetic efficiency.
- Beam angles: How narrowly or widely a light spreads photons, which affects coverage, intensity drop-off, and hotspot formation.
- DLI (Daily Light Integral): The total amount of light a plant receives over 24 hours, and one of the strongest predictors of growth rate.
- Nutrient solubility: How easily fertilizer compounds dissolve so roots can absorb them, which shifts with pH, temperature, and chemical formulation.
- Ionic uptake: The process roots use to absorb individual ions (like nitrate, potassium, calcium), each with its own transport requirements and limitations.
- Microbial viability: Whether beneficial microbes can actually survive in a given medium; many products marketed to indoor growers contain organisms that die quickly in soilless mixes.
- Grow mix porosity: The ratio of air space to water-holding space in a potting mix, which determines oxygen availability to roots, moisture stability, and the risk of hypoxic stress or rot.
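Of the variables above, DLI is one of the few a gardener can compute directly: multiply the PPFD at the leaf by the seconds of light per day, then divide by one million to convert micromoles to moles. Here is a minimal sketch in Python (the function name and example numbers are illustrative, not from any particular meter or product):

```python
def dli(ppfd_umol_m2_s: float, photoperiod_hours: float) -> float:
    """Daily Light Integral in mol/m^2/day.

    DLI = PPFD (umol/m^2/s) x seconds of light per day / 1,000,000.
    """
    return ppfd_umol_m2_s * photoperiod_hours * 3600 / 1_000_000

# A light delivering 200 umol/m^2/s at the canopy for 12 hours per day:
print(round(dli(200, 12), 2))  # -> 8.64 (mol/m^2/day)
```

Two lights can share the same wattage and marketing claims yet deliver very different DLI at the canopy, which is why a measured number beats a confident anecdote.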
Most gardeners have never been taught these principles, and that is not a failure. It is simply the reality of a hobby built on decades of inherited advice rather than formal education. These topics sit at the intersection of physics, chemistry, botany, and soil science. They do not lend themselves to simple rules or one-size-fits-all advice.
Because the information is complex, people naturally outsource their understanding to the group. But when the group holds incorrect beliefs, those beliefs replicate themselves. A misleading claim about light becomes a rule. A misunderstanding about fertilizer becomes a norm. A myth about potting mix becomes a tradition. Each repetition strengthens the illusion of truth, and over time the group’s collective confidence begins to outweigh the underlying science.
This is how misinformation becomes self-sustaining. It persists not because people are careless, but because the cognitive load of understanding these systems is high and the emotional reward of a simple answer is immediate. The tribe becomes the authority, not the evidence.
Pro Tip: If a claim sounds scientific but does not include measurable variables, treat it with caution.
Truth becomes whatever the tribe can agree on, not what can be tested or verified.
How Echo Chambers Form and Why They Resist Correction
Echo chambers form when repetition is rewarded, dissent is discouraged, and anything that challenges the tribe’s identity is quietly filtered out.
No plant group sets out to mislead its members. The echo chamber builds itself. Familiar answers feel safe. New information feels risky. Over time, the group gravitates toward whatever maintains comfort.
Correction becomes difficult because:
- Unfamiliar facts create discomfort
- Certainty feels safer than ambiguity
- Shared beliefs strengthen belonging
- Challenging the group feels costly
In that environment, the stories that spread are the ones that feel clear and emotionally grounding. Gardening myths take root because they offer certainty, even when the truth remains more complicated.
The Role of Influencers in Shaping Community Beliefs
Influencers rarely start as experts in the plant world. Some started their journey because they got stuck indoors during Covid. Their authority emerges because audiences respond to their confidence, tone, and emotional accessibility. In digital environments, repeated visibility becomes a substitute for competence, and charisma is misread as evidence.
They quickly become the psychological anchor of the group. Even without building a formal membership, the audience functions as their tribe. Followers adopt their language, defend their statements, and challenge critics reflexively. This is the psychological soil in which cult-like loyalty grows, often without any deliberate intent from the influencer.
Maria Konnikova and other influence researchers observe that charisma frequently outweighs data in online environments. Followers end up trusting the personality rather than the evidence.
What Happens When Social Dynamics Replace Science
When emotional belonging and fast thinking take over a plant conversation, the outcomes become surprisingly predictable. Gardeners buy poor performing grow lights not because the lights earn their trust, but because the group endorses them so confidently that the product feels safe. Ineffective fertilizers gain reputations as essentials simply because they are repeated often enough. When those products fail, many gardeners do not question the advice that led them there; they question themselves, assuming they misunderstood something or did not “do it right.”
Accuracy also becomes a threat in this environment. Evidence-backed explanations can feel like an intrusion, something that disrupts the comfort of shared belief. The group reacts defensively, pushing back against nuance because nuance forces slow thinking, and slow thinking is uncomfortable when the tribe has already aligned around an answer. Familiar claims, repeated often enough, begin to feel like truth. The myth becomes easier to accept than the correction.
This is the illusion of choice at work. Gardeners feel as though they are making independent decisions about what to buy, what to trust, and what to believe. But long before they compare products or read a single piece of research, the group has already shaped the boundaries of what seems reasonable, reliable, or “right.” Their decisions are made inside a narrative constructed by tribal behaviour, social proof, and emotional alignment rather than by genuine evaluation.
In other words, the choice feels personal, but the path leading to it was built by the group.
Wrapping It Up
Gardening tribes like the PHA and others are rich with passion, humour, and shared discovery, but they also carry psychological currents that bend the truth in subtle ways. These forces can suppress nuance, reward confidence over competence, and even nudge well-meaning communities toward cult-like patterns without anyone consciously steering them there. None of this happens out of malice. It simply reflects how human psychology operates when identity, emotion, and group belonging intertwine.
Once you see these dynamics, it becomes easier to step outside the emotional current that guides so many gardening decisions. You start to recognise how group beliefs, repetition, and social pressure create an illusion of choice long before product research begins. In Part 2, we move into the world of grow-light marketing, where companies combine persuasive copy, polished imagery, manipulated statistics, and influencer partnerships to sell performance that does not always exist. Influencers play a critical role here, acting as trust shortcuts for overwhelmed consumers and reinforcing claims that often bypass scientific scrutiny entirely.
Pro Tip: Slowing down is one of the most powerful forms of plant care. It protects both your plants and your decision making.
Article Resources & Some Great Books To Add to Your Personal Library

| Author | Book Title | Relevance to This Article |
|---|---|---|
| Daniel Kahneman | 📘 Thinking, Fast and Slow | Explains the relationship between fast intuitive thinking and slow analytical reasoning, a foundational concept behind why gardeners rely on tribal cues and familiar myths. |
| Robert Cialdini | 📘 Influence: The Psychology of Persuasion | Documents core influence triggers like social proof and authority, both of which explain how inaccurate gardening advice becomes accepted as fact. |
| Maria Konnikova | 📘 The Confidence Game | Demonstrates how confidence easily overrides competence, supporting the article’s argument that charismatic influencers often appear credible even when their claims lack evidence. |
| Kevin Hogan | 📘 Invisible Influence | Explores emotional decision-making and subconscious persuasion, illustrating why gardeners follow advice that feels soothing rather than advice grounded in plant science. |
| John V. Petrocelli | 📘 The Life-Changing Science of Detecting Bullshit | Provides tools for evaluating vague or misleading claims, aligning with the article’s emphasis on why gardeners struggle to filter misinformation inside myth-forward groups. |
| Ryan Holiday | 📘 Trust Me, I'm Lying | Explains how repetition, simple narratives, and attention-driven platforms manufacture certainty, paralleling how plant myths spread quickly in influencer-led communities. |
| Mario Moussa | 📘 Composure and the Art of Influence | Describes how group identity shapes individual beliefs, supporting the argument that tribal behaviour in plant groups overrides measurable evidence. |
The book icon links to Amazon if you are interested in buying any of the books in the table. Any of the books in the picture are equally excellent if you really want to understand the psychology of how we are manipulated. I have many more great books on human and evolutionary psychology that would also be relevant, but these are the most applicable to this article and to how you are made to think you have choice in your purchase and lifestyle decisions. Your decisions are not yours.