Panel discussion on the effects of zero-search and bot traffic in communities

Together with Raquel Winkler from OpenText and Wilfried Rijsemus from 3sides, we kicked off the first Community Leaders Forum Meetup of 2026 with an open panel discussion, in which community managers explored how AI has impacted their communities, most notably through the rise of zero-search and bot traffic.
2024 was a landmark year for generative AI: where are we now?
In 2024, OpenAI launched GPT-4o to the public, and Google introduced AI Overviews in the US, triggering a major transformation in search. Initial concerns centred on SEO, but over time Community Managers also saw a wave of changes within their communities. Answers became instant, and people no longer had to explore further. The metrics that had been used for years stopped making sense: overall traffic decreased significantly while bot traffic sharply increased, and communicating the value of having a community to senior management became difficult. During this panel discussion, it became clear that many of these questions and concerns still live among Community Managers, while some have already begun implementing successful mitigation strategies and reframing metrics and reasoning for a 'post-AI' era.
Traffic in communities: humans vs. bots
Raquel from OpenText kicked off the conversation by sharing that traffic in the OpenText community has not necessarily decreased. The community is technical and hyper-specific, and people still seek the full context, conversations, and human interaction. While this is one possible scenario, other Community Managers reported different trends, such as a significant increase in bot traffic, or an overall traffic decrease of around 15-20% with a higher proportion of the remaining traffic coming from bots. This is especially true for transactional communities, where people stop by to get answers rather than to find a sense of belonging, although belonging may still be an inadvertent result.
What makes a bot bad? Do we mitigate them or accept it as the new normal?
The commonality across all scenarios is that the proportion of bot traffic is higher. This is particularly challenging because bots play a dual role. On one side are bots that crawl to facilitate spam or act maliciously; figuring out how to stop them is time-consuming, and they are clearly bad and disruptive to communities. On the other hand, some bots help content get surfaced and cited in AI Overviews, increasing visibility and reach.
While this type of bot traffic is beneficial, it also creates new responsibilities for Community Managers. Outdated or incorrect information may continue to be scraped and fed into Large Language Models, contributing to “zero-search” AI Overviews. Because these sources are publicly linked, outdated content can reflect poorly on the community, potentially harming its credibility and trust. As a result, Community Managers are now expected to actively audit, review, and keep all community content up to date.
Bots are inevitably part of a new normal, and this new reality needs to be accepted. However, there are mitigation strategies to reduce the number of bad bots, such as:
- Gating content behind a login
- Making use of SSO
- Adding an 'I am human' button
- Using products like Cloudflare, Akamai, and Imperva to identify and manage bot traffic, rather than relying solely on the community platform vendor to spot abnormalities
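As a complement to vendor tooling, a first step is often just separating known crawlers from human visitors in your own access logs. Below is a minimal sketch of that idea, classifying requests by User-Agent substring. The signature list is illustrative and incomplete (real bot management relies on far more signals than the User-Agent), and the helper names are my own, not from the panel.

```python
# Minimal sketch: separate known AI/search crawlers from human traffic
# using User-Agent substrings. The signature list below is a small,
# illustrative sample, not a complete registry.

KNOWN_BOT_SIGNATURES = [
    "GPTBot",            # OpenAI's web crawler
    "ClaudeBot",         # Anthropic's web crawler
    "Google-Extended",   # Google's AI training crawler token
    "CCBot",             # Common Crawl
    "bingbot",           # Bing's search crawler
]

def classify_request(user_agent: str) -> str:
    """Label a request 'bot' or 'human' based on its User-Agent string."""
    ua = (user_agent or "").lower()
    if any(sig.lower() in ua for sig in KNOWN_BOT_SIGNATURES):
        return "bot"
    return "human"

def bot_share(user_agents: list[str]) -> float:
    """Fraction of requests classified as bots (0.0 if no requests)."""
    if not user_agents:
        return 0.0
    bots = sum(1 for ua in user_agents if classify_request(ua) == "bot")
    return bots / len(user_agents)
```

Tracking `bot_share` over time gives you a human-only traffic number to report alongside raw page views.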
Communicating the value of having a Community
As traditional metrics lose their relevance and bot traffic increases, it has become progressively more difficult to communicate the value of a community to senior management. This is especially true for community platforms whose pricing depends on the number of API calls. When growth is driven by bots rather than humans, API calls spike unpredictably and the proportion of bot traffic rises, so having a community can start to look like an unpredictable cost driver rather than a success.
The Good News
Luckily, the value of community has not gone up in smoke because of AI. Communities are no longer traffic engines, but they have become trusted sources of lived experience, knowledge, and context. These are things AI systems rely on but cannot create on their own. AI can’t replicate real conversations, discussions, and other dynamics of human interaction. There is a lot of value in seeing multiple members confirm, correct, or update a topic over time.
To communicate the value of communities:
- Success metrics need to be redefined around impact and trust, as communities still help customers through AI Overviews.
- Activities like webinars, giveaways, and competitions can be set up to get people involved, which helps prevent the community from serving only as a hit-and-run stop for information.
- Expectations need to be reset: communities may not always drive or increase traffic, but they build trust, involvement, and engagement.
What a Community Manager can do today
The takeaways of this session are best practices that you, as a Community Manager, can start implementing today:
- Look at your current metrics, redefine them, and reset expectations around community traffic
- Remove outdated content from your community to avoid it surfacing in AI Overviews
- Implement mitigation strategies like SSO, 'I am human' buttons, gating community content behind a login, or protection services like Imperva or Cloudflare to manage bot traffic
- Set up a flagging system for significant traffic increases so you can spot them early and apply mitigation strategies
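The flagging-system takeaway can be sketched in a few lines: flag any day whose traffic exceeds a multiple of the trailing average. The window size and threshold factor below are illustrative assumptions to tune against your own baseline, and the function name is my own.

```python
# Minimal sketch of a traffic-spike flagging system: compare each day's
# visits against the trailing average and flag days that exceed it by a
# configurable factor. Window and factor are assumptions to tune.

from statistics import mean

def flag_spikes(daily_visits: list[int],
                window: int = 7,
                factor: float = 2.0) -> list[int]:
    """Return indices of days whose visits exceed `factor` x the
    average of the preceding `window` days."""
    flagged = []
    for i in range(window, len(daily_visits)):
        baseline = mean(daily_visits[i - window:i])
        if baseline > 0 and daily_visits[i] > factor * baseline:
            flagged.append(i)
    return flagged
```

Wired to a daily export of community visits, a non-empty result can trigger an alert so mitigation starts before the month-end report.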
If topics like these interest you and you're not yet part of the Community Leaders Forum, sign up here.






