We are living through a digital land grab.
by Helen Andromedon, Digital Security Trainer & Safe Sisters Co-Founder
Data brokers—those shadowy companies you didn’t consent to but who know your birthday, your shopping habits, your neighborhood, and maybe even your trauma—are thriving.
Corporations scrape, sell, and resell our personal data with little transparency and even less accountability. Yes, I said it. We are living through a digital land grab. And they’re just getting started. Artificial intelligence has entered the chat—and it’s hungry for fuel. That fuel is your life.
So pick up your pen and get out your notebook, sis, because you're going to want to keep these notes analogue.
As someone who has trained women across East Africa in digital security through initiatives like Safe Sisters, I’ve seen firsthand how deeply vulnerable we are when we don’t understand what’s being collected, how it’s used, and who profits.
Now, as AI systems explode across every sector—from healthcare and banking to online dating and hiring—we need to have a serious conversation about what it means to be safe, visible, and sovereign in the digital world.
Let’s break this down:
Data brokers are companies that collect, analyze, and trade information about you—often without your direct knowledge.
Some, like Acxiom or LexisNexis, are well-known in corporate and legal circles. Others are obscure. But they all do the same thing: extract behavioral data, repackage it, and sell it to marketers, political operatives, insurance companies, and yes—you guessed it, AI developers.
As outlined by my peers at the Electronic Frontier Foundation, many of these brokers aren’t even bothering to register with state regulators. Why? Because most states don’t require them to. In recent years, California, Texas, Oregon, and Vermont have passed data broker registration laws that require brokers to identify themselves. Fact check your state here!
[Yes, that's a Reddit link, because no one else has made this directory yet. Please, someone, go on!]
This means that while you scroll, shop, and search, hundreds of companies are tagging your behavior and aggregating your digital exhaust into profiles that may influence what loans you’re offered, what jobs you’re shown, or whether an algorithm deems you “risky.” These are the modern surveillance pipelines—private, profit-driven, and often invisible.
As the Brennan Center notes, this unchecked data collection has huge implications for civil liberties. Law enforcement agencies have purchased data from brokers to sidestep warrant requirements. Political campaigns use it to target people based on race, religion, or immigration status. And AI models feed on this data to “learn” human behavior—biases and all.
If you think this is just about cookies or ad targeting, think again. This is about power.
Data grabs now replace land grabs. And again, they are just getting started.
What we once thought was just tiny print next to equally tiny checkboxes, the price of 'free' access to social media, messaging, and other platforms, has become a waterfall of our data flowing freely to these dealers, who sell what simply does not belong to them!
AI systems, especially large language models (LLMs) like ChatGPT, rely on massive training datasets. These datasets are scraped from the internet, purchased from brokers, or synthesized from previously collected user data. The problems—yes, there are many: much of this data was never truly “consented to” in the first place. And what has been grabbed from articles, publications, studies, journals, and historical records—even if it had been politely asked for, which it wasn't—was largely written, informed, and structured by men, who, interestingly, make up a bit less than half of humanity.
As a UX researcher, I’ve seen how fast tech companies move to deploy AI tools—often skipping over basic questions about consent, transparency, and harm.
The Center for Internet Security warns that “AI can amplify existing risks” if not designed with transparency and ethics at its core. From phishing scams powered by generative AI to facial recognition systems that misidentify women and people of color, the implications aren’t theoretical—they’re already here.
And here’s the kicker: most of us have little to no ability to opt out.
You don’t need to have a smart speaker in your home or an Alexa listening in your kitchen to be vulnerable. If you’ve ever used a rewards card at a grocery store, taken a Buzzfeed quiz, or looked up health symptoms online, you’ve probably left data breadcrumbs. The question is: how do we sweep them up—and stop feeding the machine?
This is where the Safe Sisters approach comes in. Developed for women across Africa navigating hostile online spaces, this training model focuses on reclaiming digital agency—not through fear, but through skill-building, community, and consent.
Here are three concrete steps you can take right now to protect your data during the AI revolution! It won't fully 'fix the internet' as my title suggests, but it sure is a start, and you might as well lean in from here, sis. It's going to be a wild ride!
You have more rights than you think. Thanks to legislation like the California Consumer Privacy Act (CCPA), you can request that companies delete your data or stop selling it.
Start with these tools:
Need help getting started? The Security Education Companion offers step-by-step guides.
Many of us come from communities that are disproportionately surveilled. Whether you’re an organizer, a journalist, or just someone who doesn't fit the algorithm’s mold—your safety matters.
Try these:
And if you’re navigating multiple identities—queer, neurodivergent, activist—consider creating digital boundaries: separate email aliases, password managers, and app profiles for different roles you hold.
No security practice is perfect. That’s why collective care is key. Host a digital safety meetup. Share the Safe Sisters guide with your network. Talk about what you’re learning. Normalize questioning the terms and conditions before clicking “accept.”
Cybersecurity doesn't need to be a bad word, or so boring you'd rather just ignore it. It's a superpowered tool, especially for women who have historically been gatekept from technology and, as a result, from economic prosperity. And this super tool is more about culture than about becoming a cyber bouncer of sorts. As you harness your digital autonomy and sovereignty, you begin to recognize that your digital autonomy is a human right. Period.
We often hear that data is the “new oil.” But oil is finite. Data, especially when extracted from people without consent, is renewable exploitation. We need a different metaphor. Data isn’t oil—it’s story.
It’s memory. It’s labor. And we deserve to own our stories.
This is why I helped co-create Safe Sisters. It’s also why groups like the Association for Progressive Communications and the African Women’s Development Fund are investing in feminist digital infrastructure.
Because if we don’t shape this next phase of the internet, it will be shaped without us.
In a world where AI is rewriting the rules of engagement, we must root ourselves in intentional, consent-based design.
We must build systems that center human dignity—not extract it.
Final Word: Please Take This With You!
This Is NOT About Fear. It IS ABSOLUTELY About Power.
Dynamic adaptability—what my friends at INCIGHT call the mix of courage, humility, innovation, and self-awareness—is the skill of our era. It’s what makes digital resilience possible.
If you take nothing else from this post, take this:
🌱 The AI revolution isn’t coming—it’s here.
🥸 Privacy isn’t a luxury—it’s a legacy.
🛠 And your data? That's absolutely YOURS to reclaim.
Let’s build an internet that remembers women. That protects dissent. That uplifts story.
That sees data not as destiny—but as responsibility.
Together, we can flip the script.
Your Privacy Rights are Human Rights.