Heather Burns, a tech policy and regulation expert from Glasgow, Scotland, is well known for her work on the digital regulations and political issues that impact web development. As a core member of the privacy team at WordPress.org, she helped create the suite of GDPR and privacy tools that now ships with WordPress, which powers over 30% of websites across the web. Heather has also initiated efforts to build a cross-project open-source privacy coalition, collaborating with the privacy teams of WordPress, Drupal, Joomla, Umbraco, TYPO3, and other platforms.

In Part 1, we explored Heather’s journey into the privacy space, how she has seen the field evolve, and her work as an advocate. Now, in Part 2, we delve into how tech teams can develop with privacy in mind, the impact of the data revolution, how consumers can get involved, and what the future of privacy might look like.

AB: What is the most important thing to remember when designing for privacy?

HB: Over the past year, I’ve learned that we’ve been tackling privacy in the wrong way. We’ve often approached it from a “what does the law require?” standpoint. A lot of projects begin with the question: “Just tell us what the law says we have to do.”

However, privacy shouldn’t be a reactive, compliance-driven obligation, something we only deal with once the law demands it. We need to integrate privacy from the start of a project, embedding it into governance and values. That means defining what privacy actually means for your project, writing it into your development guidelines, and making it as fundamental as accessibility.

By embedding privacy in the culture of your project, it becomes part of your team’s DNA, and when the next laws come along, such as the CCPA or the ePrivacy Regulation, handling them will be much easier. Now is the perfect time to do this, and if it means pausing to rethink how we approach privacy, it’s worth it.

AB: How do you see products and services changing in the future if we shift away from hoarding vast amounts of personal data?

HB: A key shift here is how we look at the user experience. When people think of GDPR, they often think of those annoying consent pop-ups. We need to work on improving the user interface and backend data handling, focusing not just on compliance but on making it easier for users to understand and control their data.

One thing I’d love to see is more development of pattern libraries—an attempt to create a universal visual language for managing data privacy. UX designers should be taking on this challenge, using open-source resources like the IF data permissions catalogue to help build systems that empower users to control their own data. It’s baffling to me that designers will argue over small UI details, like the border radius of a button, but not focus on creating a user-friendly visual language for data management.
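To make that a little more concrete, here is a minimal sketch, in TypeScript, of the kind of machine-readable data-permission record a shared pattern library could render consistently across products. The shape, categories, and field names are illustrative assumptions, not taken from the IF data permissions catalogue or any existing library.

```typescript
// Hypothetical data-permission record a shared consent UI component could consume.
// The categories and fields below are illustrative assumptions only.
type DataCategory = "contacts" | "location" | "microphone" | "usage-analytics";

interface DataPermission {
  category: DataCategory;
  purpose: string;        // why the data is collected, in plain language
  retentionDays: number;  // how long the data is kept before deletion
  sharedWith: string[];   // third parties the data is shared with, empty if none
  granted: boolean;       // the user's current choice
}

// Produce the kind of plain-language summary a shared consent component might
// display, so every product describes its data use in the same way.
function describePermission(p: DataPermission): string {
  const sharing = p.sharedWith.length > 0
    ? `shared with ${p.sharedWith.join(", ")}`
    : "not shared with anyone";
  const state = p.granted ? "You have allowed this." : "You have not allowed this.";
  return `${p.category}: used for ${p.purpose}, kept for ${p.retentionDays} days, ${sharing}. ${state}`;
}

// Example: a location permission the user has declined.
console.log(describePermission({
  category: "location",
  purpose: "showing nearby stores",
  retentionDays: 30,
  sharedWith: [],
  granted: false,
}));
```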

AB: How would an increase in personal data requests impact web development and tech projects overall?

HB: As people start requesting more of their personal data, it’s going to force us to be more conscious of what data we collect in the first place. We need to ask ourselves: why does this app need access to users’ contacts, location, or microphone? Why does a toothbrush track location? We need to recognize that what’s marketed as “innovation” might just be a form of surveillance.

The work Tap is doing is an excellent example of confronting both users and companies with the massive amounts of unnecessary data being collected and stored. For instance, I recently read about someone’s Spotify data log, which contained a record of every click they made, including the size and location of the interface window. That’s not useful user data—it’s excessive and invasive.

AB: Do you think some technologists view themselves as agnostic when it comes to developing for privacy, feeling they’re not responsible for how data is used by the products they create?

HB: I believe it’s a mix of two factors. First, there is an arrogance in the tech sector, the “move fast and break things” mentality, where some people feel they’re above the law and prioritize their work over users’ rights. That mindset is held by a small but vocal group.

However, the bigger issue is that most businesses simply haven’t been properly guided. When I started my web design business in 2007, I went to Business Gateway, Scotland’s public advice service for new businesses. They provided plenty of brochures on VAT, HR, and copyright law, but there was absolutely no mention of data protection or privacy. As a new entrepreneur you’re overwhelmed with so many things, and if privacy and data protection are never put on your radar, they slip through the cracks.

The Future of Privacy

As privacy laws continue to evolve and more consumers demand control over their personal data, tech companies and developers will have to rethink their approach. Moving away from mass data collection and focusing on transparency, user control, and ethical design will be crucial in building a future where privacy is respected, not exploited.

In the next part of this interview, Heather will dive deeper into the evolving role of privacy advocates in tech, how collaboration between projects can strengthen privacy initiatives, and what steps we can take to ensure a safer, more private digital world for all.