Have we become so focused on technical expertise in digital tools, algorithms, and emerging technologies that we have neglected to examine their impact on society?
Digital marketing today is no longer a peripheral technical discipline. It is a central force in shaping how people understand themselves, relate to others, and navigate the world. To work in this field is to exercise influence, whether one chooses to acknowledge it or not. Yet critical thinking remains conspicuously absent from large segments of the industry.
A familiar refrain persists: “I’m just doing marketing.” The phrase suggests that communication is a neutral act, detached from social, political, and human consequences. It is not. It never has been. Communication reflects values, priorities, and power. To deny this is not innocence; it is abdication.
Every message carries bias. Every strategy encodes decisions about visibility, exclusion, and normalization. When these choices go unexamined, they are reinforced. When they are reinforced through digital platforms with global reach, their effects become systemic.
Too often, professional attention is consumed by understanding how an algorithm works, how it changes, or which communication trend is currently in fashion. Entire strategies are built around platform optimization and performance metrics, while far less effort is devoted to understanding the deeper social problems these systems interact with, or the lived realities of the populations marketers claim they want to reach. Technical fluency has come to replace social literacy.
One of the most troubling silences in digital marketing concerns the impact of social media on vulnerable populations. Its effects on children and adolescents, particularly on mental health, identity formation, and self-perception, are well documented, yet frequently minimized. Even less attention is paid to the consequences for human rights defenders, journalists, and activists, for whom online visibility often brings heightened exposure to harassment, surveillance, and targeted disinformation.
It is therefore striking that many marketing professionals dismiss as “absurd” the idea of restricting social media access for users under 16, while simultaneously reframing the issue as a matter of private family responsibility. The burden is placed almost entirely on parents, as if the effects of platform design, algorithmic amplification, and profit-driven engagement models could be contained within the home. This framing conveniently obscures mounting evidence of anxiety, addiction, declining self-esteem, and early immersion in economies of attention, consumption, and constant validation. Treating social media harm as a private matter is not neutrality; it is a way of absolving industry and institutions of collective responsibility.
The legal dimension reveals a similar pattern. Platforms are deployed, data is harvested, and processes are automated, yet many practitioners remain disengaged from the laws that govern these activities. Data protection regulations, child safety frameworks, digital rights legislation, and, more recently, the European Union’s AI Act are often treated as bureaucratic obstacles rather than as safeguards designed to limit structural abuse, particularly for those most exposed to digital harm.
The uncritical embrace of artificial intelligence follows the same logic. AI is introduced into marketing as a promise of efficiency and scale, while its geopolitical implications, embedded biases, data origins, and consequences for creative labor and civic space receive scant scrutiny. For civil society organizations and human rights defenders, such technologies can magnify existing risks when deployed without ethical and legal accountability.
The problem, then, is not technology itself. It is the persistent refusal to ask difficult questions about how that technology operates and whom it affects. Digital marketing is not an innocuous profession. It is a field that shapes behavior, perception, and public discourse at scale.
When those who work within it choose comfort over reflection, speed over understanding, and compliance over responsibility, influence is exercised without accountability, and its effects extend well beyond the industry itself.
An industry that shapes behavior, perception, and public discourse cannot afford to treat critical thinking as optional. The public consequences are simply too great.
I encourage you to:
- Care about the real-world impact of your strategy, not just its performance.
- Care about how and why you use technology, not only how efficiently it works.
- Care about ethics, accountability, and the protection of vulnerable people.
- Question what your work amplifies, normalizes, or silences.
- Remember that AI and algorithms can execute and scale, but they cannot take responsibility.
In a world of infinite outputs, care is the ultimate ingredient.
