International Perspectives – How Parent-Daycare App Aula Is Transforming Care in Denmark’s Welfare System
Written by Victoria Andelsman Alvarez
Aula, Denmark’s official school-home communication app, may look like a practical tool, but it quietly transforms how care is performed, embedding parents deeper into digital systems of surveillance and redistributing responsibility.
“It’s merely a practical thing to organize things around being a parent.”
That’s how Jens, a father in Denmark, described Aula, a communication platform for parents with children in public daycare or school. Famously, Denmark prioritizes children’s welfare through generous parental leave and accessible early childhood education, while also being a highly digitalized country where digital media has become the primary mode of interaction between citizens and welfare institutions. From tax returns to health records, nearly all interactions with the state are now digitized.
This includes parents, who are expected to “be and act digitally,” connecting their private lives and those of their kids with public institutions through digital means and, increasingly, through mobile apps. Aula embodies this logic in childcare by centralizing communication, updates, scheduling, photo sharing, and so on. While it promises efficiency, it also brings complexities, particularly concerning privacy, autonomy, and consent.
Many, like Jens, describe Aula as a “logistical app.” However, my forty-plus interviews with 20 parents living in Denmark, conducted as part of my PhD research, reveal that Aula also fosters a sense of connected presence, offering parents a glimpse into their child’s day. This is especially reassuring in a context that encourages women to work full-time, supports dual-income households, and provides early state-provided childcare, while still expecting parents to remain consistently attentive to their children. As Laura explained: “If we haven’t seen each other for six hours, then it’s really nice to know what he actually does when I’m not there.”
Most research highlights parents’ choices regarding digital tools. Still, we need to look beyond the home to the larger systems—such as schools, governments, and tech companies—that increasingly make digital parenting feel necessary, if not unavoidable. What happens when an app becomes the primary mediator for that attentiveness? How does caregiving—and the capacity to act with care—shift when digital tools enter the already demanding lives of parents, children, and staff?
Meet Tania: Inside Everyday Parenting with Aula
Tania, a mother of twins, often opens Aula throughout the day—not just to browse photos or read messages, but to check nap times, which are logged by the daycare staff. One of her little ones is having ongoing sleep challenges, so keeping an eye on daytime naps has become a key part of making their nights smoother. Even when there are no new updates, she likes to log in just to check on the sleep data: “It’s very important for us to know how long he slept… We’re tracking to see whether the nap time was directly influenced by the night’s sleep, which it is.” For Tania, the twins’ sleep—or lack of it—structures family routines. She analyzes nap durations to identify patterns and adjusts their sleep strategy based on data from daycare staff and input from doctors.
Tania’s case illustrates how platforms like Aula don’t merely support parenting—their use reconfigures it by introducing new forms of vigilance, redefining what counts as attentiveness, and demanding that care become both data-driven and digitally legible. In this context, Aula serves as a care infrastructure embedding parental responsibility within institutional routines and digital systems. Tania’s experience highlights three interrelated shifts I want to discuss: a shift toward what may be called ‘coerced caring dataveillance’; the development of state-mediated transcendent parenting, where parents must adjust not only to the rhythms of their children but also to those of the daycare; and evolving norms of state-parent collaboration, where constant monitoring and availability define what it means to be a “good parent.”
Coerced Caring Dataveillance?
Parents, especially mothers, experience substantial pressure to respond promptly to messages to avoid seeming inattentive, which creates a sense of being monitored, not only concerning their children’s welfare but also their own digital engagement. In practice, this means being constantly “on call,” managing a stream of data from daycare to device. The result is pressure to parent in ways that are legible through Aula: digitally visible and in alignment with institutional rhythms.
Opting out is not a realistic option. Essential communication takes place via the app, and parents who do not use it risk missing critical updates. Even when parents attempted to bypass the app and share information in person—typically for practical reasons—they were redirected by staff to use the digital system.
Moreover, non-participation can be interpreted as a sign of negligence or a lack of engagement in their child’s care. These practices are, therefore, also part of what Barassi calls coerced digital participation: individuals are compelled to comply and provide their data or risk exclusion from critical areas of social life. The pressure also comes from fellow parents, something Henrik explicitly stated, saying that while he checks the app once or twice a week, “some parents will say that that is not enough”. This perception carries social consequences in a culture where strong norms around parental responsibility, social trust, and cohesion prevail.
Ultimately, if the need for logistical coordination does not entangle parents in systems of caring dataveillance, social pressure does. By caring dataveillance I mean surveillance practices that are justified through discourses of care and protection, where both children and parents are monitored through digital data flows. This, in turn, highlights the elusiveness of consent for both parents and children, necessitating a shift from generic warnings about “digital awareness” to a more systemic critique of how care and surveillance are intertwined with welfare state digital infrastructures.
Trust, Complacency, and Hidden Risks
Parents tended to shrug off the issue of surveillance, more worried about the increased labour that the app imposed. Yet, state and corporate actors rarely account for how digital media gather, combine, and share data. This makes informed decisions by even supposedly rational subjects impossible.
Paying attention to the embeddedness of parents’ digital practices in their everyday lives and worldviews also revealed that at least some participants trust that the data will not be misused—at least by state actors. Once again, the cultural and social context is key here, as the following comments by Jens exemplify:
“In China, there’s this point system… as long as these things are not everywhere here, then it doesn’t matter.”
This comment reflects a common perception among Danes that certain forms of surveillance are distant or unlikely to occur in their context. Such views are shaped by Denmark’s high levels of institutional trust, with surveys consistently showing that a majority of Danes believe public authorities act in the public’s best interest: according to the Danish Agency for Digital Government, 82% of the population aged 15 to 89 agree or strongly agree that they generally trust public digital services.
However, drawing such comparisons can lead to a false sense of security. In Denmark, datafication is framed as “caring,” but it can still normalize surveillance and erode privacy. For instance, Amnesty International’s report, “Coded Injustice: Surveillance and Discrimination in Denmark’s Automated Welfare State,” reveals that fraud detection algorithms and automated decision-making are widely used in the Danish social benefits system, with little transparency, making substantive consent virtually impossible. As a result, many individuals unknowingly surrender their privacy rights and may remain unaware that they have been flagged, unable to contest decisions that profoundly affect their lives.
The Broad Picture: Rethinking Responsibility in a Datafied Welfare State
Most research to date has focused on what parents choose to do with digital tools, primarily at home. Instead, with this article, I want to encourage parents, scholars, and policymakers to consider the larger systems surrounding parenting that push digital tools onto families. Here, Denmark’s Aula offers a lens into the evolving relationship between citizens and the welfare state, as it redistributes agency among parents, institutions, and technologies, reshaping what it means to be responsible, engaged, and caring.
The prominence of Aula, driven by the welfare state, challenges the distinction between interpersonal and institutional practices, necessitating a more integrated approach that encompasses digital parenting practices across various contexts. This perspective finds, first, that the care work entailed by using these apps extends parents’ context of agency beyond co-presence and transcends the household’s boundaries. It also goes beyond simply connecting the home and the educational institution to include, for instance, parents’ workplaces (Lim, 2019). Second, parents have little choice but to participate in at least some of the practices that cause children’s data to flow across various contexts and stakeholders, including corporate and institutional ones, not to mention across time.
When we consider the broader picture, we see how society is structured in ways that make data-intensive technologies necessary for parenting. We also recognize that care is not just a personal issue, but a political one shaped by institutions and power. This shifts the conversation to holding these entities accountable for their digital systems and data collection, advocating for data minimization, transparency, and ethical design that supports families without requiring excessive data or time.
Here’s the bottom line: Aula exemplifies how the use of mobile media apps for citizen-state relations may redistribute agency and redefine what it means to be a good citizen, or in this case, a “good parent.” As apps like Aula become normalized, it is crucial to ask: what kind of parenting does their use promote, and at what cost to autonomy, trust, and privacy?