
August 22, 2022

Monitoring the Monitors

By Alex Ahmed and Sarah Fox

Care.com wants you to hire nannies from its platform—and then watch them constantly.

Lyn started babysitting when she was twelve. And yet, despite her nearly twenty years of experience, she recently found herself unsettled while at a nannying job: she was being recorded on camera. Playful activities like building blanket forts, she told us in an interview, became laced with anxiety. “I intentionally think about the angle of the camera,” she said. “I’ll make really bad ones, with a lot of space and holes, so that the camera can still see us.” She devoted time to “making sure my mannerisms and my posture and, like, the way that I’m speaking to these children that, if it’s recorded on the camera, that there’s nothing that would look bad.”

Lyn found her job on Care.com, a platform for connecting nannies and other care workers with new clients. Founded in 2006, it now boasts a user base of thirty-one million across twenty countries. Its sweeping size and reach grant the platform an incredible amount of power to shape domestic work arrangements. It not only attracts individuals seeking care support, but also corporations and academic institutions, which provide free and subsidized subscriptions to Care’s Premium tier to their employees through the platform’s Care@Work program.

In its own words, Care.com is “the world’s largest online destination for care. We connect families with caregivers and caring companies to help you be there for the ones you love.” This emotionally laden statement is immediately followed by a coldly formal liability clause: “Care.com does not employ any care provider or care seeker nor is it responsible for the conduct of any care provider or care seeker.” As part of this liability-avoidance strategy, Care encourages its users to surveil the care workers they hire, suggesting on a “Safety” page that clients should “continually monitor your caregiver’s activities.”

It makes sense, then, that in our extensive interviews with a diverse cohort of nannies from across the United States who use Care.com (and similar platforms like SitterCity), in-home surveillance was a recurring topic. We found that Care’s business model as an employment platform and subscription service not only explicitly and implicitly encouraged “monitoring”—which often reached uncomfortable levels for the care workers subject to it—but also made it difficult for nannies and babysitters to effectively advocate for themselves or negotiate employment conditions.

When nannies moved off the platform and into the space of a client’s home, they were often monitored through a variety of surveillance technologies and techniques—not just cameras, but also text messages, paper forms, and even the children being cared for. Cassandra told us that one of her clients expected “constant updates and documentation,” including photographs of the plates of food the children were eating. (All names have been changed to protect the privacy of our interviewees.) It made her feel as though her clients didn’t trust her to feed the children properly. Worse, “It also made my job a lot harder, because like, when I’m texting, I’m not good at multitasking,” she said. “So, like, the girls would be trying to talk to me and I’d be like, ‘Hold on. I’m trying to take a picture of your food.’”

Care.com’s marketing materials are geared towards parents, not workers. Until recently its website proclaimed, “Your safety is our priority: We are committed to helping you find a caregiver you can trust.” The platform preys on the feelings of protective parents—one of its blog posts intones that “nothing is more important than keeping your children safe from harm”—and pushes two contradictory messages. First, Care’s platform encourages parents to buy more stringent background checks, accomplished through direct contact with the carceral state, ranging in exhaustiveness from a database search of court records at one end, to a “hands-on” investigation performed by a team of “highly trained researchers and licensed private investigators” at the other. At the same time, Care’s messaging to parents implies that background checks are not enough to keep their children safe.

Regardless, nannies told us that monitoring makes them worse at their jobs, not better: it adds to the daily workload, and often prevents them from doing their job without interruption or from taking a break without feeling self-conscious. Indeed, the only party that clearly benefits from constant parental surveillance is Care, which tells families they need to monitor their nannies—painting them as suspicious by default—both discharging its own liability, and profiting from the background checks and membership fees that promise increased security.

Ongoing monitoring

The platform is split into multiple apps. On “Care.com Caregiver,” nannies, dog walkers, cleaning workers, and others can sign up, submit to background checks, fill out their profile and upload photos, and search and apply for jobs. Another app, titled simply “Care.com,” allows care seekers to create job postings and browse caregivers’ profiles. With a Premium subscription (listed at $39.99 USD monthly, $89.97 quarterly, or around $160 USD annually), users can also contact caregivers directly—a power that was abused to repeatedly send unwanted messages to one of our interviewees—and “have access to results of all background checks,” along with other “benefits.” Workers can also sign up for Premium for the same prices, granting them access to “higher ranking in search results,” and a promise that they will be “5X more likely to get hired.” Care Premium also comes with a free annual “CareCheck” background screening, which is required to begin applying for jobs. If workers choose not to sign up for Premium, the CareCheck costs $14.99 USD per year, due when they register.

Care.com’s official materials admit that background checks can be inaccurate, yet two of our interviewees discovered that the platform’s bans are decisive and uncontestable, and the platform doesn’t disclose whatever supposed violation led to the ban. In addition, the platform advises care seekers to purchase additional background checks on their applicants “at time of hire” for increased security. It notes that background checks are “not a substitute for conducting thorough in-person interviews, reference checks, online and social media searches, obtaining copies of the candidate’s identification documents, and conducting ongoing monitoring of any individual you hire.”

“Ongoing monitoring,” however, is not simply used to ensure the safety and well-being of one’s children. It’s also a retaliation tactic. Kendra told us that her client ramped up monitoring immediately after she attempted to renegotiate her pay and hours. “She had started asking more of me after that. So it was like, ‘Oh, well, you’re gonna need to step up your game.’ And it was like, ‘You need to write down every single thing that my kid does’… It was almost like a punishment… I had to write down every activity, I had to write down everything they ate, every time they went to the bathroom, every time. Like, everything.” This documentation took place on printed sheets of paper that her client supplied to her, which were inspected before she left the home “to make sure I did a good job.” She described this work as “exhausting, especially on top of having two children to watch. I would have to take time away from them to fill it out.”

Nannies sometimes don’t know the full scope of monitoring until they start a particular job—and the fact that their jobs are usually performed in clients’ homes puts them at a disadvantage in negotiations. As Marie put it: “People can act whatever way they want to on the phone,” she said. “Until you go into their house, you just don’t know.” She described one interview that stuck with her: “We’re just gonna watch you,” a parent told her. She was instructed to take their baby, a bottle, and a three-year-old downstairs while the parents observed her on a camera feed.

“There is no privacy”

Marie felt deeply uncomfortable that the family hadn’t told her before the interview that she’d be monitored. But many nannies resign themselves to surveillance, reasoning that a client has the right to do what they like in their own home. Jade told us, “I didn’t think it was all that big of a deal that they didn’t ask for consent or anything like that. Especially because I wasn’t doing anything that they would be worried about in the first place… When you step into someone else’s home, there is no privacy. That is something that you pretty much have relinquished as soon as you step into that home.” Celia adjusted her behavior for the presence of the cameras. “I used to sit down, back to the camera, and eat very fast because I’m feeling so weird,” she shared. Nevertheless, she concluded, “I know it’s not my house, you know, they can do whatever they want.”

Some interviewees quit in search of a better gig after their clients crossed a boundary. Sometimes this works, they told us; “more authentic” working relationships are possible. But at the heart of acquiescence to surveillance is the unequal status of the caregivers and the employers. Turning down needed income is not an easy decision. As Cassandra said, “Sometimes I work up the nerve to ask if they have cameras… it depends on how many offers I’m getting lately… If I’m more, like, desperate to get a job, I just won’t bring it up.”

Platforms like Care.com engage a geographically dispersed and atomized workforce, which has led some researchers to conclude that this and other challenges (such as disparities between workers using platforms to supplement their income versus those using them for full-time work) may have a sobering effect on labor organizing. But we should remember that examples of platform worker resistance are plentiful, including strikes by Deliveroo riders and UberEats drivers. Care workers already gather on Reddit and Facebook to discuss workplace issues and to share resources, job postings, and template contracts. It’s not a stretch to imagine this as a foundation for organizing that could give nannies a say on when, how, and if they are monitored while on the clock.

Alex Ahmed is a software developer at Sassafras Tech Collective, a worker-owned cooperative.

Sarah Fox is an assistant professor at Carnegie Mellon University in the Human-Computer Interaction Institute, where she directs the Tech Solidarity Lab.

This piece appears in Logic's issue 17, "Home". To order the issue, head on over to our store. To receive future issues, subscribe.