We've made new rules to protect our families. We must protect kids' privacy too.

Editor's Note: (Leah A. Plunkett is the author of SHARENTHOOD: Why We Should Think Before We Talk About Our Kids Online. She is an associate dean and professor at University of New Hampshire Franklin Pierce School of Law and a faculty associate at the Berkman Klein Center for Internet & Society at Harvard. The opinions in this commentary are her own. View more opinion at CNN.)

(CNN) The terms and conditions of our lives have changed beyond recognition in recent weeks. It's time the terms and conditions provided by digital technology companies be rewritten to match.

Consider the deal we've made with most tech companies: we give up our private information and they give us free or low-cost digital services in exchange for using this data however they want.

On any given day, this deal is dishonorable. During the Covid-19 pandemic, when children and their parents are relying on the internet more than ever, it's immoral. Data that is collected from and about our children, used by tech companies, and shared with third parties can have a serious impact on their future. Tech companies should safeguard our children's privacy by stopping these invasive practices.

From school to sports to social activities and so much more, we are scrambling to get our kids online in order to adapt to our new reality. We are logging in to countless platforms to bring the world into our homes. In doing so, we're also giving out our children's personal information at an ever-increasing rate.

In disaster mode, we're giving our kids devices, asking them to talk with smart assistants, and doing so much more without having the capacity to read, let alone understand, what we're agreeing to. And as we find ourselves homebound, we are sharing private information about our kids online, in addition to setting them up to share their own information from social media to smart devices and beyond.

The risks are high when it's unclear who is getting that information or what they're going to do with it; marketing companies can use the data from our children to create targeted ads, while data brokers have been known to collect data about children as young as 2 years old.

Of course, there are people and institutions we want to allow digital access to our children's information. Take remote learning, for example: we want our children's teachers and schools to see when they've logged in, what progress they've made and where they might be struggling. This allows teachers to stay engaged with their students and address any potential problems, whether it's in real-time classrooms or recorded modules, discussion posts, or online assignments. Without information sharing, remote learning fails.

But remote learning carries key risks: we don't want trolls or hackers to get access to our children's private information. That can happen when unauthorized third parties harass virtual classrooms in real time (a recent outbreak of Zoombombing led the New York Attorney General's office to open an investigation into Zoom's privacy practices, prompting the company to update its privacy policy and announce new safeguards) or break into educational data repositories (as happened to educational software maker Pearson in fall 2018; the company says it found and fixed the vulnerabilities once they were discovered).

We also don't want any of our children's personal information to be collected, used, or shared by ed tech providers or affiliated third party providers, beyond what is necessary for remote learning. This private information can be used by schools to digitally monitor students for potential safety risks and share this information with law enforcement, according to a report in Education Week. Data brokers can obtain this information and use it to build profiles of students based on ethnicity, affluence and lifestyle for marketing purposes, as a study by the Center on Law and Information Policy at Fordham University School of Law found.

Their information can also be aggregated, analyzed, and re-shared by data brokers with future gatekeepers like colleges, insurance companies, or employers. Information sharing -- which happens across all types of digital tech, not just ed tech -- can have serious effects on our children's opportunities, as data is being collected and used to shape their destinies.

Under federal student privacy law, schools that use digital tech to handle children's "personally identifiable information" (PII) without getting parental consent up front should have contractual protections in place so that the digital tech provider is obligated not to re-share PII or use it for purposes other than facilitating remote learning. Even under normal circumstances, however, it may be difficult or impossible for schools and school districts to negotiate these complex student-data arrangements -- understandably, they do what we do in our homes: click or swipe to accept boilerplate terms and conditions that typically don't offer meaningful privacy protection.

Given the pressure the pandemic has placed on our educational institutions, they are even less equipped to focus on best practices for privacy protection or even privacy law compliance.

As parents, we're even worse off than schools -- even if we do read all the fine print, we have zero ability to negotiate for stronger privacy protection. We can take or leave digital tech -- and right now, the vast majority of us can't leave it.

In the past, parents have had to rely on the government to enforce privacy laws. This past fall, the Federal Trade Commission and the New York Attorney General reached a record-high settlement with Google and its subsidiary YouTube over the companies' alleged violations of children's digital privacy laws. At the time, Google said in a statement, "We know how important it is to provide children, families and family creators the best experience possible on YouTube and we are committed to getting it right," affirming that it would "limit data collection and use on videos made for kids only to what is needed to support the operation of the service."

We still need federal and state agencies to serve as watchdogs, but it's understandable that resources are spread thin right now. That leaves parents as both the first and last lines of children's privacy defense. Because parents have little bargaining power, what happens now with children's digital privacy is largely in the tech companies' hands.

Tech companies should switch their default settings so that they do not collect or share any information beyond what is necessary to provide the service or product being advertised. If we're using a fitness tracker to get our kids to exercise, for example, the data about our children's health should stay between the user, the device provider, and any third parties necessary to deliver the data we need.

Tech companies should also be more transparent about their privacy policies and take a more proactive stance when it comes to safeguarding our children. When we blindly click "accept," we should be agreeing to terms and conditions that guarantee our children's data will not be used for marketing, advertising, product development and profile building, or sold to data brokers or similar entities.

If tech companies want to continue to engage in these activities, they should clearly itemize each specific activity and its purpose, including what data will be collected and which third parties are involved before asking for our explicit consent.

Many of us are currently stuck at home, relying on different tech platforms to maintain a semblance of our lives; tech companies should protect the privacy of our children, and their ability to explore, thrive, and grow at this time.
