by Mary Rechtoris on June 20, 2019
The pace of technological innovation is astounding. iPhone users can unlock their phone through facial recognition technology. Researchers are working on leveraging technology to build new organs from scratch—in a petri dish.
As new technologies like these hit the marketplace, new privacy concerns and risks surface. Slow to respond to these changes, lawmakers often end up playing catch up.
Throughout his career, Bennett Borden has worked to advance technology in the law. As Drinker Biddle & Reath's chief data scientist, he helps clients assess the risks that come with new innovations even when the law isn't clear.
Learning from (and Influencing) Human Behavior
Realizing he had an interest in law, Bennett attended Georgetown Law in 2004. His timing aligned with e-discovery becoming more prominent in the legal sphere. Bennett then took a position at Gibson, Dunn & Crutcher, where he strategized on using technology to examine data.
“Regulators were requiring companies to know where their information lived. Organizations had to know how to review and produce it,” Bennett noted. “Now, we see a rise in big data which gives us a wealth of information on what it means to be human and how we act.”
Using big data, organizations make inferences about consumers based on their behavior, then use those insights to motivate consumers to invest in their products or services.
Not sold on this phenomenon? Check out your social media accounts. You'll likely see ads for products that align with your recent behavior. Dedicated to running a marathon? Your accounts may bombard you with ads for new running shoes or Lululemon shorts.
The Murky Waters of Innovation and the Law
As these data habits evolve, the influx of innovation is outpacing the legal frameworks already in place to manage concerns about major issues like privacy, security, and discoverability.
“In our western society, we are laissez-faire when it comes to bringing new products to market,” Bennett said. He added: “We encourage companies to go out and create new products or offer new services.”
Having new tools and products to choose from isn’t a bad thing, of course. But sometimes we neglect to anticipate their consequences. Ethical issues and unknown privacy concerns can wreak havoc after these options emerge.
For example, a few years ago, Mattel brought the Wi-Fi-connected Hello Barbie to market. This fancy new toy could record what children said during playtime and respond.
“There was no control on what the doll would record and what it didn’t,” Bennett explained. “They didn’t think through the potential risks.”
Many psychologists and other experts raised moral and ethical concerns about the doll. For instance, a child may disclose sensitive or harmful information to the doll. This raises a slew of questions: Should Mattel report that information to authorities? Is that disclosure credible? What are the appropriate next steps, if any?
Nearly a month after the doll’s release, two parents filed a class action lawsuit against the company. They claimed the doll’s recording capabilities were a violation of privacy.
“There was no law about what information you can put into a listening device,” Bennett said. “What did and continues to exist are legal principles on how companies should assess and mitigate risk.”
When Amazon launched its Alexa virtual assistant, it did its homework. Alexa interacts with users and responds to voice commands, an innovation that poses risks similar to those of Mattel's Hello Barbie doll. Before launching the product, however, Amazon consulted consumer privacy advocates, psychology experts, and suicide prevention specialists, among others. Another differentiator is that Alexa lets users access their data: they can log into Alexa Privacy and delete recordings at any time.
What This Means for Young Legal Professionals
Every new innovation carries some degree of risk. The question that Bennett helps his clients answer is whether that risk is reasonable. If not, they may need to go back to the drawing board to avoid costly ramifications. To best illustrate this point, Bennett referenced the Chicago Tylenol murders.
In the early 1980s, an unknown person laced Tylenol bottles with potassium cyanide. The perpetrator was able to do this because the cap was easily removable and replaceable on store shelves. Seven people died because of the poisonings.
These fatalities led to regulatory changes such as anti-tampering laws. Now, drug manufacturers must package over-the-counter substances as securely as prescription medications.
“The industry quickly became aware of that risk,” Bennett said. “Likewise, today’s organizations must assess the burden on themselves compared to the potential harm to society. If harm outweighs burden, that is when organizations get into trouble.”
This balancing act is no small task; it takes a lot of work and a lot of research. Companies are looking for experts to help them evaluate new products' potential risks. As a result, new roles are surfacing in law firms, including chief data scientists, who apply the most up-to-date tools and techniques to these calculations.
As innovation booms and the law lags behind, these professionals can ensure companies keep consumer well-being top of mind.
“Technologically proficient attorneys are critical to the practice of law,” Bennett said. “I could not do my job and help lead our firm into the future without my background in legal training and data analytics.”
Mary Rechtoris is a member of the marketing team at Relativity, where she specializes in customer advocacy.