Ryan Calo

Assistant Professor of Law

Phone: (206) 543-1580
Email:

Curriculum Vitae | SSRN author page



  • - The Seattle Times The school shooting in Marysville and its aftermath offer a stark look into how distressed teens use social media to share problems they might previously have discussed with a school counselor. Increasingly, Facebook, Tumblr and similar websites are trying to meet young people where they live. (11/1/14)
  • - International Business Times The Seattle Times is furious with the FBI after it emerged that the bureau impersonated Times journalists to install spyware on the computer of a 15-year-old bomb threat suspect. The disclosure is the latest example of a law enforcement agency masquerading online to dupe people into providing information. (10/28/14)
  • - Slate In the early days of dot-com, the law found the Internet unsettling. That a buyer in one location could access the website of a seller in any other forced courts to revisit basic questions of jurisdiction and federalism. The potential to share and edit software and other digital objects introduced novel questions of ownership and control. In the mid-’90s, a movement arose among legal academics to address these and similar challenges. The central tensions of “cyberlaw” flow from the characteristics that distinguish the Internet from prior or constituent technology such as computers or phones. (10/27/14)
  • - Science As robots take on societal roles that were once the province of humans, they are creating new legal dilemmas. (10/20/14)
  • - The Washington Post My prediction is that in fewer than 15 years, we will be debating whether human beings should be allowed to drive on highways. After all, we are prone to road rage; rush headlong into traffic jams; break rules; get distracted; and crash into each other. That is why our automobiles need tank-like bumper bars and military-grade crumple zones. And it is why we need speed limits and traffic police. Self-driving cars won’t have our limitations. They will prevent tens of thousands of fatalities every year and better our lifestyles. They will do to human drivers what the horseless carriage did to the horse and buggy. (10/14/14)
  • - USA Today Let's talk robots. Not science fiction film plots, '80s dance moves or frenetic 'they're stealing our jobs' narratives intended to draw upon readers' deep-seated anxieties -- but the realistic capabilities of robots and the influence of robotic technologies on the American workforce. (10/13/14)
  • - Buzzfeed A DEA agent commandeered a woman’s identity, created a phony Facebook account in her name, and posted racy photos he found on her seized cell phone. The government said he had the right to do that. Update: Facebook has removed the page and the Justice Department said it is reviewing the incident. (10/6/14)
  • - Slate Are robot babysitters ethical? Will the future of the Internet look like You’ve Got Mail? How can we use science fiction to inspire scientists?   (10/3/14)
  • - USA Today
    Facebook said Thursday that it will tighten oversight of research on "deeply personal topics" or that targets specific groups of people.
     
    But it did not say whether it would get consent from users before conducting research on them, nor is it clear what standards or guidelines researchers will adhere to. Ryan Calo, an assistant professor at the University of Washington School of Law, said Facebook is taking a step in the right direction.
    (10/2/14)
  • - The New York Times Facebook said on Thursday that future research on its 1.3 billion users would be subjected to greater internal scrutiny from top managers, particularly if it focused on “deeply personal topics” or specific groups of people. “This is a company whose lifeblood is consumer data. So mistrust by the public, were it to reach too critical a point, would pose an existential threat to the company,” said Ryan Calo, an assistant professor at the University of Washington School of Law, who had urged Facebook to create a review panel for research. “Facebook needs to reassure its users they can trust them.” (10/2/14)
  • - Forbes
    The pin-up page for the 2015 Corvette brags about the car’s many features, including its “industry-exclusive Performance Data Recorder,” which is like a Fitbit for the owners’ driving, collecting stats about a particular drive as well as audio and video. And there’s a bonus feature to the recorder, according to Chevrolet’s website: a “nanny cam.” “You can even capture video and data when someone else is driving the car with Valet Mode, giving you extra peace of mind.”
     
    Not exactly. Depending on which state the valet is in, it might give the Corvette owner a criminal mind. While engineers may have thought a “baby monitor” for the car was a great idea, lawyers apparently didn’t review the surreptitious recording feature closely. Last week, as first reported on Corvette Forum, car parent company GM sent out notices to dealerships and to new Corvette owners warning them not to use the feature, because it’s a wee bit illegal in some states to record someone’s expletives about how awesome driving your car is without their consent.
    (9/29/14)
  • - New Hampshire Public Radio Professor Ryan Calo speaks to New Hampshire Public Radio about his Brookings Institution paper, The Case for a Federal Robotics Commission.  (9/28/14)
  • - Ars Technica
    Ryan Calo, a law professor at the University of Washington, who has studied drone law, told Ars that the government shouldn't impose a double standard.
     
    "I'll say this: the government should not have a monopoly on drones, banning the use by the press and others while retaining the right themselves," he said. "This is an important technology and there needs to be symmetry."
    (9/27/14)
  • - Forbes While inferring what we want can save us time, make it easier for us to accomplish goals, and expedite finding things we expect will bring us pleasure, predictive technology also can create problems. Privacy scholars like Ryan Calo note that if marketers can use big data to predict when we’re susceptible to lowering our guard, they can capitalize on our vulnerabilities. A related concern was expressed when Facebook ran its infamous emotion contagion experiment. If social media companies can predict, with ever-finer precision, what makes users eager to engage with their platforms, they can design features that will manipulate us accordingly. (9/27/14)
  • - Slate
    On Thursday, the Australian Senate passed a bill that would increase the powers of domestic spy agency ASIO, giving it the ability to monitor all of the Australian Internet with a single warrant. It could also send anyone who “recklessly” discloses information that “relates to a special intelligence operation” to jail for up to 10 years. (Any operation can be considered special.) The bill is expected to pass the House, where it will be up for a vote on Tuesday at the earliest.
     
    The law will, if passed, dramatically increase the government’s powers of surveillance, but despite Abbott's reference to a "shift," it’s not necessarily inconsistent with existing Australian policy. Ryan Calo, assistant professor at the University of Washington School of Law and author of a Brookings report on why the United States needs a federal robotics commission, pointed to Australia as a country with a more deliberate, and more consistently permissive, policy toward drones, surveillance drones included.
    (9/26/14)
  • - The Economist
    WHEN the autonomous cars in Isaac Asimov's 1953 short story “Sally” encourage a robotic bus to dole out some rough justice to an unscrupulous businessman, the reader is to believe that the bus has contravened Asimov's first law of robotics, which states that “a robot may not injure a human being or, through inaction, allow a human being to come to harm”.
     
    Asimov's "three laws" are a bit of science-fiction firmament that have escaped into the wider consciousness, often taken to be a serious basis for robot governance. But robots of the classic sort, and bionic technologies that enhance or become part of humans, raise many thorny legal, ethical and regulatory questions. If an assistive exoskeleton is implicated in a death, who is at fault? If a brain-computer interface is used to communicate with someone in a vegetative state, are those messages legally binding? Can someone opt to replace their healthy limbs with robotic prostheses?
    (9/25/14)
  • - Silicon Beat
    We’ve got cars without drivers out there. Companies are testing drone delivery. Specialized robots are being used inside and outside factories. Ethical, societal and legal concerns surrounding automation and robotics abound. So a new Brookings Institution report says it’s time for a federal commission for robotics.
     
    Ryan Calo, assistant professor at the University of Washington School of Law and formerly at the Center for Internet and Society, wrote the report. He believes robotics will bring about such a profound change that a new government agency is necessary.
    (9/19/14)
  • - The Washington Post
    Why does the United States need a new federal commission focused solely on understanding our robot future? The real question is, why don't we?
     
    Ryan Calo is an assistant professor at the University of Washington School of Law, and in a new paper out from Brookings he makes the case that a new Federal Robotics Commission would help make sense of the various technology applications that separate human agency from execution. 
    (9/15/14)
  • - U.S. News & World Report
    That sounds very convenient, but it also raises questions about where data is stored and how it is used by those with access to it. Apple recently faced cybersecurity backlash after hackers stole nude photos from iCloud online storage accounts belonging to movie stars such as Jennifer Lawrence, which the company said was due to poor password protection by the users, not a data breach of its systems.
     
    “It’s a reminder that anything you put in the cloud – even things you think are gone after deleting them – can still be there,” says Ryan Calo, assistant professor of law at the University of Washington.
    (9/10/14)
  • - Tech Times Though militarized drones strike terror into the hearts of those on the ground below, Google wants its autonomous aircraft to bring hope as the micro air vehicles deliver aid to those affected by disasters. (8/29/14)
  • - The Wall Street Journal
    "I don't know that Google is much better positioned than Amazon or anyone else in terms of technology, but the company has a track record of being influential in terms of policy," said Ryan Calo, a law professor at the University of Washington who studies robotics and privacy.
     
    Earlier this year, the FAA said it didn't contemplate autonomous drone delivery, effectively grounding Google's and Amazon's ambitions for now, Mr. Calo noted. However, he said having both Google and Amazon working to change the FAA's view increased their chances of success.
    (8/29/14)
  • - The Atlantic One area where Google will almost certainly have a major impact is in shaping the regulations that ultimately govern unmanned aircraft. “To a far greater degree than Amazon, Google has a history of working with policymakers and stakeholders on technology reform,” the University of Washington’s Ryan Calo, an expert on drone regulation, said. “Think net neutrality, fair use, privacy, and recently transportation. Adding Google’s voice could have a significant effect on regulatory policy toward drones.” (8/28/14)
  • - Smithsonian Previously, in the age of the studio photo, “you had to sit there and pose. You not only had to give your consent, you had to cooperate a lot,” notes Ryan Calo, an assistant professor of law at the University of Washington who specializes in privacy issues. With a hand-held camera, a picture could be taken of you unawares. (8/25/14)
  • - Venture Beat Google’s self-driving cars are designed to exceed the speed limit by up to 10 miles per hour because stubbornly obeying the law can be a danger to everyone on the road. The legal and philosophical consequences of this fact are fascinating. (8/19/14)
  • - The Washington Post The all-new version of Foursquare, announced Wednesday, “learns what you like, leads you to places you’ll love,” and tracks your every movement even when the app is closed. “I am not surprised to see Foursquare move to passive collection of location information. It seems to be something of a trend,” said Ryan Calo, professor at University of Washington School of Law. “The concern for consumers is that Foursquare or its partners will use this information in a way that surprises and disadvantages consumers.” (8/7/14)
  • - Ars Technica Newly published documents show that the San Jose Police Department (SJPD), which publicly acknowledged Tuesday that it should have “done a better job of communicating” its drone acquisition, does not believe that it even needs federal authorization in order to fly a drone. The Federal Aviation Administration thinks otherwise. (8/6/14)
  • - The Verge Ryan Calo, a lawyer specializing in robotics at University of Washington, believes anthropomorphic bots will raise new privacy concerns. Because we treat them like semi-humans, it feels like they’re always watching us. "If our spaces become populated by these artificial agents, we’ll never feel like we have moments off-stage," he says. The way someone has chosen to program their companion bot, or the way they treat it, could also become a trove of extremely personal information, he says. (8/5/14)
  • - Business Insider
    Ryan Calo, assistant professor of law at the University of Washington with an eye on robot ethics and policy, does not see a machine uprising ever happening: “Based on what I read, and on conversations I have had with a wide variety of roboticists and computer scientists, I do not believe machines will surpass human intelligence — in the sense of achieving ‘strong’ or ‘general’ AI — in the foreseeable future. Even if processing power continues to advance, we would need an achievement in software on par with the work of Mozart to reproduce consciousness.”
     
    Calo adds, however, that we should watch for warnings leading up to a potential singularity moment. If we see robots become more multipurpose and contextually aware then they may then be “on their way to strong AI,” says Calo. That will be a tip that they’re advancing to the point of danger for humans.
    (7/18/14)
  • - Wired How the legal system would deal with child-like sex robots isn’t entirely clear, according to Ryan Calo, a law professor at the University of Washington. In 2002, the Supreme Court ruled that simulated child pornography (in which young adults or computer generated characters play the parts of children) is protected by the First Amendment and can’t be criminalized. “I could see that extending to embodied [robotic] children, but I can also see courts and regulators getting really upset about that,” Calo said. (7/17/14)
  • - Forbes If an entrepreneur started up KidSexBots-R-Us, would it be legal? Ryan Calo, a law professor at the University of Washington, thinks it might be, based on the Supreme Court’s treatment of child pornography. “What appears to be child porn, but isn’t, is not illegal,” said Calo. Making or possessing child pornography results in severe legal penalties; those who watch child porn sometimes get longer sentences than people convicted of actually molesting children. However, in 2002, the Supreme Court drew a line between child porn and “virtual child porn” where the “child” is actually a young-looking adult or a computer-rendered image. It said images that are wholly faked, no matter how realistic they were, are legal. So the law might see sex with a “virtual child” the same way. At least in the U.S. (7/14/14)
  • - NBC News
    Ryan Calo, a drone expert and assistant professor of law at the University of Washington, thinks the drones can be effective, but worries about how they might be used in the future after reports of them being rented out to agencies like the FBI and local sheriff's departments.
     
    "Once you have drones for this one purpose, you could start to use them more often domestically, and then they become part of an ever more militarized police force," he told NBC News. "That is a trend to be concerned about."
    (7/13/14)
  • - Forbes Ryan Calo, an academic at the University of Washington, was writing about corporate lab rats even before it became a hot topic of conversation. “It’s about information asymmetry,” he says. “A company has all this information about the consumer, the ability to design every aspect of the interaction and an economic incentive to extract as much value as possible. And that makes consumers nervous.” (7/10/14)
  • - Robotics Business Review
    It only lasted two minutes and thirteen seconds, but Ryan Calo’s “Big Idea” at this year’s Aspen Ideas Festival in Colorado is here to stay for a long, long time.
     
    What Calo said wasn’t overly profound; such things are difficult to pull off in two minutes—unless you’re Abe Lincoln, Shakespeare, or a Biblical prophet.
     
     
    Rather, he was making the kind of common sense that makes audiences nod in surprise agreement and then turn to one another and nod again, which in itself is a kind of profound reaction for an idea from a law professor from Seattle. But, this was Ryan Calo, and he has a habit of getting audiences to react to his ideas in that way.
    (7/8/14)
  • - Business Insider Ryan Calo, assistant professor of law at the University of Washington (and one of Business Insider's most important people in robotics), believes that robotic technology is advancing so rapidly with such heavyweight implications that the current structure of the US government will be ill-equipped to handle it, reports The Atlantic. (7/7/14)
  • - Marketplace Tech First up, Ryan Calo, Associate Law Professor at the University of Washington and an affiliate scholar at the Stanford Center for Internet and Society, talks about why companies like Facebook should be thinking about the ethics of information and consumer research. (7/7/14)
  • - Venture Beat
    University of Washington law professor Ryan Calo has recommended the creation of “Consumer Subject Review Boards” to review the research of private companies, akin to the Institutional Review Boards (IRBs) already standard at every major university.
     
    I met Professor Calo last week at the Atlantic Aspen Ideas festival; he later wrote to me, “I think Facebook would have fared better under this regime because they would have had a set of established criteria as well as a record of when and why it was approved.”
    (7/5/14)
  • - The Seattle Times A woman alarmed by a drone flying around her Seattle high rise unknowingly launched a Portland business owner into a futuristic world of drones saddled with confusing policies. Now, Joe Vaughn could face a $10,000 fine for commercially flying his 25-pound drone. (7/5/14)
  • - The New York Times
    Ryan Calo, an assistant professor at the University of Washington School of Law who studies technology policy, has called for companies that conduct experiments on their users to create “consumer subject review boards,” a kind of internal ombudsman who would assess each proposed experiment and balance the potential risks to users against the potential rewards.
     
    “There’s enough pressure and understanding of this issue that these firms are going to have to come up with a way to make the public and regulators comfortable with experimenting with consumers,” Mr. Calo said.
    (7/2/14)
  • - Forbes
    When a technology company behaves badly, you hear one defense brought up repeatedly: they could have done so much worse. When Google decided that they would use your face in their advertisements, you shouldn’t have been outraged, you should have been relieved they didn’t tell everyone your darkest secrets. The message is, given what they know about you, you should be grateful that they treat you as well as they do.
     
    Meanwhile, there is an arms race to delve deeper into your personal information to make it actionable. While the last ten years were focused on how to collect as much information as possible, the next will be focused on how to turn that information into action. Legal scholar Ryan Calo argues that we need to watch out for “digital market manipulation” here – where companies use your background, details, and emotional state to coerce you into buying products you don’t need or paying higher prices than you normally would. He’s got a point; knowing and influencing your emotional state can be a major advantage in getting your attention, a factor that influenced Facebook to undertake this study in the first place.
    (7/2/14)
  • - Slate
    The government plans to use facial recognition and iris scanning to check foreigners’ visa status as they’re leaving the United States, according to Nextgov. At a new biometric testing center in Upper Marlboro, Maryland, government officials will spend the next eight to 12 months working on the technology and its application ahead of its premiere in 10 major airports by 2015.
     
    Ryan Calo, assistant professor of law at the University of Washington and a privacy expert, told me that he’s concerned with how facial recognition technology could judge the mental state of exiting passengers. “What I worry about with biometrics is the capacity to tell things like: Is this person nervous? Are they lying? … I worry about too closely studying human subjects at the borders, in or out,” he says. There are currently technologies that can register your emotion using facial recognition, and the new DHS program could include such abilities.
    (7/1/14)
  • - Mashable
    Facebook manipulated the News Feeds of hundreds of thousands of people to see if showing them mostly positive or negative posts affected their emotions. The research ignited anger among users, who accused the company of manipulation in the guise of science. But did Facebook actually break any laws? Mashable talked to law professors to separate fact from fiction. Several factors have to be considered when judging whether Facebook broke any laws. First of all, Facebook's terms of service (which the company calls its Data Use Policy) makes it clear that, when creating an account, a user consents to his or her data being used for "research" — although what kind of research is unclear.
     
    Ryan Calo, a privacy expert and law professor at the University of Washington, told Mashable that the study may be "creepy" but not necessarily in violation of any privacy law.
    (7/1/14)
  • - Forbes This weekend, the Internet discovered a study published earlier this month in an academic journal that recounted how a Facebook data scientist, along with two university researchers, turned 689,003 users’ News Feeds positive or negative to see if it would elate or depress them. The purpose was to find out if emotions are “contagious” on social networks. (They are, apparently.) The justification for subjecting unsuspecting users to the psychological mind game was that everyone who signs up for Facebook agrees to the site’s “Data Use Policy,” which has a little line about how your information could be used for “research.”  (6/29/14)
  • - The Atlantic Law professor Ryan Calo believes that robots are soon going to constitute a more abrupt departure from the technologies that preceded them than did the Internet from personal computers and telephones. Robotic technology is changing so fast, with such significant implications, that he believes the federal government is ill equipped to regulate the society we'll soon be living in. Hence his Friday pitch to an Aspen Ideas Festival crowd: a new federal agency to regulate robots. (6/28/14)
  • - The Wall Street Journal It’s a confusing time for those deciding whether to take a chance on law school. The odds of a law-school graduate landing a job at a large law firm have improved since the recession days, but the total number of available positions is still far lower than it was four years ago, as WSJ’s Jennifer Smith reported this week. (6/27/14)
  • - The New York Times
    Ryan Calo, an assistant professor at the University of Washington School of Law who specializes in robotics and drones, told me that the worry about drones colliding in the air, or people being hit by them, will start to ease as drones become smarter.
     
    “The next generation of drones, which are truly autonomous and can navigate using sensors and code, rather than people controlling them, will be much safer than the drones we’re seeing today,” Mr. Calo said in a phone interview.
     
    As Mr. Calo and others have pointed out, it is unlikely that drones will be permanently banned for commercial services in the United States. It’s only a matter of time before these vehicles are safe enough and powerful enough to deliver packages.
    (6/25/14)
  • - The New York Times
    But Ryan Calo, an assistant professor at the University of Washington School of Law, who specializes in robotics and drones, said the accidents that were occurring from private use of drones would become less common as the vehicles became safer and more autonomous. For now, fly with caution.
     
    “From a product liability standpoint, it’s pretty straightforward,” he said. “You buy this thing, you fly it, it’s likely your fault if something goes wrong.”
    (6/25/14)
  • - Business Insider
    Law has to keep up with new technologies, and Ryan Calo has his eye on robot legalities, particularly with respect to policy and ethics.
     
    For example, Calo was quoted in this New York Times piece titled "When Driverless Cars Break The Law." Spoiler alert: it's complicated. "Criminal law is going to be looking for a guilty mind, a particular mental state — should this person have known better? If you’re not driving the car, it’s going to be difficult," he said.
     
    We need someone to think ahead towards what we haven't thought about yet, and Calo is so far the guy when it comes to the intersection of robots and the law. "Ready or not, robots are racing into our lives," he told the Wall Street Journal. "But for most people, the first time they’re going to really notice those robots...is when the systems go bad."
     
    (6/23/14)
  • - Wired
    License plate-derived intelligence is not new. Governmental agencies and law enforcement can easily collect license plates and link them to their owners to glean information about where and how we drive. With DiDi Plate, the threat is that the public would have access to information reserved for government officials, says professor Ryan Calo, a privacy and technology expert at the University of Washington.
     
    “The difference is that it’s in private hands,” he says, “with the opportunity to contact you.”
    (6/20/14)
  • - Geekwire
    A motto of the NSA, as revealed in the documents released by Snowden, is “Collect it all.” Set aside any discussion about when and whether the data collection is justified. When one side has a lot of it, and the other none, there’s a problem. The best argument I’ve heard for this comes from University of Washington professor Ryan Calo, who wrote a paper on the data collection being done by marketers and corporations. In a healthy consumer/marketer relationship, he argues, consumers have tools to resist marketers’ pull. When corporations can collect and exploit vast amounts of consumer data, they can nullify many of those tools, rendering consumers too weak for their own good.
     
    (6/19/14)
  • - KUOW Marcie Sillman talks to University of Washington law professor Ryan Calo about the Federal Aviation Administration's decision to allow BP to use drones in Alaska. (6/11/14)
  • - Marketplace "It was about consumer convenience," says Ryan Calo, a professor of internet and privacy law at the University of Washington. "The idea is that you drop a little file on a person’s computer and then you know them again when you see them." (6/4/14)
  • - Wired It’s quite clear: for most people, the link between government surveillance and freedom is more plainly understood through cars than through personal computers. As more and more objects become connected to the Internet, these questions will grow in importance. And cars in particular might become, as Ryan Calo puts it in a 2011 article on drones, “a privacy catalyst”: an object giving us an opportunity to drag our privacy laws into the 21st century, an object that restores our mental model of what a privacy violation is. (5/30/14)
  • - IEEE Spectrum
    S.K. Gupta, a roboticist at the University of Maryland, College Park, notes that cellphones were invented for people to talk, but we have found many new uses for them. “I believe that the same thing is likely to happen for home robots. Initially people will be interested in getting robots at home to help with basic household chores, but soon they will find new uses for these robots.”
     
    For this to come about, companies making domestic robots will have to give up control in the name of openness. How soon that will happen is unclear, though. A big impediment, argues Ryan Calo, a professor of law at the University of Washington, is the possible legal liability such openness would engender. People have come to expect their personal computers to sometimes act a bit buggy with third-party software, and as a consequence lawsuits are rare. But if personal robots ever went haywire, it’s likely that their owners would sue the manufacturer for damages.
    (5/29/14)
  • - Le Monde Listen closely and you could almost hear their buzzing. Civilian drones already monitor SNCF rail lines, feed television news reports, capture hikers’ memories, spot heat loss from homes and, tomorrow perhaps, may deliver books or pizzas to your door. (5/22/14)
  • - The New York Times
    There is little doubt that the technology behind driverless cars is nearly advanced enough for mainstream use. Google plans to make its biggest public display yet of its cars on Tuesday, when it takes reporters on spins around Mountain View, Calif. Carmakers like BMW and Toyota are also preparing to sell cars that drive themselves.
     
    Instead, the bigger question about driverless cars is a legal one. Who is responsible when something goes wrong?
    (5/13/14)
  • - Live Science When Raphael Pirker needed overhead shots for a commercial he was filming at the University of Virginia, instead of spending thousands of dollars to rent a helicopter, he attached a camera to a 5-lb. (2.3 kilograms) model airplane, creating a custom drone to capture high-flying aerial views of the campus. A year earlier, the 29-year-old photographer piloted a similar drone around the Statue of Liberty in New York, buzzing the monument's iconic crown and recording stunning close-up views of Liberty Island and downtown Manhattan. Drones have been used by the military for decades, but Pirker's videos offer a glimpse of just one possible way these robotic fliers could be used in the future.  (5/1/14)
  • - Ars Technica
    As such, robots are also affecting our society, law, and culture. At the 2014 “We Robot” Conference at the University of Miami that just wrapped up (April 4 to 5, 2014), scholars gathered to discuss a number of legal, ethical, and moral questions related to emerging robotic technologies. Conference topics ranged from considerations of regulatory schemes for domestic drone oversight to an ethical guide to human/robot interactions.
     
    At the conference, cyberlaw professor Ryan Calo discussed his forthcoming paper "Robotics and the New Cyberlaw." Internet law defined the vanguard of cyberlaw issues in the late 1990s and early 2000s, but Calo argues that the next wave of legal showdowns will relate to robotics, which have an altogether different set of essential qualities when compared with the Internet.
    (4/7/14)
  • - NBC News
    In the United States, someone injured by a small drone would have a strong case against the person remotely flying it, even if the injured party was simply startled by the drone and fell down, Ryan Calo, an assistant professor of law at the University of Washington, told NBC News.
     
    It’s not that different from lawsuits involving any other product. The story might be different, however, if the drone were hacked.
     
    “Then the person who hacked the drone would be responsible, not the operator,” Calo said. “The person flying it could be off the hook then. But it would be the operator’s obligation to prove it.”
    (4/7/14)
  • - C-SPAN
    Law Professor Ryan Calo presented his paper on the intersection of robotics and cyberlaw and what the future may hold for the two disciplines. He was joined in discussion by Professor David Post.
     
    “Robotics and the New Cyberlaw” was a panel at “We Robot 2014,” an annual conference on legal and policy issues relating to robotics hosted by the University of Miami School of Law at the Newman Alumni Center in Coral Gables, Florida.
    (4/5/14)
  • - Los Angeles Times The Federal Trade Commission and California Atty. Gen. Kamala Harris say that Facebook is misinterpreting how a children’s privacy law applies to teen privacy in a move that could undercut the giant social network in a federal court case in California. University of Washington law professor Ryan Calo said it was unclear what effect the FTC and the attorney general weighing in would have on the case. (3/24/14)
  • - NPR All Things Considered
    Imagine using image recognition when a drone is flying in the air and matching faces against faces on a kill list, he suggests. If a robot like that made a mistake, who would be responsible? The programmer? The manufacturer? The military commander who launched it on its mission?
     
    "It forces us to confront whether we really control machines," says Ryan Calo, a law professor at the University of Washington. Calo says these tensions won't just play out in the military, but will crop up whenever we are tempted to allow robots to make decisions on their own.
    (3/21/14)
  • - Los Angeles Times “If you want to surreptitiously record someone, there are much better things than Glass,” University of Washington law professor Ryan Calo said. “The reason that this is elevated to a national conversation is precisely because we are moving from handheld to wearable devices, and this is part of the growing pains we are seeing around that.” (3/18/14)
  • - Forbes “The judge noted in passing that the FAA’s public communication around defining UAS [unmanned air systems] was technically defective. He didn’t rely on this alleged defect—rather, he said even in talking about UAS, the FAA excluded modelers like Pirker again,” says drone law expert Ryan Calo. “Obviously some drones are subject to FAA regulation. Delta wouldn’t be able to remove pilots from its 747 and suddenly be free of FAA regulation. I imagine something like a Predator B would also clearly qualify as an aircraft without additional FAA regulation. The question is where the line is. I think would-be commercial operators like Amazon or Tacocopter should hold off because the law is uncertain.” (3/17/14)
  • - Los Angeles Times A confrontation at a San Francisco bar involving a Google Glass tester points to the public's growing concern over the invasive nature of new technologies such as wearable gadgets and drones. (2/28/14)
  • - NPR All Things Considered
    Social media monitoring started in the world of marketing, allowing companies to track what people were saying about their brands. But now, with software that allows users to scan huge volumes of public postings on social media, police are starting to embrace it as well. Ryan Calo, a professor at the University of Washington law school who specializes in privacy issues, says police could run into trouble searching on the Internet.
     
    "If officers were [scanning social media] on the basis of gender and then making decisions on that basis, you could run into constitutional scrutiny," Calo says. "And you'd be almost sure to if your keyword involved the word 'Muslim.' "
    (2/28/14)
  • - Public Radio International
    As companies gather more digital data about potential customers, they have the ability to use that information to charge different prices to different users or steer different users to different offers.
     
    Ryan Calo, a law professor at the University of Washington, calls this the “mass production of bias,” in which companies use personal data to exploit people’s vulnerability.
    (2/28/14)

Last updated 5/5/2014