Kristen Niven, Acquisitions Editor of Cardozo Arts & Entertainment Law Journal, Volume 33, won first place in the 2014 Honorable William Conner Intellectual Property Law Writing Competition, awarded by the New York Intellectual Property Law Association. Her note, Towards a New Model for Social Media Newsgathering: AFP v. Morel and Digital Rights in the Age of Citizen Journalism, will be published in a forthcoming issue of AELJ Volume 33.
In today’s panel on reforming U.S. cultural property policy, the panelists discussed whether a conflict exists between the Convention on Cultural Property Implementation Act (“CPIA”) and the National Stolen Property Act (“NSPA”), and whether any such conflict creates a problem. The general consensus of the panel was that while there is tension between the two acts, there is no actual conflict between the CPIA and U.S. criminal laws, which can coexist. The panelists pointed out that criminal cases do exist in the cultural property context, but that most forfeiture cases include another U.S.-based offense beyond the NSPA line of cases. Panelist Andrew Adler discussed three main sources of tension between the two acts: the disparity in the definition of the word “stolen,” the issue of repose, and a technical conflict in the burden of proof. From another perspective, panelist Michael McCullough stated that the most complicated area of concern is whether potential buyers risk criminal exposure when purchasing a piece, due in part to a lack of clarity in the NSPA.
Another interesting point briefly discussed by the panel was the suggestion that the real underlying conflict stems from how international cooperation has changed over the past 30 years. Moderator Jeanne Schroeder stated that the real conflict seems to lie between Congress circa 1983, when the CPIA was enacted, and the internationalist world of today. Although the panelists did not agree on how this could be handled in the future, they all seemed to support the proposition that the CPIA and U.S. criminal laws do not conflict with one another.
A recent controversy within the online gaming community involves the video blogger Anita Sarkeesian and her video series “Tropes vs. Women in Video Games.” Sarkeesian’s videos critique the portrayal of women in video games and related media. This controversy is particularly interesting, however, because it involves a claim of copyright infringement.
In the video and promotional materials for the series, Sarkeesian used an image of “Princess Daphne” from the game Dragon’s Lair. The “official” depiction of Daphne in Dragon’s Lair looks like this:
Sarkeesian promoted her “Tropes vs. Women in Video Games” series with a collage of female video game characters including Princess Daphne. However, Sarkeesian used an image of Daphne pulled not from Dragon’s Lair, but rather from the page of a fan artist named “Tammy” from CowKitty.net. Sarkeesian’s promotional materials featured Tammy’s artwork with her gray background and artist signature removed, as seen below:
Tammy was upset that her work was used without her permission, and threatened to sue Sarkeesian’s company, Feminist Frequency. Sarkeesian and her lawyers responded by claiming that a “remixed collage is transformative in nature and as such constitutes a fair use of any copyrighted material as provided for under section 107 of the US Copyright law.” Allegations started to fly, and Feminist Frequency was challenged as not being a legitimate non-profit.
Summary of Ryan Harkins’s presentation in a panel on “Disclosure and Notice Practices in Private Data Collection” at Data Privacy & Transparency in Private and Government Data, April 4, 2014 at Benjamin N. Cardozo School of Law.
Advances in technology have made it possible not only to visit more websites, but also to collect data from all those visits. Further, technology can now analyze that data in ways beyond the imagination of researchers, with computers catching on to trends in data that people may never have noticed.
It’s all a part of “big data,” which is basically the ability to store and process massive amounts of data at rapid speeds, according to Ryan Harkins, Privacy Attorney at Microsoft. Advances in computing have allowed machines to draw insights or identify correlations that may not have even been contemplated at the time of data collection.
And with the advent of the internet of things, “there’s been an explosion in the amount of data that’s being collected about each and every one of us,” Harkins said.
Currently, the collection and use of this data is done under the guidance of the Federal Trade Commission’s Fair Information Practice Principles (“FIPPs”), which are designed to ensure that data about individuals is processed fairly while maintaining some level of privacy for the user.
At the heart of the FIPPs is consent, Harkins said, but the notice and consent model has been weakening over time. The current regime places the burden on individuals, who are expected to read, understand, and make informed decisions on complex data activities. “It’s become exceedingly difficult for individuals to do,” Harkins said.
According to Harkins, the key is striking the proper balance between informing users of the important issues and not overwhelming them. “On the one hand, you want to be clear and comprehensive, but you want to be comprehensible and precise on the other,” he said.
But while big data has tremendous promise for societal benefit – an analysis of how email spammers mutate their algorithms to stay ahead of filters helped a study map the mutation of HIV – it also threatens to obliterate the notice and consent model altogether, Harkins said.
Harkins offered several ideas on how to unlock the potential of big data while providing privacy protection for individuals, including strengthening and adapting the notice and consent model, bolstering other principles in the FIPPs (for example, better security, transparency, and integrity requirements), and even using technology as the solution to the problem technology has created.
For more from this panel, click here.
Summary of Lorrie Cranor’s presentation in a panel on “Disclosure and Notice Practices in Private Data Collection” at Data Privacy & Transparency in Private and Government Data, April 4, 2014 at Benjamin N. Cardozo School of Law.
According to Cranor, who is an Associate Professor of Computer Science and Engineering and Public Policy at Carnegie Mellon University, part of the problem with privacy policies is the inherent user-unfriendliness of the standard block of text. Compare that, Cranor said, to a nutrition label on food, which presents information in a standardized format and language, with enough brevity to be useful at a glance but enough detail to cover all the necessary information.
Other efforts to improve privacy policies have so far been unsuccessful. Mozilla has tried using icons, but the symbols are not intuitive and are difficult to learn. The P3P format has largely been circumvented. A study showed that the AdChoices symbol, which is intended to show that the advertisement you see is being displayed based on websites you have visited in the past, is widely misunderstood, with nearly half of participants believing that clicking the symbol would take them to a site where they could buy advertisements.
A glimmer of hope, according to Cranor, can be seen in the financial privacy notices that accompany credit card bills and bank statements. In their new form, information on data collection and use is presented in a simplified chart that is much more accessible to consumers and comes closer to actual “notice” than the brick of legalese that is most privacy policies. The next step would be to provide some method for consumers to compare privacy policies alongside the description of services, to allow consumers a truly informed choice in deciding which bank to use.
For more from this panel, click here.
As its name suggests, “big data” is huge. The term refers to the collection and analysis of vast data sets gathered everywhere in the digital domain, from web searches, to social network communications, to Internet advertising, to the numerous digital sensors integrated into our daily lives. Paired with increasingly sophisticated computing intelligence, the predictive power of big data has made data an indispensable asset for businesses and the government. Generally, businesses use big data to target potential customers, while the government uses data to monitor and enforce governmental and national security policies. The omnipresent data-mining algorithms and surveillance laws used to generate predictions for businesses and the government, however, have raised privacy concerns and spurred a debate on the role of transparency in the Information Age.
Several panelists at the AELJ’s Spring Symposium on Data Privacy and Transparency addressed the virtues and vices of transparency as it relates to big data. On the most basic level, transparency promotes the democratic values of fairness, accountability, and an informed American public. Transparency reports also provide a vehicle by which individuals may raise their privacy rights before the courts without having to rely on companies or government agencies to bring privacy cases on their behalf, thus promoting efficiency and autonomy. Another virtue of transparency is the promotion of innovation. While it would admittedly be impossible to obtain consent for every way data might be used in the future, big data is collected as raw material for data-driven discoveries, so transparency could be useful in promoting innovation in how data is analyzed and how patterns and behaviors are revealed.
On the other side of the debate, one of the major vices of transparency lies in the realm of national security. The argument holds that transparency would undermine legitimate government aims. Were transparency to disclose the government’s data collection techniques, or disclose which players cooperate with the government, wrongdoers could work around the techniques or avoid interacting with those players, and thus game the system. More fundamentally, however, the argument against transparency is that disclosure only works when the information becomes embedded in decision-making, which may fail to occur with big data. When big data is compared to nutrition information, the characteristics of transparency that make nutrition labels successful are arguably absent; for example, in big data the policy purpose for the disclosure may be so unclear that transparency reports do not really affect decision-making. Despite its vices, the panelists agreed that big data transparency serves an important public function, as big data is here to stay.
Comments from a viewer at Data Privacy & Transparency in Private and Government Data, April 4, 2014 at Benjamin N. Cardozo School of Law.
In June of last year, Edward Snowden revealed to the public the lengths to which the NSA will go in the name of national security. But what interest does our nation value the most? Should we put the need to ensure national security above all else? Or should even the means of creating security be held accountable under the democratic process and the three branches of government?
Nathan Cardozo began his speech with conflicting quotations from our own government. On the one hand, President Obama has affirmed that “we can and must be more transparent”; on the other, the Department of Justice has stated that our country has an “unquestioned tradition of secrecy.” The question Mr. Cardozo posed is how both can function as legitimate representations of our government’s interests. Moreover, why should an unquestioned tradition of secrecy serve as permission for such a practice to continue? Rather, as Mariko Hirose explained, when Snowden illuminated the practices of the NSA, he cracked the door open for transparency; that transparency can lead to informed thinking, and then to accountability. This transparency, and with it accountability, must continue without leaks forcing the government’s hand.
Currently, despite public knowledge that certain practices of the NSA exist (such as the coordination between AT&T and the NSA), the NSA continues to treat these policies as state secrets. However, if, through proper democratic procedures (such as legislative action, executive action, and judicial reform), the actions taken to ensure national security are held accountable, perhaps national security and the democratic process need not be mutually exclusive interests.
Summary of Helen Nissenbaum’s presentation in a panel on “Disclosure and Notice Practices in Private Data Collection” at Data Privacy & Transparency in Private and Government Data, April 4, 2014 at Benjamin N. Cardozo School of Law.
Notice and consent might be a solution to some problems, but Helen Nissenbaum doesn’t think it is the solution to the privacy problem.
Consent is at the heart of so much privacy litigation. Nissenbaum, Director of the Information Law Institute and a Professor of Media, Culture and Communication, and Computer Science at New York University, explains that many of the issues that arise from consent stem from the blur between consent and coercion, and from the difficulty of understanding and implementing consent in practice. Websites whose privacy policies state “be sure to return to this page to make sure you have the most current version of this policy” simply highlight how thoroughly these policies are being ignored. The fact is, many companies don’t even need your consent for much of the data they are collecting.
In a way, Nissenbaum points out, notice and consent is really a sham. She thinks we waste entirely too much time focusing on notice and not enough time thinking about ways to actually limit data collection. Once data collection is limited to areas in which such data is really needed, then we can aim to obtain real, meaningful consent.
For more from this panel, click here.