


The problem

Working towards an open science environment with optimal opportunities for reuse of research data can be perceived as contradictory to the adequate safeguarding of intellectual property rights (IPR) for companies that invest in public-private partnership projects and researchers who want to use their own results. This is a fallacy, because rules and legislation to protect the IPR of private parties will continue to exist in the future. In fact, project partners will be stimulated to think about their data policy. Opt-outs and careful examination by partners of what to share and what not to share within projects will ensure that private parties will still be able to profit from their investments, and researchers will still be able to use their own results. Public-private and public-public cooperation can be hindered by a lack of clarity about this issue during the transition.
The re-use of personal data for scientific purposes also needs further thought. Deeper insight into the tension between privacy and open science is needed.


The solution

Clarify IPR regimes to all parties involved in public-private partnership projects and potential new parties who are not aware of the possibilities.
Set rules and conditions for public funding of research in which open (data) is the default standard.
Implement 'privacy by design' to overcome legal and operational uncertainty.


Concrete actions

  • Research funders and the European Commission: set open data as the default standard for publicly funded research and communicate clearly that this does not equate to relinquishing intellectual property in public-private and public-public partnerships.
  • Research Performing Organisations and private partners: think actively about what to share and what not to share, and avoid automatically choosing the safest option (i.e., not sharing).
  • Research funders and Research Performing Organisations: develop and set standards on privacy by design, also in negotiations with other partners on the reuse of data.



Expected positive effects

  • Continuous engagement of private parties in public-private partnership projects;
  • New solutions, products and services, to be developed by actors who do not currently have access to the data they need, or do not even know that the data exist and could be useful for their business;
  • Privacy-enhancing conduct in research projects, thus safeguarding trust.

9 Comments

  1. Anonymous

    I have some concerns with "within projects will ensure that private parties will still be able to profit from their investments".

    This is not a goal in itself.

    It is rather more important to include a mechanism to ensure societal access to technology, knowledge and products (such as medicines) at a reasonable price! This is really important to include, because otherwise open science might have unwanted effects that are difficult to foresee right now! And as said at the meeting, making a profit is not a problem. But it was not mentioned that the profit should be reasonable and should not limit societal access.

    Suggestion (bold is added text):

    Opt-outs and careful examination by partners of what to share and what not to share within projects will primarily be established to ensure societal access to knowledge, technology and products (such as medicines) for a reasonable price, while considering that private parties will still be able to profit from their investments, and researchers will still be able to use their own results.

    If this is included, you can also add to the "Expected positive effects":

    Ensure societal access to knowledge, technology and products (such as medicines) for a reasonable price.

    This would be great, also considering the current activities of the Dutch minister Schippers. It would be a great way for Open Science to have a large additional effect and real societal impact!

    For questions, please contact me at

    mdejong at aidsfonds dot nl

     

  2. The collocation of IPR and privacy issues is a bit odd; the omission of security issues is definitely weird. (Recall the case of the Dutch virologist who created an airborne variant of the deadly H5N1 virus and had to be stopped from making the procedures public.)

    With respect to the latter two, what is being overlooked are the unforeseeable developments of new and more powerful techniques for extracting information from large data sets. (The recent success of so-called ‘deep learning’ methods in developing a programme that can beat the world’s best go player at his game is an example: until recently the game of go was considered ‘safe’.)

    What is needed are restrictions on privacy- and security-sensitive data that aim to be “future-proof”; minimally, the restrictions should be periodically tested against the latest information-extraction techniques, and revised when necessary.

     

    Martin Stokhof, on behalf of the Open Access Working Group of the European Research Council


    Proposed changes:

    3. Improve insight into issues concerning IPR, privacy and security


    The problem

    Working towards an open science environment with optimal opportunities for reuse of research data can be perceived as contradictory to the adequate safeguarding of intellectual property rights (IPR) for companies that invest in public-private partnership projects and researchers who want to use their own results. This is a fallacy, because rules and legislation to protect the IPR of private parties will continue to exist in the future. In fact, project partners will be stimulated to think about their data policy. Opt-outs and careful examination by partners of what to share and what not to share within projects will ensure that private parties will still be able to profit from their investments, and researchers will still be able to use their own results. Public-private and public-public cooperation can be hindered by a lack of clarity about this issue during the transition.

    The re-use of personal data for scientific purposes also needs further thought. Deeper insight into the tension between privacy and open science is needed, and strategies for keeping privacy-sensitive data safe against future developments in information-extraction techniques should be developed.

    In some cases there may also be a tension between open data and security, where access to research data could lead to use that presents a safety or health risk to certain groups or the population at large. Assessment of such risks should be part of relevant data management plans, and monitoring for novel ways in which data could be used that pose a security risk should be implemented.


    The solution

    Set rules and conditions for public funding of research in which open (data) is the default standard.

    Clarify IPR regimes to all parties involved in public-private partnership projects and potential new parties who are not aware of the possibilities.

    Implement 'privacy by design' to overcome legal and operational uncertainty. Make storage of and access to privacy- and security-sensitive data as future-proof as possible, with access strategies that can be revoked, and with periodic monitoring of possible threats.


    Concrete actions

    • Research funders and the European Commission: set open data as the default standard for publicly funded research and communicate clearly that this does not equate to relinquishing intellectual property in public-private and public-public partnerships. Develop a set of rules that minimise the risk of misuse of privacy- and security-sensitive data and implement those rules.
    • Research Performing Organisations and private partners: think actively about what to share and what not to share; avoid automatically choosing the safest option (i.e., not sharing), but remain vigilant to the possibility of future information-extraction tools that might create new privacy breaches, and future developments that might pose new security risks.
    • Research funders and Research Performing Organisations: develop and set standards on privacy and security by design, also in negotiations with other partners on the reuse of data.

    Expected positive effects

    • Continuous engagement of private parties in public-private partnership projects;
    • New solutions, products and services, to be developed by actors who do not currently have access to the data they need, or do not even know that the data exist and could be useful for their business;
    • Privacy-enhancing and misuse-avoiding conduct in research projects, thus safeguarding trust.

     

  3. Anonymous

    Suggest references to IPR in the various sections of this document (e.g. here, and section 2 on TDM) are cross-referenced, and attention given to making sure they are consistent and logically presented. Neil Jacobs / Jisc, UK

  4. Anonymous

    Response on behalf of Creative Commons Europe: 

    As stated, the activities to be pursued in this call to action on open science will be in service of two pan-European goals for 2020: 1) Full open access for all scientific publications, and 2) A fundamentally new approach towards optimal reuse of research data. We do not understand why Action 3 should take into consideration activities that “will ensure that private parties will still be able to profit from their investments.” The authors of this Action should clarify how such a statement (and activities suggested therein) will aid in the work toward the overarching goals for 2020, especially in light of the proposed solution to “Set rules and conditions for public funding of research in which open (data) is the default standard.”

    Gwen Franck, Regional Coordinator CC Europe

  5. Anonymous

    Training for open science should include awareness building for licensing issues and the diverse possibilities of exploitation, including alternative ways such as the creation of commons. Katja Mayer / Uni Vienna

  6. Anonymous

    2nd concrete action ("think actively about what to share"): this should be addressed not only to 'RPOs and private partners' but also to individual researchers who at the project level decide what to share and what not.

    Dagmar Meyer, Brussels

  7. Anonymous

    Regarding the alleged fallacy with respect to IPR, there might be a misunderstanding: the concern of Philips is not that our existing IPR wouldn’t be protected anymore in an open science environment, but that it would be very difficult or even impossible to protect future project results by means of IP (e.g. by applying for patents) or confidentiality once all research data from the project are out in the open.

    Jan van den Biesen, Head of Public R&D Programs, Vice President, Philips Research

     jan.van.den.biesen@philips.com

    1. Anonymous

      It is not about all or nothing, though! There are many levels of openness, and many exceptions are possible. Katja Mayer / Uni Vienna

  8. 'Improve insight into IPR and privacy/data protection' would be a more correct title for the third Action.

    Privacy is not an issue but a fundamental right (see for instance: Article 8 of the European Convention on Human Rights). In the same way as IPR is not an issue, but a set of rights.

    However, we fully agree with the statement made by Celina Ramjoué (see her comment on the Amsterdam Call for Action on Open Science), which argues that privacy/data protection should be addressed in a separate Action:

    Action 3 (IPR/Privacy):

    - IPR and Privacy/data protection (including privacy by design as a proposed solution) should be split into two actions (related to optimal re-use of research data) as these are two broad, important and different fields.