Despite the European legislator’s latest efforts to protect both natural persons with regard to the processing of personal data (meaning any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, see Article 4(1) of the GDPR) and the free movement of such data, private businesses not only massively process (collect, organize, alter, disclose, combine, erase or destroy) personal data (Pasquale, 2015: p. 56), but also view the latter as an asset (Almunia, 2012; Cohen, 2000: p. 1375; O’Neil, 2016: p. 151; Prins, 2006: p. 228; Hoofnagle, 2003; Michaels, 2008) and, thus, sell it, exchange it or even produce it (Crawford & Schultz, 2014: pp. 94-95, 98; Prins, 2006: pp. 226-230). Scholars speak of theft of humanistic property (Mann, 2000) and argue that natural persons should be paid for such processing (Laudon, 1996: p. 103). With a “single mouse-click”, users give their valid consent (Recital (32) of the GDPR) to the processing of their sensitive information. Some authors worry that algorithms discriminate against consumers when projecting the “perfect ad” (A29DPWP, 2013: p. 46), promoting the “appropriate good” at the “appropriate price” (Turow & McGuigan, 2014: pp. 27-29; EDPS, 2015: p. 19), predicting criminal behavior (Chander, 2017: p. 1026) or “evaluating” the accused before sentencing courts (State v. Loomis, 2016).
Such practices reveal the need for users to regain control over their personal data. This paper briefly examines the potential of subjecting personal data to trade secret protection rules and studies the outcome of attributing commercial value to such data.
2. Regaining Control by Subjecting Personal Data to Trade Secret Rules
If personal data were treated as trade secrets, data subjects could better control their private information without limiting the free movement of this data (Prins, 2004; Malgieri, 2017). Personal data could be protected by the rules that govern trade secrets (Samuelson, 2000), which grant the secret holder a degree of control that allows her not only to keep private information well-hidden, but also to benefit from its exploitation (Franzoni & Kaushik, 2016).
Trade secrets, or know-how (i.e. a package of non-patented practical information, resulting from experience and testing by the franchisor, which is secret, substantial and identified, see Article 1(3)(f) of Commission Regulation (EEC) No 4087/88; Article 1(1)(i) of Commission Regulation (EC) No 772/2004), are separately protected in the law and differ from industrial property rights, copyright and neighboring rights (Article 1(1)(g) of Commission Regulation (EC) No 772/2004). Under Article 2(1)(a-c) of Directive (EU) 2016/943 of the European Parliament and of the Council (hereinafter referred to as “Directive (EU) 2016/943”), “trade secret” means information which is secret (in the sense that it is not, as a body or in the precise configuration and assembly of its components, generally known among or readily accessible to persons within the circles that normally deal with the kind of information in question, see Article 2(1)(a) of Directive (EU) 2016/943), has commercial value because it is secret, and has been subject to reasonable steps under the circumstances, by the person lawfully in control of the information, to keep it secret. Thus, each “item of information” may be protected as a trade secret, given that the latter extends to commercial data, such as information on customers and suppliers (Recital (2) of Directive (EU) 2016/943). Moreover, with regard to its legal status, the right to know-how, in contrast with other intellectual property rights, does not constitute an absolute right (Lemley, 2008: p. 330). It is a stand-alone legal right capable of being separately protected. In addition, a trade secret right is not exclusive (Recital (16) of Directive (EU) 2016/943).
Under Directive (EU) 2016/943, the acquisition of a trade secret without the consent of the trade secret holder shall be considered unlawful whenever carried out by unauthorized access to, appropriation of, or copying of “data” (e.g. documents, objects, or electronic files), lawfully under the control of the trade secret holder, containing the trade secret or from which the trade secret can be deduced (Article 4(2)(a) of Directive (EU) 2016/943). The use or disclosure of a trade secret shall be considered unlawful whenever carried out, without the consent of the trade secret holder, by a person who acquired the trade secret unlawfully, or who was in breach of a confidentiality agreement or any other duty not to disclose the trade secret, or in breach of a contractual or any other duty to limit the use of the trade secret (Article 4(3) of Directive (EU) 2016/943). The acquisition, use or disclosure of a trade secret shall also be considered unlawful whenever a person, at the time of the acquisition, use or disclosure, knew or ought, under the circumstances, to have known that the trade secret had been obtained unlawfully (Article 4(4) of Directive (EU) 2016/943).
Given the above, one can conclude that when a person (a licensor) has provided “data” which constitute trade secrets to another for a particular purpose, those data cannot be used for other purposes without the licensor’s permission. At the same time, license rights may be lawfully further transferred by the licensee only if such a right to sublicense has been agreed, i.e. only if the initial licensor has given her consent. Indeed, these conclusions constitute general rules that govern trade secrecy at the international level (Samuelson, 2000: pp. 1155-1156; Byrne, 1998: pp. 210-211; McDaniel, 1986: pp. 45-47).
If it were accepted that personal data, under certain conditions, fulfill the trade secret definition, treating the right to personal data as a right to at least a quasi trade secret (Malgieri, 2017) could guarantee adequate consumer protection. Besides, one could claim that personal data ought to be secret, in the sense that they ought not to be generally known among or readily accessible to persons within the circles that normally deal with the kind of information in question. Moreover, given that personal data are potentially subject to exploitation, they ought to have commercial value, because they are secret by default, at least in Europe. A private company which is lawfully in control of personal data is by law obliged to take reasonable measures to keep those data secret. Hence, given contemporary reality, personal data “ought to” simultaneously fulfill the trade secret definition. If the licensing rules that govern trade secrets were applied to personal data processing, consumers, as licensors, would provide data to businesses for a particular purpose. Thus, it would be prohibited to use such data for other purposes without the licensor’s permission or to further transfer license rights without the initial licensor’s consent (Hon, Millard, & Walden, 2012).
Under the rules that govern trade secrets, confidentiality, and thus trade secret rights, serve as a tool for business competitiveness and innovation management (Article 5(1)(f), Recitals (39), (49) and (83) of GDPR; Recital (2) of Directive (EU) 2016/943). Hence, if personal data were treated as quasi trade secrets, a dynamic competition would very likely take place. In this context, stronger guarantees could perhaps be provided, in particular with regard to the transparency of the processing of personal data, and new, high-quality services might be offered to the end user and initial licensor. Private companies, unable to further transfer personal data to third parties without the licensor’s permission, would likely adopt new technologies to transparently and rationally process personal data for the specific purposes determined, in any case, by the licensor (Samuelson, 2000).
3. Positive Scenarios of Attributing Commercial Value to Personal Data
If personal data were governed by trade rules, individuals would likely be more aware of the value and power of their own information. This awareness is an essential prerequisite for persons to control their data.
An individual may enjoy a specific service, for instance a free e-mail service, having given her consent to the processing of her data. The company supplying the e-mail service processes the user’s data in multiple ways (by collecting it, analyzing it, correlating it, etc.). Even if the user has freely given her consent to this processing, a case in which “too much mouse-clicking” would be required (Recital (43) of GDPR), the e-mail service provided remains one and the same. Thus, one should wonder whether this single service costs, or should cost, as much as the fee against which it is offered is worth. The fee is, or should be, equal not to the value of a single data processing operation but to the value of the unlimited, multifaceted and perpetual further processing of personal data.
Could commercial value be attributed to personal data so as to enable the individual, once aware of this value (Prince, 2018: p. 22; Malgieri & Custers, 2017), to realize the “heavy costs” she pays for “free” digital services?
International institutions have recognized that personal data may be provided or used as money in exchange for the supply of digital content and digital services (Article 3(1) of Proposal of the European Commission, 2015; Report of the European Parliament, 2017). At the same time, sensitive items of information tend to become, or in practice have already become, the new currency of the Internet age. Given the countless services and the huge volume of content supplied to the consumer on a daily basis in exchange for personal data, the consumer’s data undeniably have some non-negligible economic value, the exact calculation of which has already become the subject of scientific investigation (OECD, 2013; Chirita, 2018: p. 11; EDPS, 2014: p. 9).
As some scholars argue, personal data concerning general attributes, such as age, are worth less than other, more sensitive, personal information, e.g. data concerning health, which are “worth about 26 cents per person” (Malgieri & Custers, 2017: p. 6). The value of the information “George drinks chocolate at home” must be lower than the value of the information “every Saturday at 09.00 a.m. George and his girlfriend, named Stella, drink chocolate at George’s apartment (31 rue Dedieu 987776 Villeurbanne), whilst listening to jazz”. Although under some studies or “personal data value calculators” (Steel, Locke, Cadman & Freese, 2013; Curtis, 2015; BCG, 2012) the total value of all this information may be less than a dollar per person and per use, each person provides her data constantly and on a daily basis. An individual may provide the same personal data (e.g. an e-mail address or a last name) to multiple companies, whereas each enterprise need only collect data from each person once. Furthermore, the numerous correlations of data promoted by Big Data technologies may attribute higher value to the same data, given that they may be combined in multiple ways.
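The aggregation argument above can be sketched numerically. The following Python snippet is a purely illustrative back-of-envelope calculation, not an empirical pricing model: all per-item values are hypothetical assumptions (only the health-data figure echoes the ~26 cents reported by Malgieri & Custers, 2017), as are the numbers of companies and uses.

```python
# Illustrative sketch: how tiny per-item, per-use data values add up when the
# same data are held by many companies and re-used repeatedly.
# All figures below are hypothetical assumptions for illustration only.

ITEM_VALUE_USD = {      # assumed value of one data item for one use
    "age": 0.005,
    "email": 0.01,
    "last_name": 0.01,
    "location": 0.05,
    "health": 0.26,     # echoes the ~$0.26 figure in Malgieri & Custers (2017)
}

def annual_value(items, companies, uses_per_company_per_year):
    """Yearly value of one person's data when the same items are held by
    several companies, each re-using them repeatedly."""
    per_use = sum(ITEM_VALUE_USD[item] for item in items)
    return per_use * companies * uses_per_company_per_year

# One person, five data items, held by 20 companies, each using them daily:
total = annual_value(ITEM_VALUE_USD.keys(), companies=20,
                     uses_per_company_per_year=365)
print(f"${total:,.2f} per year")  # far above the "less than a dollar" per use
```

Even under these modest assumptions, a per-use value well below one dollar compounds into thousands of dollars per person per year, which is the intuition behind the text’s claim that the consumer’s data have a non-negligible, and countable, economic value.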
So personal data have economic value, and that value is in fact quantifiable. There would be nothing unreasonable, then, in adopting rules that would enable individuals to know the value of their data and, thus, to better control its processing (Malgieri & Custers, 2017: p. 10).
All the “scenarios” cited above as examples of what might follow from subjecting personal data to trade rules would enable persons to better control the processing of their personal data. At the same time, the free movement of such data would be ensured and, thus, personal data would be transparently processed for the benefit of humanity and science.
Subjecting personal data to trade rules might also help resolve other crucial issues, such as the processing of the personal data of the deceased. Provisions on the protection of natural persons with regard to the processing of personal data do not apply to the personal data of deceased persons (Recitals (27), (158) and (160) of GDPR). Hence, any enterprise may, unconditionally, without consent and for any purpose it may wish, process a huge volume of personal data of subjects/users who have passed away, after having “ticked too many boxes” during their lifetime (Bouc, Han, & Pennington, 2016: p. 636).
If personal data have economic value, we should perhaps consider it negative that this value ends up with the private companies which collected the data while the subject was alive and which will perpetually continue to process and exploit them for profit in multiple ways. This should perhaps be deemed negative not so much because of the unjustifiable economic benefit that private enterprises earn, but because some of this value should, as a matter of justice, probably reach the subject’s heirs. These are the “loved ones”, the only persons with whom, in the offline environment, the subject shared, and to whom she entrusted, the personal information concerning her health, her personal conversations or even the fact that she “went for a walk” somewhere, sometime, during her lifetime. Individuals should have the right not to let these items of information, which may now “by a single mouse-click” be turned over to private companies, become after their death a perpetual source of income for those companies. Besides, given that the commercialization of personal data would enable them to be transferred as assets, some significant social problems could be addressed, such as the lack of financial resources of non-profit institutions pursuing vital social, educational or cultural objectives, which the data subject might wish to include in her will.
Setting aside the personal data of deceased persons and focusing on those of the living, we should ask whether the commercial value of the personal data that any individual produces just by living her life could help resolve even more important global issues, such as poverty, given that every person could, perhaps, earn some financial benefit in exchange for her data.
In the past, when it was asked “What would people do when they no longer needed to grow food to survive?”, the answer was given by the industrial revolution (Lemley, 2015: p. 513; Overton, 1996). In an age of abundance and a world without scarcity (Lemley, 2015: p. 514; Rotman, 2013), humanity could be directed towards new exchange models, in which people would be rewarded for their acts of benevolence and contribution to society. Individuals might simply find new things to do, or they might devote their time to the creation and production of knowledge, just because there would be time to devote to such purposes. Optimistic scenarios are countless and perhaps preferable, for example, to the opaque processing of personal data in favor of private businesses’ interests, without the knowledge of the data subject and, sometimes, through discriminatory procedures.
 Bouc, A., Han, S.-H., & Pennington, N. (2016). “Why Are They Commenting on His Page?”: Using Facebook Profile Pages to Continue Connections with the Deceased. Computers in Human Behavior, 62, 635-643.
 Chirita, A. D. (2018). The Rise of Big Data and the Loss of Privacy. In Bakhoum, Gallego Conde, Mackenordt & Surblyte (Eds.), Personal Data in Competition, Consumer Protection and IP Law—Towards a Holistic Approach? Berlin: Springer.
 Cohen, J. (2000). Examined Lives: Informational Privacy and the Subject as Object. 52 Stan. L. Rev., 1373-1438.
 Crawford, K., & Schultz, J. (2014). Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms. Boston College Law Review, 55, 93-128.
 Curtis, S. (2015). How Much Is Your Personal Data Worth? The Average Consumer Values Their Personal Data at £3,241, According to New Research. Telegraph.
 Directive (EU) 2016/943. Directive (EU) 2016/943 of the European Parliament and of the Council of 8 June 2016 on the Protection of Undisclosed Know-How and Business Information (Trade Secrets) against Their Unlawful Acquisition, Use and Disclosure.
 EDPS (2014). Preliminary Opinion of the European Data Protection Supervisor, Privacy and Competitiveness in the Age of Big Data: The Interplay between Data Protection, Competition Law and Consumer Protection in the Digital Economy.
 Franzoni, L. A., & Kaushik, A. K. (2016). The Optimal Scope of Trade Secrets Law. International Review of Law and Economics, 45, 45-53.
 GDPR (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation).
 Hildebrandt, M. (2007). Profiling into the Future: An Assessment of Profiling Technologies in the Context of Ambient Intelligence.
 Hon, W. K., Millard, C., & Walden, I. (2012). Who Is Responsible for “Personal Data” in Cloud Computing? The Cloud of Unknowing, Part 2. International Data Privacy Law, 2, 3-18.
 Hoofnagle, C. J. (2003). Big Brother’s Little Helpers: How Choice Point and Other Commercial Data Brokers Collect and Package Your Data for Law Enforcement. North Carolina Journal of International Law and Commercial Regulation, 29, 595-638.
 Lemley, M. A. (2008). The Surprising Virtues of Treating Trade Secrets as IP Rights. Stanford Law Review, 61, 311-354.
 Malgieri, G. (2017). Quasi-Property in Consumer Information: Trade Secrets and Consumer Rights in the Age of Big Personal Data. In M. Bottis, & E. Alexandropoulou (Eds.), Proceedings of the 7th International Conference on Information Law and Ethics, ICIL 2016, Broadening the Horizons of Information Law and Ethics. A Time for Inclusion (pp. 376-400). Thessaloniki: University of Macedonia Press.
 Malgieri, G., & Custers, B. (2017). Pricing Privacy—The Right to Know the Value of Your Personal Data. Computer Law & Security Review: The International Journal of Technology Law and Practice, 34, 289-303.
 Mann, S. (2000). Computer Architectures for Protection of Personal Informatic Property: Putting Pirates, Pigs, and Rapists in Perspective. First Monday, 5.
 McDaniel, T. B. (1986). Shop Rights, Rights in Copyrights, Supersession of Prior Agreements, Modification of Agreement, Right of Assignment and Other Contracts. AIPLA Quarterly Journal, 35, 45-47.
 Michaels, J. D. (2008). All the President’s Spies: Private-Public Intelligence Partnerships in the War on Terror. California Law Review, 96, 901-966.
 OECD (2013). Exploring the Economics of Personal Data: A Survey of Methodologies for Measuring Monetary Value. OECD Digital Economy Papers, No. 220, Paris: OECD Publishing.
 Overton, M. (1996). The Agricultural Revolution Reconsidered. In Agricultural Revolution in England: The Transformation of the Agrarian Economy 1500-1850 (pp. 193-207). Cambridge Studies in Historical Geography, Cambridge, MA: Cambridge University Press.
 Prince, C. (2018). Do Consumers Want to Control Their Personal Data? Empirical Evidence. International Journal of Human-Computer Studies, 110, 21-32.
 Prins, C. (2006). Property and Privacy: European Perspectives and the Commodification of Our Identity. In L. Guibault, & B. Hugenholtz (Eds.), The Future of the Public Domain, Identifying the Commons in Information Law (pp. 223-258). Alphen aan den Rijn: Kluwer Law International.
 Proposal of the European Commission (2015). Proposal of the European Commission for a Directive of the European Parliament and of the Council on Certain Aspects Concerning Contracts for the Supply of Digital Content.
 Report of the European Parliament (2017). Report of the European Parliament on the Proposal for a Directive of the European Parliament and of the Council on Certain Aspects Concerning Contracts for the Supply of Digital Content.
 Steel, E., Locke, C., Cadman, E., & Freese, B. (2013). How Much Is Your Personal Data Worth? Use Our Calculator to Check How Much Multibillion-Dollar Data Broker Industry Might Pay for Your Personal Data. Financial Times.
 Steppe, R. (2017). Online Price Discrimination and Personal Data: A General Data Protection Regulation Perspective. Computer Law & Security Review, 33, 768-785.
 Turow, J., & McGuigan, L. (2014). Retailing and Social Discrimination: The New Normal? In S. P. Gangadharan (Ed.), Data and Discrimination: Collected Essays (pp. 27-29). Open Technology Institute.