Prevention is always better than cure, and tokenization has proved to be one such method of preventing data breaches. The author of the present article aims to give an all-encompassing explanation of tokenization in this fast-paced world. The international view on tokenization has also been stated in the present article. With rising cybercrime all over the world and the recent RBI mandate, which created a lot of confusion in the minds of customers and merchants, the importance and advantages of tokenization have been explained in brief.
Keywords: tokenization, data breach, cybercrimes.
In today’s tech-savvy world, protecting data is highly crucial; one needs various technologies and processes to ensure complete security. Tokenization is one such way to protect oneself from a data breach. Tokenization is a process wherein sensitive data is replaced with non-sensitive substitutes known as tokens. A token has no exploitable value of its own; it merely stands in for valuable information. Consider the game of poker: the chips have no real value of their own, yet they matter to the game because they represent money; similarly, tokens are valuable because they represent sensitive data. In payment tokenization, the sensitive information that a token replaces is stored securely in a token vault. The term is also used in the blockchain world, where a token represents a set of rules encoded in a smart contract. Each such token belongs to a specific blockchain address, a unique sequence of letters and numbers, and tokens of this kind are essentially digital assets. Platform tokens, governance tokens, utility tokens, and non-fungible tokens (NFTs) are some of the many types of tokens on a blockchain. Unique tokens can be used as a means of tracking and tracing for a more transparent product lifecycle (e.g., food, pharmaceuticals), and even access to services can be tokenized, e.g. letting the holder use a car-sharing platform once a specific token is acquired. Organizations can issue a limited number of tokens for licensing digital content or other digitized resources (Zhou et al. 2019).
Tokens have proven to be one of the safest ways to protect sensitive information because there is no mathematical relationship between a token and the value it represents; unlike encrypted data, a token cannot be reversed or deciphered without access to the vault. Depending on where they are used, tokens can be single-use, for tasks that don’t require retaining information, or persistent, for cases that do, for example, a returning customer’s Mastercard number that must be stored in a database for recurring transactions. One of the key features of tokenization is that even if the tokenized environment is breached, the original data is not compromised.
Tokenization is usually done where a breach is feared or where one is dealing with sensitive information such as card details, transaction details, or criminal records. It is an advanced technology that pseudonymizes data and protects an individual’s sensitive information. Tokenization helps companies achieve PCI DSS (Payment Card Industry Data Security Standard) compliance by reducing the amount of PAN (primary account number) data stored in-house. Instead of storing sensitive cardholder data, the organization is only responsible for handling tokens, which makes its task much easier. Less sensitive data automatically translates into fewer compliance requirements, which in turn leads to faster audits.
Detokenization is the process that exchanges tokens for the original data. It can only be performed by the original tokenization system; there is no other way to detokenize a token. Tokenization is not a security framework that prevents hackers from entering an organization’s systems and data frameworks; it is merely a technology that safeguards sensitive information. A properly designed and implemented cloud tokenization platform can remove sensitive information from vulnerable systems, preventing hackers from getting hold of any data that could be used against an individual or an organization to inflict monetary loss.
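The tokenize/detokenize cycle described above can be sketched in a few lines of Python. This is a minimal illustration only: the vault here is an in-memory dictionary, and the class and method names are invented for the example; a real payment vault would be an access-controlled, encrypted data store.

```python
import secrets

class TokenVault:
    """Minimal sketch of a token vault: maps randomly generated
    tokens to the sensitive values they stand in for."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, pan: str) -> str:
        # The token is random, so it bears no mathematical
        # relationship to the card number it replaces.
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the system holding the vault can reverse the mapping;
        # the token alone reveals nothing about the original value.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"
assert vault.detokenize(token) == "4111111111111111"
```

Note that a breach of a system storing only the tokens exposes nothing useful, which is the property the text describes: the sensitive data is compromised only if the vault itself falls.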
TOKENIZATION IN INDIA
With only a small section of India’s population being Internet-literate, tokenization is a relatively new concept in the country. Few Indians own payment cards, and many of those who do lack in-depth knowledge about them, making them easy targets for cybercriminals. These cybercriminals often take advantage of people’s ignorance and ask them to give out sensitive card information; gullible victims often incur huge monetary losses, and cybercrime in the country is steadily increasing. Although various statutes such as the Information Technology Act, 2000 provide protection to victims and strict punishments for offenders, the advanced technology involved makes it very difficult to investigate cybercrimes and apprehend cybercriminals.
India is home to a large number of small and medium enterprises; SMEs’ lack of security is exploited by cybercriminals, who use automation to attack thousands of applications simultaneously. Tokenization as a Service (TaaS) addresses the financial constraints of SMEs by rapidly deploying cloud-based tokenization solutions, and this is driving the expansion of the SME tokenization market. Cybercriminals also target the BFSI (Banking, Financial Services, and Insurance) industry, as it deals with huge amounts of money. To combat this, the BFSI industry is always looking for cutting-edge payment security products and services to protect its employees and customers.
Realizing the importance of tokenization and observing the malpractices happening in the country, the Reserve Bank of India issued a mandate on tokenization that came into effect on 1 October 2022, emphasizing its importance and encouraging more and more customers to tokenize their card information. The Reserve Bank is also planning to give already-tokenized card information to consumers, with only their banks and the Reserve Bank having the authority and the systems to detokenize these tokens and access the underlying card information. With this move, the central bank wants to ensure that fintech platforms and merchants cannot store any consumer’s card information, which will make hacking into fintech and merchant platforms unfruitful for cybercriminals. Although the move has been criticized by many merchants and consumers, because various e-commerce platforms such as the Apple Store are finding it difficult to adjust to the newly introduced tokenization, this bold step by the RBI is the need of the hour for ensuring cyber protection throughout the country.
INTERNATIONAL PERSPECTIVE ON TOKENIZATION
Countries all over the world are realizing the importance of data morphing for data sovereignty and compliance; tokenizing data and utilizing the transparency features of the blockchain are proving very helpful. The major advantages of tokenization are security, privacy, democratization, monetization, decentralization, and transparency, all of which are essential for the development of a country’s citizens.
When data is tokenized, it has the potential to become a promising asset class. The use of blockchain makes it possible to value data transparently, so that data can be monetized and honest data contributions can be incentivized. Goldman Sachs predicts that cryptocurrencies and blockchains will be essential components of the data economy’s infrastructure. Providers can monetize all or a portion of their data, since they retain control over the entire set.
Superpowers like the USA, China, and Russia recognize the importance of decentralization and monetization of data and are therefore willing to invest hefty sums in blockchain technologies such as tokenization. Since the Chinese language differs from those of most other countries, China has found different ways to morph its data, using methods such as sub-character tokenization, which has proved essential to pre-trained language models (PLMs). Existing tokenization techniques for Chinese PLMs typically treat each character as a single token; however, they overlook the sub-character level of the distinctive Chinese writing system, where additional linguistic information exists below the character level. Sub-character tokenization encodes the input text by converting each Chinese character into a short sequence based on its glyph or pronunciation, and then uses subword tokenization to build the vocabulary from the encoded text (Si et al. 2021). The entire process of tokenizing a language like Mandarin is thus very different from that of most other languages because of the difference in script.
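The pronunciation-based variant of this idea can be illustrated with a toy sketch: each character is first transliterated into a pronunciation string, and the encoded text is then split by an ordinary subword tokenizer. The tiny pinyin table and subword vocabulary below are hard-coded purely for illustration; a real system uses a full transliteration scheme and learns its vocabulary with an algorithm such as BPE.

```python
# Toy sketch of pronunciation-based sub-character tokenization.
# PINYIN and SUBWORDS are invented demo data, not a real lexicon.

PINYIN = {"中": "zhong1", "国": "guo2", "文": "wen2"}  # char -> pinyin
SUBWORDS = ["zhong", "guo", "wen", "1", "2"]           # toy vocabulary

def encode(text: str) -> str:
    """Step 1: convert each character into its pronunciation string."""
    return " ".join(PINYIN[ch] for ch in text)

def subword_tokenize(encoded: str) -> list[str]:
    """Step 2: greedy longest-match split against the toy vocabulary."""
    tokens = []
    for word in encoded.split():
        while word:
            for sub in sorted(SUBWORDS, key=len, reverse=True):
                if word.startswith(sub):
                    tokens.append(sub)
                    word = word[len(sub):]
                    break
            else:
                raise ValueError(f"cannot tokenize {word!r}")
    return tokens

print(subword_tokenize(encode("中国")))  # ['zhong', '1', 'guo', '2']
```

The point of the sketch is that the model now sees units below the character level (syllables and tone marks here), which is exactly the additional linguistic information the character-per-token approach discards.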
These countries, being fundamentally capitalistic, realize that data breaches are a huge threat to their respective positions in world politics and are therefore actively introducing laws that help citizens get their data morphed more easily.
From a European regulatory standpoint, the Markets in Crypto-Assets Regulation (MiCAR), which will come into effect at the end of this year, is crucial to understanding the position of tokenization in European countries. All crypto assets are subject to regulation under MiCAR unless they are already subject to a different regime, such as the Markets in Financial Instruments Directive (MiFID). EEA member states like Germany and France regulate crypto-asset service providers, while the Netherlands, Luxembourg, and Liechtenstein only require self-obligatory registration. Within the EEA, the documentation requirements for a securities prospectus for the issuance of securities (including tokenized securities sui generis) are fairly uniform. For fundraising purposes, an issuer may tokenize data, request regulatory approval for the relevant securities prospectus, and passport it to other EEA member states.
AUTHOR’S OWN VIEW
In this fast-paced world, with so much information being stored and transferred every day, tokenization has the potential to become part and parcel of our lives. With growing cybercrime rates all over the world, safeguarding data has become highly crucial, and tokenization has proved to be an effective tool for reducing data theft in the modern era. Many countries have guaranteed their citizens the right to privacy; tokenization is an effective way to ensure that this right can be exercised. It is essential to understand that customers are not required to go through any hassle to get their sensitive information tokenized; instead, tokenization gives them a sense of security against cyber fraud. Although the concept may be a little tricky for a layman to understand, every country is making maximum efforts to educate its citizens about computers and various blockchain technologies. Tokenization has indeed become a sine qua non in this technologically advanced era.
Blockheadtechnologies.com, https://blockheadtechnologies.com/what-is-a-blockchain-token-is-it-just-cryptocurrency/ (last visited 28 May 2022)
 Analyticsinsight.net, https://www.analyticsinsight.net/the-different-types-of-tokenization-in-blockchain/ (last visited 28 May)
 Heines, Roger; Dick, Christian; Pohle, Christian; and Jung, Reinhard, “The Tokenization of Everything: Towards a Framework for Understanding the Potentials of Tokenized Assets” (2021). PACIS 2021 Proceedings. 40.
 Tokenex.com, https://www.tokenex.com/resource-center/what-is-tokenization (last visited 27 May 2022)
 Imperva.com, https://www.imperva.com/learn/data-security/tokenization/ (last visited 29 May 2022)
 Tokenization Market, https://www.marketsandmarkets.com/Market-Reports/tokenization-market-76652221.html (last visited 6 October 2022)
 Philipp Sandner, Data Tokenization: Morphing The Most Valuable Good Of Our Time Into A Democratized Asset, FORBES (Jul 6, 2021, 05:25 am EDT), https://www.forbes.com/sites/philippsandner/2021/07/06/data-tokenization-morphing-the-most-valuable-good-of-our-time-into-a-democratized-asset/?sh=95690152860c
 Si, C., Zhang, Z., Chen, Y., Qi, F., Wang, X., Liu, Z., Wang, Y., Liu, Q., & Sun, M. (2021). Sub-Character Tokenization for Chinese Pretrained Language Models. arXiv. https://doi.org/10.48550/arXiv.2106.00400
 Ibid p.7