Cryptology is the “science of secrecy”. It comprises cryptography, which aims to protect secrets, and cryptanalysis, which aims to decipher them.
A cryptographic solution is the practical application of cryptography for a specific security purpose, such as authenticating a user on a mobile network, validating a payment, or protecting the integrity of an identity document.
Modern cryptographic solutions consist of computer programs that result from the implementation of one or more cryptographic protocols, which may use different cryptosystems. A cryptographic solution also includes key management, which consists of defining how (static) keys are generated, distributed and managed in the environments in which the solution is deployed (e.g., smart cards, payment terminals, smartphones, servers).
The aim of modern cryptography is to provide specific security features that enable secure communication via an unsecured channel, preventing a third party from reading a message, altering it, or impersonating its sender.
The three main purposes of modern cryptography are therefore:
Confidentiality: the message must be comprehensible only to the recipient. If the content is intercepted, it must remain unintelligible to any third party who does not have the key to decipher it.
Integrity: the message must be protected against any attempt at modification by a third party. The recipient must be able to verify that the message has not been altered during transfer.
Authenticity: the recipient must be able to verify the origin of the message, in other words, be sure that the message actually comes from the person with whom they have established a communication channel, and not from a third party.
Encryption means transforming plaintext (a non-encrypted message) into ciphertext (a message in encrypted form, also called a cryptogram) using a cryptographic algorithm. Decryption (or deciphering) retrieves the plaintext from the ciphertext. This cryptographic function ensures message confidentiality.
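As a minimal illustration of encryption and decryption with a shared key (the algorithm, key size and message below are illustrative choices, not specified in this article), the following sketch uses the third-party pyca/cryptography package:

```python
# A minimal sketch of symmetric encryption for confidentiality, using the
# pyca/cryptography package (not part of the Python standard library).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the shared secret key
nonce = os.urandom(12)                      # must never be reused with the same key

plaintext = b"transaction amount: 42.00 EUR"
ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)   # unintelligible without the key
recovered = AESGCM(key).decrypt(nonce, ciphertext, None)   # only a key holder can do this
assert recovered == plaintext
```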
Hash functions and Message Authentication Codes (MACs) are used to verify the integrity of a message, i.e. that it has not been altered (intentionally or otherwise) during transmission.
The result of a hash function is called a hash. This fingerprint, computed from the message alone, makes it possible to detect any alteration during transmission: if the hash computed on reception differs from the one computed before sending, the message has been modified (provided the hash itself is protected from tampering). Similarly, a MAC, computed from the message and a secret key, is used to verify message integrity. Because a secret key is needed to generate the MAC, it also enables the recipient to authenticate the sender, who is the only other holder of the key previously shared between them. The recipient is therefore certain of the message’s authenticity, in other words, of its origin.
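The sketch below illustrates the difference between a hash and a MAC using only the Python standard library; the message and key values are purely illustrative.

```python
# A minimal sketch of integrity checking with a hash and a MAC (standard library only).
import hashlib, hmac

message = b"pay 100 EUR to the usual account"
secret_key = b"previously shared secret"          # known only to sender and recipient

# Hash: a fingerprint of the message alone; detects accidental or malicious changes,
# provided the hash itself is transmitted or stored securely.
digest = hashlib.sha256(message).hexdigest()

# MAC: a fingerprint of the message AND the secret key; also authenticates the sender,
# since only the other key holder could have produced it.
tag = hmac.new(secret_key, message, hashlib.sha256).digest()

# Recipient side: recompute the MAC and compare in constant time.
assert hmac.compare_digest(tag, hmac.new(secret_key, message, hashlib.sha256).digest())
```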
An electronic signature (or digital signature) not only ensures message integrity and authenticity, but also includes non-repudiation mechanisms.
These non-repudiation mechanisms assure the receiver of the signed message that the sender cannot contest either the sending of the message or its content. An electronic signature is therefore equivalent to a handwritten signature, in the sense that it establishes the responsibility of the signatory. Anyone must be able to verify a signature by applying the verification algorithm, corresponding to the signature algorithm initially applied to the message.
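As a minimal sketch of signing and verification (Ed25519 is used here as one example of a signature scheme; it is not a choice made by this article), using the pyca/cryptography package:

```python
# Signing with a private key; anyone holding the public key can verify.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signing_key = Ed25519PrivateKey.generate()        # private key: held by the signer only
verify_key = signing_key.public_key()             # public key: available to anyone

document = b"I, the undersigned, approve this contract."
signature = signing_key.sign(document)

try:
    verify_key.verify(signature, document)        # checks integrity and origin
    print("valid signature")
except InvalidSignature:
    print("signature or document has been tampered with")
```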
Modern cryptography is based around two main models:
symmetric cryptography, in which the sender and recipient use the same secret key;
asymmetric cryptography, in which two mathematically closely related keys are used: a private key that must remain secret (and is known only to one person) and a public key that can be shared without compromising the security of the cryptosystem.
Symmetric cryptography is the older of the two models and has been used for decades, for military communications, in particular.
In this closed network context, senders and receivers are identified in advance and are relatively few in number, making it possible to share a secret key between them via a secure channel. With this single-key model, both sender and receiver are guardians of the cryptographic system secret.
A symmetric algorithm is deemed secure provided that no cryptanalytic method has been identified that can find the secret key faster than by exhaustively testing all possible keys. Confidence in a symmetric algorithm is therefore established over time. The design of a new algorithm is always based on a combination of different techniques developed by cryptologists to parry previously identified attacks. However, this does not mean that no cryptanalysis will ever be able to break it.
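As a back-of-the-envelope illustration of what “exhaustively testing all possible keys” costs (the attacker speed of 10^12 keys per second is an assumption, not a figure from this article):

```python
# Rough cost of exhaustive key search for different key sizes.
keys_per_second = 1e12             # illustrative assumption for a very powerful attacker
seconds_per_year = 3600 * 24 * 365

for key_bits in (56, 128, 256):    # DES-sized, AES-128 and AES-256 key spaces
    years = (2 ** key_bits) / keys_per_second / seconds_per_year
    print(f"{key_bits}-bit key space exhausted in about {years:.1e} years")
```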
The most well-known and widely used symmetric algorithms include DES, its strengthened variant Triple DES, and AES, which is the current standard.
Asymmetric cryptography is a more recent concept, resulting from the work of Whitfield Diffie and Martin Hellman in the 1970s. It became particularly relevant with the spread of computers and the new uses introduced by the Internet.
In this open network context, where the number of users has multiplied, using a distinct secret key for each pair of users quickly becomes impractical: the number of keys grows with the square of the number of people involved, and a sender and recipient often have no way of exchanging a secret beforehand. Sharing a public key, accessible to any potential sender without risking the security of the cryptographic system, overcomes this difficulty. In this model, the recipient, sole holder of the private key, is the guardian of the cryptographic system’s secret.
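A quick illustration of this scaling (the user counts are arbitrary):

```python
# Per-pair secret keys grow quadratically with the number of users,
# whereas a public-key system needs only one key pair per user.
def pairwise_keys(n_users: int) -> int:
    return n_users * (n_users - 1) // 2   # one shared secret key per pair

for n in (10, 1_000, 1_000_000):
    print(f"{n} users: {pairwise_keys(n)} shared secret keys vs {n} public/private key pairs")
```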
Asymmetric algorithms are based on hard problems such as factoring large integers or calculating discrete logarithms. A mathematical problem is considered hard if it cannot be solved in practice because the amount of computation required would be prohibitive (even though a solution is theoretically possible). Put simply, confidence in an asymmetric algorithm is built up over time, based on the difficulty of solving the underlying mathematical problem.
First-generation asymmetric cryptosystems all use relatively long keys. The most important new features they introduced were key exchange and signature with non-repudiation. The best-known are RSA (encryption and signature), Diffie-Hellman (key exchange) and DSA (signature).
Elliptic curve cryptography, introduced in 1985, enabled a paradigm shift by transposing the discrete logarithm problem to elliptic curves: instead of applying discrete logarithm calculations to integers, the calculations are performed on the points of an elliptic curve. This makes it possible to work with keys significantly shorter than those of first-generation asymmetric cryptosystems while achieving an equivalent level of security, because the volume of calculations required to break an elliptic curve algorithm grows much faster with key size.
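As a rough illustration of this key-length advantage (the commonly cited equivalence of a 256-bit curve with roughly 3072-bit RSA is an order of magnitude, not a figure from this article), the sketch below generates both key types with the pyca/cryptography package:

```python
# Generate an elliptic curve key pair and an RSA key pair of comparable strength.
from cryptography.hazmat.primitives.asymmetric import ec, rsa

ec_key = ec.generate_private_key(ec.SECP256R1())                          # 256-bit curve
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)  # ~equivalent RSA

print("EC key size :", ec_key.curve.key_size, "bits")
print("RSA key size:", rsa_key.key_size, "bits")
```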
This advantage of elliptic curve cryptography makes it very valuable for performing cryptographic computations in a resource-constrained environment such as secure elements that can be found on smart cards or embedded in various devices. Asymmetric cryptosystems based on elliptic curves have numerous applications in embedded cryptography. They are used for IMSI encryption in 5G SIMs, as well as in passports for mutual authentication between the chip and the reader.
Most asymmetric elliptic curve cryptosystems are the legacy of first-generation discrete logarithm cryptosystems, for example ECDH, the elliptic curve counterpart of the Diffie-Hellman key exchange, and ECDSA, the elliptic curve counterpart of the DSA signature algorithm.
A Public Key Infrastructure (PKI) is a system in which a certification authority certifies the ownership of a public key. This certification, performed by a trusted third party, prevents the distribution of false public keys. Thanks to the exchange of digital certificates issued and signed by the certification authority, users can be sure they are using the public key of the entity with which they wish to communicate, and not that of an impostor posing as that entity. The entity also has the assurance of dealing with a legitimate user. In other words, a PKI establishes the trust between two parties that is necessary for initiating secure communication based on an asymmetric key system.
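The sketch below is a deliberately simplified model of what the certification authority does, namely signing the binding between an identity and a public key; real PKIs use X.509 certificates and certificate chains, and the names here are illustrative.

```python
# Simplified "certificate": identity + public key, signed by a certification authority.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives import serialization

ca_key = Ed25519PrivateKey.generate()                     # certification authority
user_key = Ed25519PrivateKey.generate()                   # end entity (e.g. a chip)

user_pub_bytes = user_key.public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)
to_be_signed = b"subject=card-1234;" + user_pub_bytes     # identity bound to its public key
certificate = (to_be_signed, ca_key.sign(to_be_signed))   # "certificate" = data + CA signature

# Any relying party holding the CA's public key can check the binding:
data, ca_signature = certificate
ca_key.public_key().verify(ca_signature, data)            # raises InvalidSignature if forged
```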
In the banking and identity solutions sectors, the public keys of the component (the chip in a payment card or passport, for example) are signed by an authority to guarantee that the component is genuine. This certificate is required when using the product to verify that it is valid and recognized by the issuing authority.
In short, symmetric cryptography relies on two entities sharing the same secret to communicate securely, while asymmetric cryptography allows anyone to encrypt a message that only one entity will subsequently be able to decrypt.
In practice, secret key and public key cryptography are often combined, with asymmetric cryptography enabling a secret key to be established via an unsecured channel, which is then used to protect the content of an exchange using symmetric encryption. This switch from the public key to the secret key can be made using different methods: using a cryptosystem dedicated to key exchange, such as Diffie-Hellman, or using a key establishment protocol based on RSA encryption (the secret key is generated, encrypted with the public key and then sent). For example, in the 5G protocol, symmetric keys are exchanged using an ECDH asymmetric cryptosystem and are then used to encrypt the IMSI (the unique identifier of a user on the mobile network) using an AES cryptosystem.
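A minimal sketch of this combination, using the pyca/cryptography package: an ECDH exchange establishes a shared secret, a key derivation function turns it into a symmetric key, and AES then protects the actual payload. The curve, labels and payload are illustrative choices.

```python
# Combining asymmetric and symmetric cryptography: ECDH to agree on a secret
# over an insecure channel, then AES-GCM to encrypt the payload.
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party generates an EC key pair and publishes the public part.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# Both sides compute the same shared secret from their own private key and the peer's public key.
alice_shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_shared = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_shared == bob_shared

# Derive a symmetric session key from the shared secret, then use it with AES-GCM.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"illustrative session key").derive(alice_shared)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"user identifier", None)
```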
One of the main reasons for the combined use of asymmetric and symmetric cryptography is that public key encryption algorithms (used for asymmetric encryption) take much longer to execute, and are therefore less efficient than secret key algorithms (used for symmetric encryption), mainly due to the complexity of the operations tackled by asymmetric cryptography.
One of the core principles of modern cryptography, “Kerckhoffs’ principle”, states that the security of a cryptographic system must not depend on the secrecy of the cryptographic method, but on the secrecy of the key. From this principle stems the crucial importance of keeping the key in a secure place (from creation to use), or failing that, of preventing it from being exposed and used in the clear in the absence of a secure element to protect it.
Loading (or generating) keys in the cryptographic solution target environment is the first step to be performed securely to ensure that keys are protected, even before securing their storage and use. This is where key management comes into play when personalizing end products (whether on a production site or remotely): where do the first user-specific key(s) come from to secure the entire cryptographic solution? Are they “loaded” or “generated” in the location where they will be used? How do they transit securely via a server if the implementation target is a cell phone? Key management is a crucial step and needs to be ultra-secure when providing critical products and services related to identity, payments and telecommunications. That is why it also requires the use of cryptography and dedicated security protocols.
Once the keys have been loaded (or generated), their protection in the target environment (for storage and use) will depend on whether it is a secure element or a non-secure environment.
To avoid exposing a secret key, smart cards are equipped with a microprocessor that enables them to perform certain cryptographic operations themselves, and incorporate physical countermeasures to prevent physical and logical intrusions. This type of implementation, based on a secure element, enables high-security embedded cryptography.
Embedding cryptography in a secure element solves the problem of key storage but involves certain constraints when designing a cryptographic solution. These are linked to the chip’s limited memory and computing power, the execution time and energy budget of cryptographic operations, and the countermeasures required against physical attacks.
Cryptographic solutions embedded in identity, payment and mobile connectivity products are high-security embedded solutions, both from the hardware standpoint (with ultra-secure hardware chips, including physical countermeasures that prevent them from being opened or read), and from the software standpoint (with additional mathematical countermeasures against physical attacks when implementing the cryptographic solution). These solutions are evaluated by independent laboratories, and must be certified before they can be distributed on the market (Common Criteria, FIPS, etc.).
Careful implementation of countermeasures at the hardware and software layers of the secure element is crucial to protect the key, while limiting memory consumption and cryptographic operation execution time.
The performance of a high-security embedded cryptographic solution therefore goes hand in hand with the use (or even design) of a chip with enhanced hardware security and optimized performance (typically with accelerators dedicated to cryptographic operations).
The digitization of payment, identity and other sensitive services drives the deployment of applications containing identification information and personal data on a wide range of devices, including smartphones and other connected devices in the IoT. In cases where using a secure element to protect application keys and data is not an option (due to the lack of a secure element, the absence of standardization between devices, or sovereignty issues), software security is the best approach to avoiding friction in terms of deployment and user experience.
However, this presents certain challenges. An unprotected device, which an attacker can disassemble or whose memory can be read to access the code, is a highly exposed environment for software, in other words a “white box” in which the attacker can see everything and has considerable resources at their disposal.
Solutions commonly used for less sensitive applications, such as device security features or “code obfuscation” tools, are not sufficient to protect the keys and data of sensitive applications. Differential analysis attacks can crack such protection in a matter of minutes.
This is where white-box cryptography comes in. It ensures that the key protecting the data is neither present nor manipulated in cleartext on the device. It involves hiding the key in the application code, using specific mathematical techniques when implementing the cryptographic solution, while maintaining its original functions.
However, white-box cryptography remains a compromise in terms of security and performance compared to what can be offered by a secure element specifically designed to protect sensitive services. To limit the impact in the event of a security breach, it is therefore advisable to combine a “white-box” cryptographic solution with reinforced security protocols (key expiration mechanisms, regular application updates, etc.).
There is a significant difference between theoretical knowledge of cryptography and having the expertise to use it to secure sensitive data and services in different environments.
The RSA algorithm may be taught to 16-year-old students, but not just anyone can design cryptographic solutions capable of dealing with the various types of attacks. Cryptographic expertise in protecting sensitive data and services in the banking, telecoms or identity sectors goes far beyond theoretical knowledge of algorithms and their functionalities. In particular, it involves developing protocols adapted to the specifications defined within each sector. This requires a thorough analysis of the needs and constraints of the use case: what are the security objectives? In which environment will the cryptographic solution be executed? What certification criteria must be met? Who is the potential attacker (and what means will they be willing to use, depending on the value of what the solution protects), etc.?
To ensure the robustness of cryptosystems, some cryptographic experts occasionally put themselves in the attacker’s shoes: they perform cryptanalysis to test the theoretical resistance of cryptosystems. Submitting new cryptographic algorithms to standardization bodies systematically involves this kind of cryptanalysis. This is how the cryptographic community builds confidence in new cryptosystems.
Thus, the experts who develop and implement cryptographic solutions can rely on standardized cryptosystems so long as no cryptanalysis or major technological development has challenged them. While certain experts focus on defining security protocols and ensuring that they are theoretically flawless, others intervene at implementation level.
In the field of high-security embedded cryptography, some of these experts carry out security tests on end products, even going so far as to invent new types of physical attack, in order to propose countermeasures to stay one step ahead of seasoned attackers with substantial resources at their disposal. These countermeasures must be effective against potential attacks, without excessively increasing execution time of the cryptographic solution and negatively impacting the user experience. They must also be frugal, as over-intensive use of memory or excessive energy consumption could impact the correct functioning or lifespan of the secure element.
To anticipate new threats and meet growing security challenges, cryptographic experts must keep up-to-date with the latest theoretical and technological advances. That’s why, as well as applying their know-how to concrete use cases in their sector of activity, they play an active role in academic research.
IDEMIA experts give both one-off and regular lectures at leading French universities (Bordeaux, Paris VI, Lyon) and supervise doctoral theses. They also publish peer-reviewed research articles at conferences and in specialized scientific journals such as Transactions on Cryptographic Hardware and Embedded Systems (TCHES) and are themselves invited to be on the program committees of these conferences and specialized journals to review their peers’ publications.
In the face of ever-increasing numbers of large-scale attacks against data centers and critical infrastructures, cybersecurity is making the headlines. It has become a major focus for governments and businesses worldwide. Although not as well known to the general public as cybersecurity, cryptography is one of its essential components.
Cryptography is at work at the heart of “secure elements” such as microchips, which are now widely used in our phones, bank cards, identity documents and some of our cars, as well as in an exponential number of connected devices. The cybersecurity of many systems is therefore often based on that of the chip, a secure enclave that protects cryptographic keys and enables calculations to be carried out in a controlled environment.
Since the 1990s, the use of secure chips has become widespread, particularly in the fields of payments, identity, mobile communications and the Internet of Things (IoT), but also in access control, transport and pay TV. The use of embedded cryptography – or more precisely high-security embedded cryptography – to encrypt sensitive data and communications, authenticate the various parties involved in an exchange or transaction, sign electronic documents and protect our identities, has become part of our everyday lives.
In the payments sector, asymmetric cryptography is used for interoperability reasons, as it is not possible to store the secret keys of all cards on all terminals. Authentication of the payment card at the terminal is carried out by means of a challenge-response protocol, during which the card sends its certificate (i.e., its public key signed by the bank) and the bank’s certificate, enabling the terminal to verify that the card is indeed a valid card issued by the bank in question. This step is essential if the terminal is offline, to authorize a transaction that will only be transmitted once the terminal has regained access to the network. If the terminal is online, this authentication step is not essential, as the card communicates directly with the banking network to perform the transaction.
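The sketch below illustrates only the principle of such a challenge-response; real EMV offline data authentication relies on certificate chains and on the RSA or elliptic curve schemes defined in the EMV specifications, and the key names here are illustrative.

```python
# Simplified card-to-terminal authentication: the terminal sends a random challenge,
# the card signs it with its private key, and the terminal verifies the signature
# using the card's certified public key. Not the actual EMV protocol.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

card_key = Ed25519PrivateKey.generate()           # personalized into the chip
card_public_key = card_key.public_key()           # carried in the card's certificate

challenge = os.urandom(16)                        # fresh randomness from the terminal
response = card_key.sign(challenge)               # computed inside the secure element

card_public_key.verify(response, challenge)       # terminal accepts the card if this succeeds
```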
To encrypt and authenticate the transaction, the payment card uses the secret key loaded onto the chip during personalization. This is symmetric cryptography.
In the field of cellular connectivity, be it for mobile or IoT communications, symmetric cryptography is traditionally used by mobile operators to ensure that the SIM (removable SIM, eSIM or iSIM) attempting to connect to their network is authentic, in other words that it has not been counterfeited. In concrete terms, the SIM provides the network with its identifier, the IMSI (International Mobile Subscriber Identity), and the network verifies that it holds the secret key corresponding to this identifier. Once this authentication has been performed, the SIM uses its secret (static) key to generate session (ephemeral) keys, which enable the phone to encrypt communications on the network.
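A simplified sketch of this symmetric authentication and session key derivation follows; the real 3GPP procedures (based on algorithm sets such as MILENAGE) are more elaborate, and the key labels below are illustrative.

```python
# Simplified SIM/network authentication from a shared long-term key, plus
# derivation of an ephemeral session key. Illustrative only.
import os, hmac, hashlib

K = os.urandom(32)           # long-term secret key, shared by the SIM and the operator

def sim_response(challenge: bytes) -> bytes:
    # Proves possession of K without ever revealing it.
    return hmac.new(K, b"auth" + challenge, hashlib.sha256).digest()

def derive_session_key(challenge: bytes) -> bytes:
    # Ephemeral key used to encrypt traffic for this session only.
    return hmac.new(K, b"session" + challenge, hashlib.sha256).digest()

challenge = os.urandom(16)                         # sent by the network
network_expected = hmac.new(K, b"auth" + challenge, hashlib.sha256).digest()
assert hmac.compare_digest(sim_response(challenge), network_expected)
session_key = derive_session_key(challenge)
```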
More recently, a new use of asymmetric cryptography has been introduced into 5G SIMs to enhance the privacy protection of mobile and IoT device users by encrypting their IMSI. The SIM encrypts its IMSI using the network’s public key before transmitting it for authentication, thus protecting it against any interception attempts that could compromise users’ anonymity and enable them to be traced. Recently, this functionality has been strengthened further (as part of a proof of concept) with the use of one of the first post-quantum algorithms, thus protecting 5G SIMs against attackers collecting encrypted IMSIs in the hope of decrypting them later.
In the field of identity documents (passports or other identity documents containing a chip), one of the challenges is to ensure that only people with physical access to the document, and in particular to the MRZ (machine-readable zone) on the data page, can read the data contained on the document’s chip, thus preventing the chip from being read without the bearer’s knowledge. Once this precaution has been taken and mutual authentication has been performed between the document and the terminal, a secure channel is established and encrypted communications can take place with the document chip, for example to check the validity of a document and authenticate the document holder based on their biometrics using automatic border control equipment.
To return more generally to the cybersecurity challenges associated with essential payment, connectivity and identity services, it is also important to emphasize that the cryptographic expertise required does not stop at the development and implementation of embedded cryptographic solutions within secure elements (or their white-box equivalent), although this is a fundamental basis for protecting these critical services.
To ensure end-to-end security, cryptographic experts are also much sought after to design solutions that manage or interact with secure elements: OTA solutions for updating SIM parameters, eSIM management solutions for remote activation of mobile subscriptions, tokenization solutions for activating digital payment cards, solutions for issuing digital car keys, and so on.
Post-quantum cryptography is a new generation of cryptography aimed at making cryptographic systems resistant to the phenomenal computing power of quantum computers. It is one of the most dynamic fields of research in cryptography today.
Quantum physics and its unique properties, such as superposition of states and entanglement, open up a whole new realm of possibilities for securing communications. The concept of “quantum cryptography” refers to this new form of cryptography, which harnesses the properties of quantum physics, exchanging photons to deduce a shared secret.
“Post-quantum cryptography”, on the other hand, is a development of current cryptography designed to parry attacks that use the power of quantum computing. It is intended for the computers and smart cards we already use every day.
Although quantum computers have not yet reached maturity, research into post-quantum cryptography has been booming, fueled by increased investment in the development of quantum computers. The very first seeds of this research field were planted in 1994 with the publication of Shor’s algorithm. Given sufficient quantum computing power, Shor’s algorithm would take just a few days (or less) to solve the hard mathematical problems on which asymmetric cryptography has been based up to now, challenging the security of solutions built on it. Post-quantum asymmetric cryptosystems must therefore be based on fundamentally different mathematical problems.
With the emergence of quantum computers, symmetric cryptography faces faster brute-force attacks (Grover’s algorithm provides a quadratic speed-up), but these can easily be countered by increasing the size of the keys used so as to maintain an equivalent level of security.
The question is not whether, or even when, quantum computing will one day be able to crack today’s asymmetric algorithms, but rather if this is a threat we are willing to live with. And the answer is no. For sensitive information whose confidentiality and integrity must be preserved for years, if not decades, such as government secrets, passport data or bank details, the risk is very real. Such data, although encrypted and impossible to use at present, if intercepted now could be deciphered very easily once attackers gain access to quantum computing power.
As no one can predict with certainty when potential attackers will actually have access to the power of quantum computing, it is essential to protect sensitive communications and information against it as of now, while taking into account the fact that post-quantum algorithms are still in their infancy. This is where hybrid cryptography can already prepare for the transition from traditional asymmetric cryptography to post-quantum cryptography.
Since trust in post-quantum algorithms can only be established over time, hybrid cryptography is the approach currently recommended by the main security standard setters, such as the ANSSI for European identity products that must comply with Common Criteria, or the NIST for American identity products that must comply with FIPS (Federal Information Processing Standards). This is also the approach favored by the GSMA in its standardization work for the telecoms sector.
Hybrid cryptography combines traditional and post-quantum asymmetric cryptographic techniques until post-quantum cryptosystems have matured sufficiently to be used on their own. Thus, a hybrid encryption protocol will perform “dual encryption” so to speak, using both a pre-quantum algorithm (already in use today) and a post-quantum algorithm, so that even if a potential attacker were to break the post-quantum encryption layer, the data would still be protected by pre-quantum cryptography.
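One common way to hybridize key establishment is sketched below: the final key is derived from both a classical ECDH secret and a post-quantum KEM secret, so that breaking either layer alone is not enough. The post-quantum step is shown as a placeholder because library support is still evolving; `pq_kem_encapsulate` is a hypothetical name, not a real API.

```python
# Hybrid (classical + post-quantum) key establishment sketch. The ECDH part uses
# the pyca/cryptography package; the post-quantum KEM is a hypothetical placeholder
# standing in for a real implementation (e.g. an ML-KEM library of your choice).
import os
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def pq_kem_encapsulate(pq_public_key: bytes) -> tuple[bytes, bytes]:
    # HYPOTHETICAL: a real solution would call a post-quantum KEM here
    # (encapsulation returns a ciphertext and a shared secret).
    return os.urandom(1088), os.urandom(32)

# Classical layer: ECDH shared secret.
ours = ec.generate_private_key(ec.SECP256R1())
peer = ec.generate_private_key(ec.SECP256R1())          # stands in for the other party
classical_secret = ours.exchange(ec.ECDH(), peer.public_key())

# Post-quantum layer: KEM shared secret (placeholder).
_ciphertext, pq_secret = pq_kem_encapsulate(b"peer post-quantum public key")

# The final key depends on BOTH secrets: an attacker must break both layers.
hybrid_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                  info=b"hybrid key").derive(classical_secret + pq_secret)
```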
Hybrid cryptography will therefore remain useful for as long as post-quantum cryptosystems are still maturing and quantum computing power is not widely available to challenge traditional asymmetric cryptosystems.
With the first efforts to standardize post-quantum algorithms under the impetus of the NIST, and the selection of the first four candidate algorithms for standardization in 2022, a first step has been taken, but research continues. In this context, crypto-agility is paramount.
Crypto-agility means being able to change all or part of a cryptographic solution already deployed in the field, typically to replace the first post-quantum algorithms that may prove insufficiently secure. While in a purely software environment – with no particular memory constraints – this kind of development will not present any real difficulties, this type of upgrade will prove far trickier in an environment with a high level of constraints, such as the secure element of a smart card.
Crypto-agility will rely on existing solutions to keep the operating systems of secure elements up to date and to personalize solutions out in the field, but changing all or part of a cryptographic solution is a major operation when compared to an ordinary bug fix. These solutions will therefore have to evolve to incorporate the latest security features required to protect these particularly critical operations.
Product architecture will also need to be as flexible as possible, because a change of algorithm not only means changing the key and the associated certificates, but can also have impacts linked to the size of the algorithm (post-quantum keys and signatures are typically larger), not to mention the possible need to adapt protocols, all of which may lead to greater RAM consumption on the chip.
Depending on the type of environment, the cryptographic solution may be updated remotely (directly via the mobile network in a SIM, eSIM or iSIM, for example), or may require a connected device to establish a secure connection with the chip (payment terminal, NFC-compatible smartphone, dedicated ID readers). The procedure will be defined by the issuing authorities, standardization bodies or industry players within each sector.
With post-quantum algorithms expected to take at least another decade to mature, crypto-agility has a bright future.
Backed by decades of expertise in cryptography and pioneering research in anticipating the post-quantum era, IDEMIA is constantly building new defenses to protect the sensitive information of ordinary citizens and consumers more effectively.
Billions of people around the world use our cryptographic technologies on a daily basis, in a variety of forms: contact and contactless payment cards, SIMs and eSIMs for consumer and IoT applications, electronic identity cards and passports, transit cards, secure access cards for businesses, NFC key cards for cars, biometric cards to protect cryptocurrency wallet keys, etc.
All of these secure components are the fruit of our complementary expertise in high-security embedded cryptography, chip design and secure OS development.
In addition to cryptography embedded in secure physical components, our experts are also tackling the challenges of using cryptography in unsecured environments (or environments in which the security component is not deemed usable for sovereignty-related reasons). This is the case for many solutions deployed on smartphones (mobile payments, mobile identity, digital car keys), which require a variety of cryptographic implementation approaches to meet our customers’ different cybersecurity challenges.
Our cryptographic expertise also underpins our services and solutions: banking personalization services for issuing physical and digital cards, remote updates of SIM parameters or cloud-based eSIM subscription activation, security of personal data in identity management platforms… the list is long!
The expertise IDEMIA has built up over the decades in securing connectivity, payment and identity services in the physical world has already proved of huge benefit in helping our customers with their digital transition. In the post-quantum era, this expertise is now more than ever a platform for enabling our customers to deploy highly secure solutions, even going beyond the chip, to offer digital services that meet current and future challenges to both consumers and citizens.