
Cryptography Concepts & Techniques, Algorithms, Types, and History - 22nd Edition

Cryptography

Cryptography is a method of protecting information and communications through the use of codes, so that only those for whom the information is intended can read and process it.

In computer science, the term “cryptography” refers to secure information and communication techniques derived from mathematical concepts and rule-based algorithms that transform messages in ways that are hard to decipher. These algorithms are used for cryptographic key generation and digital signing, for verification to protect data privacy while browsing the internet, and for confidential communications such as email and credit card transactions.

Cryptography techniques

Cryptography is closely related to the disciplines of cryptology and cryptanalysis. It includes techniques such as microdots, merging words with images, and other ways to hide information in storage or transit. In today’s computer-centric world, however, cryptography is most often associated with scrambling plaintext (ordinary text, sometimes called cleartext) into ciphertext (a process called encryption), then back again (known as decryption). Individuals who practice this field are known as cryptographers.

Modern cryptography is concerned with four goals:

1. Confidentiality. The information cannot be understood by anyone for whom it was not intended.

2. Integrity. The information cannot be altered in storage or in transit between the sender and the intended receiver without the alteration being detected.

3. Non-repudiation. The creator/sender of the information cannot later deny their intentions in creating or transmitting it.

4. Authentication. The sender and receiver can confirm each other’s identity and the origin/destination of the information.
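As a concrete, hypothetical illustration of the integrity and authentication goals, the sketch below uses Python’s standard hmac and hashlib modules to attach a keyed digest to a message; the key and message values are placeholders, not part of the original text.

import hashlib
import hmac
import secrets

# A secret key shared only by sender and receiver (placeholder value).
shared_key = secrets.token_bytes(32)

def tag_message(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag the receiver can check."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify_message(key: bytes, message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time (integrity + authentication)."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

message = b"transfer 100 units to account 42"
tag = tag_message(shared_key, message)

print(verify_message(shared_key, message, tag))         # True: message is intact
print(verify_message(shared_key, message + b"0", tag))  # False: altered in transit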

Procedures and protocols that satisfy one or more of the above criteria are known as cryptosystems. Cryptosystems are often thought of as referring only to algorithms and computer programs; however, they also cover the regulation of human behavior, such as choosing hard-to-guess passwords, logging off unused systems, and not discussing sensitive information with outsiders.

Cryptography is the process of encrypting and decrypting information.

Cryptographic algorithms


Cryptosystems use a set of procedures known as cryptographic algorithms, or ciphers, to encrypt and decrypt messages and secure communications among computer systems, devices, and applications.

A cipher suite uses one algorithm for encryption, another for message authentication, and a third for key exchange. This process, embedded in protocols and written into software that runs on operating systems (OSes) and networked computer systems, involves:

● public and private key generation for data encryption/decryption

● digital signing and verification for message authentication

● key exchange
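As a rough illustration of cipher suites in practice, the following sketch uses Python’s standard ssl module to list a few of the TLS cipher suites offered by the default context; each suite name combines a key exchange method, an encryption algorithm and a message authentication scheme.

import ssl

# Build a default TLS context and inspect the cipher suites it offers.
context = ssl.create_default_context()

for cipher in context.get_ciphers()[:5]:
    # Each entry describes one suite, e.g. key exchange + encryption + MAC/AEAD.
    print(cipher["name"], "-", cipher["protocol"])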

Different types of cryptography

Single-key or symmetric-key encryption algorithms create a fixed length of bits known as a block cipher with a secret key that the creator/sender uses to encipher data (encryption) and the receiver uses to decipher it. One example of symmetric-key cryptography is the Advanced Encryption Standard (AES). AES is a specification established in November 2001 by the National Institute of Standards and Technology (NIST) as a Federal Information Processing Standard (FIPS 197) to protect sensitive information. The standard is mandated by the U.S. government and widely used in the private sector.

In June 2003, the U.S. government approved AES for classified information. It is a royalty-free specification implemented in software and hardware worldwide. AES is the successor to the Data Encryption Standard (DES) and Triple DES (3DES). It uses longer key lengths (128-bit, 192-bit or 256-bit) to prevent brute force and other attacks.
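Assuming the third-party pyca/cryptography package is installed (an assumption, not something the article specifies), a minimal sketch of symmetric AES encryption and decryption with a 256-bit key might look like this:

import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Generate a 256-bit secret key shared by the sender and the receiver.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

# A fresh 96-bit nonce must be used for every message.
nonce = os.urandom(12)

ciphertext = aesgcm.encrypt(nonce, b"sensitive data", None)  # sender encrypts
plaintext = aesgcm.decrypt(nonce, ciphertext, None)          # receiver decrypts

assert plaintext == b"sensitive data"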

Symmetric cryptography uses a single key, while asymmetric cryptography uses a key pair to encrypt and decrypt data.

Asymmetric-key encryption algorithms, also known as public-key encryption, use a pair of keys: a public key associated with the creator/sender for encrypting messages, and a private key that only the originator knows (unless it is exposed or they decide to share it) for decrypting that information.

Examples of public-key cryptography include:

● RSA, used widely on the internet

● Elliptic Curve Digital Signature Algorithm (ECDSA), used by Bitcoin

● Digital Signature Algorithm (DSA), adopted as a Federal Information Processing Standard for digital signatures by NIST in FIPS 186-4

● Diffie-Hellman key exchange
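To make the public/private split concrete, here is a hedged sketch (again assuming the pyca/cryptography package) in which a message is encrypted with the public key and can only be decrypted with the matching private key:

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The key holder publishes the public key and keeps the private key secret.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone with the public key can encrypt a message for the key holder.
ciphertext = public_key.encrypt(b"meet at noon", oaep)

# Only the private key can recover the plaintext.
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"meet at noon"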

To maintain data integrity in cryptography, hash functions, which return a deterministic output from an input value, are used to map data to a fixed size. Types of cryptographic hash functions include SHA-1 (Secure Hash Algorithm 1), SHA-2 and SHA-3.
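The sketch below, using Python’s standard hashlib module, shows how these hash families map arbitrary input to a fixed-size digest and how any change to the input produces a completely different digest:

import hashlib

data = b"The quick brown fox jumps over the lazy dog"

# Fixed-size digests regardless of input length.
print(hashlib.sha1(data).hexdigest())      # 160-bit SHA-1 digest (legacy)
print(hashlib.sha256(data).hexdigest())    # 256-bit digest from the SHA-2 family
print(hashlib.sha3_256(data).hexdigest())  # 256-bit digest from the SHA-3 family

# A one-character change yields an entirely different digest.
print(hashlib.sha256(data + b"!").hexdigest())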

Security concerns with cryptography

Attackers can bypass cryptography, hack into computers that are responsible for data encryption and decryption, and exploit weak implementations, such as the use of default keys. However, cryptography makes it harder for attackers to access messages and data protected by encryption algorithms.

Growing concern that the processing power of quantum computing could break current cryptographic encryption standards led NIST to put out a call for papers to the mathematical and scientific community in 2016 for new public-key cryptography standards.

Unlike today’s computers, quantum computing uses quantum bits (qubits) that can represent both 1s and 0s and therefore perform two calculations at once. While a large-scale quantum computer is unlikely to be built in the near future, the existing infrastructure requires the standardization of publicly known and well-understood algorithms that offer a secure approach, according to NIST. The deadline for submissions was November 2017, and analysis of the proposals was expected to take several years.

The history of cryptography

The word “cryptography” is derived from the Greek kryptos, meaning hidden.

The prefix “crypt-” means “hidden” or “vault,” and the suffix “-graphy” stands for “writing.”

The origins of cryptography are usually dated to around 2100 B.C., with the Egyptian practice of hieroglyphics: intricate pictograms whose full meaning was known only to a select few.

The first recorded use of a modern cipher was by Julius Caesar (100 B.C. to 44 B.C.), who did not trust his messengers when communicating with his governors and officers. For this reason, he created a system in which each character in his messages was replaced by a character three positions ahead of it in the Roman alphabet.
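A minimal sketch of Caesar’s substitution scheme in Python (using the modern 26-letter Latin alphabet rather than the Roman one he actually used):

def caesar(text: str, shift: int = 3) -> str:
    """Shift each letter 'shift' places forward in the alphabet, wrapping around at Z."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord("A") if ch.isupper() else ord("a")
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

message = "ATTACK AT DAWN"
encrypted = caesar(message, 3)     # "DWWDFN DW GDZQ"
decrypted = caesar(encrypted, -3)  # back to "ATTACK AT DAWN"
print(encrypted, "|", decrypted)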

In recent times, cryptography has become a battleground for some of the world’s best mathematicians and computer scientists. The ability to securely store and transfer sensitive information has proved critical to success in both war and business.

Because governments do not want certain entities in and out of their countries to have access to ways of receiving and sending hidden information that may pose a threat to national interests, cryptography has been subject to various restrictions in many countries, ranging from limitations on the usage and export of software to the public dissemination of mathematical concepts that could be used to develop cryptosystems.

The internet, however, has allowed the spread of powerful programs and, more importantly, the underlying techniques of cryptography, so that today many of the most advanced cryptosystems and ideas are in the public domain.
