
What Is Digital Coding?

March 31, 2015
By: Steve Johnson

Digital coding is the process of using binary digits to represent letters, numbers and other symbols in a digital format. Several types of digital codes are widely used today, but all of them rely on the same principle: combining binary digits to represent a character.

You might not see it, but digital coding is everywhere.

Digital and Binary Coding

Computers and electronic devices need a systematic and precise scheme to read information. This scheme requires that each character, letter or symbol be unique and easily distinguishable from every other character. Digital coding provides this: letters and symbols are represented by specific sets of binary digits. For example, the binary pattern 01000001 represents the character "A." Binary code is not itself a specific digital coding technique, but it offers the simplest way to understand how digital coding works.
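The mapping described above can be sketched in a few lines of Python, using the standard ASCII interpretation of an 8-bit pattern:

```python
# Convert an 8-bit binary string into the character it encodes
# (interpreting the bits as an ASCII code).
def binary_to_char(bits):
    code = int(bits, 2)  # parse the string as a base-2 number
    return chr(code)     # map the numeric code to its character

print(binary_to_char("01000001"))  # prints "A"
```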

Basic Coding Technique

As the term suggests, digital coding turns information into digits easily recognizable by computers and other electronic devices. These digits are sets of information divided into very small pieces known as bits. A bit -- short for binary digit -- is the smallest unit of information. The most common digital coding techniques use around 8 to 16 bits per character, which means each character is represented by at least eight binary digits arranged in a distinct sequence.

Commonly Used Digital Codes

There are several types of digital codes used in computers today, but three of the most widely used are the American Standard Code for Information Interchange (ASCII), the Extended Binary Coded Decimal Interchange Code (EBCDIC) and Unicode. ASCII contains 128 different codes that represent English letters, symbols and numbers. For example, the letter "M" has the ASCII code 77, often written "077." EBCDIC and Unicode follow the same coding principle, but each assigns a different set of codes to the characters.
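Python's built-in `ord` and `chr` functions expose exactly this character-to-code mapping, so the ASCII example from the text can be checked directly:

```python
# Look up a character's code, and decode a code back to its character.
print(ord("M"))  # prints 77, the ASCII code cited in the text as "077"
print(chr(77))   # prints "M"
```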

Alphanumeric Coding

The most common practice in writing digital codes uses alphanumeric characters. Alphanumeric coding combines letters and numbers to create a specific representation of a character. For example, the code "U+0041," which represents "A" in Unicode, contains letters, numbers and the "+" symbol.
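A "U+0041"-style label is just the character's code written in hexadecimal after a "U+" prefix. A minimal Python sketch of decoding such a label (the function name `from_codepoint` is our own, for illustration):

```python
# Decode a Unicode code-point label like "U+0041" into its character.
def from_codepoint(label):
    hex_digits = label[2:]          # strip the "U+" prefix
    return chr(int(hex_digits, 16)) # parse hex, then map code to character

print(from_codepoint("U+0041"))  # prints "A"
```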

© 2017 Leaf Group Ltd. Leaf Group Media.