Computer Science
Grade 7
20 min
4. Representing Text: ASCII and Unicode
Understand how text characters are represented using ASCII and Unicode encoding schemes.
Tutorial Preview
1. Introduction & Learning Objectives
Learning Objectives
Explain why computers need a standardized system to represent text characters.
Define ASCII and describe its primary limitation.
Define Unicode and explain why it was created as a successor to ASCII.
Convert a simple, short word from English text into its 8-bit ASCII binary representation using a reference table.
Compare the character capacity of 7-bit ASCII, 8-bit ASCII, and Unicode.
Identify examples of characters that require Unicode, such as emojis and non-English letters.
Ever wonder how your computer understands a text message with emojis like 😂 or words in different languages like 'Hola'? 🤔
Computers only understand numbers, specifically binary (0s and 1s). This lesson explores how we turn letters, numbers, and symbols into a binary format that a computer can store and process.
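The whole idea can be sketched in a few lines of Python: every character maps to a number, and that number maps to an 8-bit binary string. (Python's built-in `ord` returns the same code points an ASCII table lists.)

```python
# Sketch: encode a short English word as 8-bit ASCII binary.
word = "Hi"
codes = [ord(ch) for ch in word]          # each character -> its decimal code point
bits = [format(c, "08b") for c in codes]  # each code point -> an 8-bit binary string

print(codes)  # [72, 105]
print(bits)   # ['01001000', '01101001']
```

This is exactly the conversion practiced later in the lesson, just automated.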
2. Key Concepts & Vocabulary
| Term | Definition | Example |
| --- | --- | --- |
| Character Encoding | A system that matches each character (like 'A', '?', or '🚀') with a unique number so a computer can store and process it. | In the ASCII encoding system, the character 'A' is matched with the number 65. |
| Binary | The number system computers use, which has only two digits: 0 and 1. Each digit is called a bit. | The decimal number 10 is written as 1010 in binary. |
| ASCII | An early character encoding standard. Standard ASCII uses 7 bits (128 characters); extended 8-bit ASCII represents 256 characters, mostly English letters, numbers, and common symbols. | The letter 'B' is represented by the number 66, which is 01000010 in 8-bit ASCII. |
| Limitation of ASCII | Because ASCII has only 256 possible codes, it cannot represent characters from most other languages (like Chinese, Arabic, or accented letters) or modern symbols like emojis. | The character 'ç' in the name "François" has no code in standard ASCII. |
3. Core Syntax & Patterns
Character to Number Mapping
Character → Decimal Code Point
The first step in encoding is to look up the character in a standard table (like an ASCII table) to find its unique decimal number. Every character, including spaces and punctuation, has a number.
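In Python, this table lookup is built in: the `ord` function returns a character's decimal code point, matching the standard ASCII table.

```python
# Character -> decimal code point (mirrors looking it up in an ASCII table).
print(ord("A"))  # 65
print(ord(" "))  # 32 -- even the space character has a code
print(ord("?"))  # 63 -- punctuation does too
```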
Decimal to Binary Conversion
Decimal Code Point → Binary String
After finding the character's number, the computer converts that number into its binary (base-2) equivalent. For 8-bit ASCII, the binary number must be padded with leading zeros to be exactly 8 digits long.
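This step, including the leading-zero padding, can be done in one line with Python's `format` function, where `"08b"` means "binary, padded with zeros to 8 digits":

```python
# Decimal code point -> 8-bit binary string, padded with leading zeros.
code = ord("B")               # 66
binary = format(code, "08b")  # '01000010' -- exactly 8 digits
print(binary)
```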
Encoding Capacity
Capacity = 2^n (where n is the number of bits)
An encoding system that uses 'n' bits can represent 2 to the power of 'n' unique characters. 7-bit ASCII can represent 2^7 = 128 characters, while 8-bit ASCII can represent 2^8 = 256. Unicode goes far beyond both, with room for more than a million code points, which is why it can cover emojis and the world's writing systems.
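The capacity formula is easy to check directly; Python's `**` operator computes powers:

```python
# Capacity = 2**n unique characters for an n-bit encoding.
for n in (7, 8, 16):
    print(n, "bits ->", 2 ** n, "characters")
# 7 bits -> 128 characters
# 8 bits -> 256 characters
# 16 bits -> 65536 characters
```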
Sample Practice Questions
Challenging
A program designed to work only with standard ASCII receives a text file containing the name "François". The character 'ç' is not in standard ASCII. What is the most likely result?
A. The program will automatically translate 'ç' to a regular 'c'.
B. The computer will automatically upgrade its software to Unicode.
C. The program will display the name perfectly without any issues.
D. The program will likely crash, show an error, or display a placeholder like '?' instead of 'ç'.
Challenging
A new writing system is discovered that has 50,000 unique characters. To create a computer encoding for it, what is the MINIMUM number of bits required per character? (Reference: 2^15 = 32,768; 2^16 = 65,536)
A. 16 bits
B. 15 bits
C. 50,000 bits
D. 8 bits
Challenging
You see the binary ASCII sequence for two characters: `01001010 01101011`. You know the first character is an uppercase letter ('J'). You are told that in ASCII, all lowercase letters have a decimal value of 97 or higher. What can you deduce about the second character, `01101011`?
A. It must be a number.
B. It must be a punctuation symbol.
C. It must be a lowercase letter.
D. It must be another uppercase letter.