Data Representation

Cards (26)

  • How do computers store characters in a text document?
    As binary numbers in a byte
  • Why is a standard coding system needed for documents?
    To ensure compatibility across different computers
  • What does ASCII stand for?
    American Standard Code for Information Interchange
  • What is a character set?
    A defined list of characters for a system
  • What characters does the ASCII character set include?
    English letters, digits, punctuation symbols, and control characters
  • What is the range of values for ASCII characters?
    0 to 127
  • What does ASCII character 32 represent?
    The space character
  • How many bits does ASCII use in a byte?
    Seven bits
  • What is the purpose of the eighth bit in ASCII?
    For parity check or special characters
  • What happens when the letter 'A' is typed on the keyboard?
    It sends ASCII code 65 to the computer
  • What is parity checking?
    An error checking method using an additional bit
  • What does a mismatch in parity bits indicate?
    The byte may have been corrupted
  • Why is agreement on parity protocol important?
    Because the check is only meaningful if sender and receiver use the same scheme (even or odd parity)
  • What is even parity?
    The parity bit is set so the total number of 1s is even (see the parity sketch after the cards)
  • What does extended ASCII do with the eighth bit?
    It doubles the number of available characters
  • How many characters can extended ASCII represent?
    256 characters
  • How is the letter 'J' stored in extended ASCII?
    As 01001010 in binary
  • How many bytes does a text document encoded in ASCII use per character?
    One byte
  • What is the ASCII code for the word 'Hello'?
    72 101 108 108 111
  • How are ASCII codes converted into binary numbers?
    By writing each code as an 8-bit binary number (see the ASCII encoding sketch after the cards)
  • What is Unicode designed to address?
    Insufficient characters in extended ASCII
  • What is the most common encoding for Unicode?
    UTF-8
  • How many characters can Unicode represent?
    Over a million code points (16 bits alone would give only 65,536)
  • What is a benefit of Unicode?
    It eliminates the need for multiple character sets
  • How does ASCII compare to Unicode in terms of bits used?
    ASCII uses fewer bits per character than Unicode encodings generally need (see the UTF-8 sketch after the cards)
  • What is a potential advantage of using ASCII?
    Faster processing and reduced memory requirements
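
Worked sketches (Python)

  • ASCII encoding sketch: a minimal illustration of the ASCII cards above.
    The helper names are just for this example; the sketch uses only the
    built-in ord() and format() to turn 'Hello' and 'J' into decimal ASCII
    codes and 8-bit binary patterns.

      def ascii_codes(text):
          # Decimal ASCII code for each character in the text.
          return [ord(ch) for ch in text]

      def to_binary(code, bits=8):
          # Fixed-width binary string for a character code.
          return format(code, f"0{bits}b")

      codes = ascii_codes("Hello")
      print(codes)                          # [72, 101, 108, 108, 111]
      print([to_binary(c) for c in codes])  # ['01001000', '01100101', ...]
      print(to_binary(ord("J")))            # 01001010 -- 'J' is ASCII 74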
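  • Even parity sketch: the parity cards describe adding a bit so the total
    number of 1s in the byte is even. A minimal sketch of that idea, assuming
    the 7-bit ASCII code sits in the low seven bits and the parity bit in the
    eighth.

      def add_even_parity(code7):
          # Set the top bit so the whole byte has an even number of 1s.
          ones = bin(code7).count("1")
          return ((ones % 2) << 7) | code7

      def parity_ok(byte):
          # A received byte passes the check if its count of 1s is even.
          return bin(byte).count("1") % 2 == 0

      sent = add_even_parity(ord("A"))             # 'A' = 65 = 1000001, two 1s
      print(format(sent, "08b"), parity_ok(sent))  # 01000001 True
      corrupted = sent ^ 0b00000100                # one bit flipped in transit
      print(format(corrupted, "08b"), parity_ok(corrupted))  # 01000101 False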
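  • UTF-8 sketch: Unicode encoded as UTF-8 still stores ASCII text at one
    byte per character but spends two or more bytes on characters outside the
    ASCII range. A minimal sketch using str.encode and bytes.hex (the space
    separator needs Python 3.8+).

      samples = ["Hello", "é", "你好"]   # ASCII text, accented letter, Chinese
      for text in samples:
          encoded = text.encode("utf-8")
          print(text, "->", len(encoded), "bytes:", encoded.hex(" "))
      # Hello -> 5 bytes: 48 65 6c 6c 6f
      # é -> 2 bytes: c3 a9
      # 你好 -> 6 bytes: e4 bd a0 e5 a5 bd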