Digital Data Flashcards
(16 cards)
Describe how a number is converted to a binary pattern for storage in a computer
Computer memory can be ON or OFF
ON is indicated by 1 and OFF is indicated by 0
Data is stored character by character, with each character represented by a code made up of 1s and 0s, and each code is sent to the CPU
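For illustration, a minimal Python sketch of converting a number to a binary pattern by repeated division by 2 (the helper name to_binary is illustrative):

```python
def to_binary(n, width=8):
    """Build a binary string for a non-negative integer by repeated division by 2."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # each remainder is the next bit, least significant first
        n //= 2
    return bits.rjust(width, "0")  # pad with leading 0s to a fixed width (one byte here)

print(to_binary(13))  # 00001101
```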
Define a Bit
Smallest unit of storage on a computer
It is a single digit in a binary number (0 or 1)
Define a Byte
A group of 8 bits
Define a nibble
A nibble is half a byte, consisting of 4 bits
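A small Python illustration of how a byte splits into two nibbles and individual bits, using bitwise operators:

```python
byte = 0b10110100            # one byte: 8 bits
high_nibble = byte >> 4      # top 4 bits  -> 0b1011
low_nibble = byte & 0b1111   # bottom 4 bits -> 0b0100
lowest_bit = byte & 1        # a single bit, the smallest unit (0 here)
print(bin(high_nibble), bin(low_nibble), lowest_bit)
```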
How many bytes in a kilobyte
1024
How many kilobytes in a megabyte
1024
How many megabytes in a gigabyte
1024
How many gigabytes in a terabyte
1024
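Each step up multiplies by 1024, so the sizes compound; a quick Python check:

```python
KB = 1024       # bytes in a kilobyte
MB = 1024 * KB  # bytes in a megabyte
GB = 1024 * MB  # bytes in a gigabyte
TB = 1024 * GB  # bytes in a terabyte
print(f"1 TB = {TB:,} bytes")  # 1 TB = 1,099,511,627,776 bytes
```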
What are the two types of character representation
ASCII (7-bit and 8-bit)
Unicode
Expand ASCII
American Standard Code for Information Interchange
Explain ASCII (7-bit)
Uses 7 bits to represent each character, so only 128 (2^7) characters can be represented in the character set
Explain ASCII (8-bit)
Uses 8 bits to represent each character, so a further 128 characters can be represented, giving 256 (2^8) characters in total
Explain Unicode
Unicode typically uses 16 bits to represent a character and so can represent up to 65,536 (2^16) characters
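A Python illustration of the three ranges, using standard code points:

```python
print(ord("A"))   # 65   -- fits in 7-bit ASCII (0-127)
print(ord("é"))   # 233  -- needs the 8-bit extended range (128-255)
print(ord("€"))   # 8364 -- beyond 8 bits, so it needs Unicode
print(chr(65))    # 'A'  -- chr() maps a code back to its character
```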
Explain the meaning of Overflow
Overflow happens when a computerized calculation produces an answer that is too big to be represented in the available number of bits
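Python's own integers never overflow, so this sketch masks the result to 8 bits to mimic a one-byte register:

```python
MAX_BYTE = 0xFF                 # 255, the largest value 8 bits can hold
result = (250 + 10) & MAX_BYTE  # the true answer, 260, doesn't fit in 8 bits
print(result)                   # 4 -- the value wraps around: overflow
```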
Define a Variable
A named memory location used to hold data whose value can change
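A tiny Python illustration (the name score is arbitrary):

```python
score = 0           # the name 'score' refers to a memory location holding 0
score = score + 10  # the same name now holds a new value: it can change
print(score)        # 10
```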