1 definition found

From The Free On-line Dictionary of Computing (13 Mar 01) [foldoc]:

Unicode

1. A 16-bit {character set} standard, designed and maintained by the non-profit consortium Unicode Inc.

Originally, Unicode was designed to be universal, unique, and uniform: the code was to cover all major modern written languages (universal), each character was to have exactly one encoding (unique), and each character was to be represented by a fixed width in bits (uniform).

Parallel to the development of Unicode, an {ISO}/{IEC} standard was being worked on that put a large emphasis on compatibility with existing character codes such as {ASCII} or {ISO Latin 1}. To avoid having two competing 16-bit standards, in 1992 the two teams compromised on a common character code standard, known both as Unicode and {BMP}.

Since the merger, the character codes are the same, but the two standards are not identical: the ISO/IEC standard covers only the coding, while Unicode includes additional specifications that aid implementation.

Unicode is not a {glyph encoding}. The same character can be displayed as a variety of {glyphs}, depending not only on the {font} and style but also on the adjacent characters. A sequence of characters can be displayed as a single glyph, or a single character can be displayed as a sequence of glyphs. Which of these applies is often font-dependent.

See also Jürgen Bettels and F. Avery Bishop's paper {Unicode: A universal character code (http://www.digital.com/info/DTJB02/DTJB02SC.TXT)}.

2. A pre-{Fortran} language on the {IBM 1103}, similar to {MATH-MATIC}. [Sammet 1969, p.137].

(1997-11-15)
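
As an illustration (not part of the FOLDOC entry), the following Python sketch shows two points from the definition above: the fixed 16-bit width of BMP characters, and the character/glyph distinction. It assumes only the standard library and a Unicode-capable terminal.

    # Minimal sketch, not part of the original entry.
    import unicodedata

    # (1) BMP characters fit in a single 16-bit code unit: "é" (U+00E9)
    # encodes to exactly two bytes in UTF-16.
    print(len("\u00E9".encode("utf-16-be")))   # 2 bytes = one 16-bit unit

    # (2) Characters are not glyphs: the precomposed character U+00E9 and
    # the sequence U+0065 U+0301 (e + COMBINING ACUTE ACCENT) usually
    # render as the same glyph "é" but are different character sequences.
    precomposed = "\u00E9"
    combining = "e\u0301"
    print(precomposed, combining)             # both typically display as é
    print(precomposed == combining)           # False: different sequences
    print(len(precomposed), len(combining))   # 1 vs 2 characters

    # Canonical normalization (NFC) maps the combining sequence onto the
    # precomposed character, confirming that both sequences denote the
    # same abstract character.
    print(unicodedata.normalize("NFC", combining) == precomposed)  # True

That a rendering engine may draw the two-character sequence as one glyph, while the underlying code remains two characters, is exactly the sense in which Unicode encodes characters rather than glyphs.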