TextEncoding
public enum TextEncoding : UInt32
TextEncoding determines whether text specifies character codes and their encoded size, or glyph indices. Characters are encoded as specified by the Unicode standard.
Character codes and their encoded size are specified by UTF-8, UTF-16, or UTF-32. All three character code formats can represent all of Unicode, differing only in the total storage required.
UTF-8 (RFC 3629) encodes each character as one to four 8-bit bytes.
UTF-16 (RFC 2781) encodes each character as one or two 16-bit words.
UTF-32 encodes each character as one 32-bit word.
The font manager uses font data to convert character code points into glyph indices. A glyph index is a 16-bit word.
TextEncoding is set to .utf8 by default.
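Because every character code format can represent all of Unicode, the choice mainly affects storage. A minimal sketch using only Swift's standard string views (independent of this API) illustrates the difference:

let text = "Hello 🌍"

// Total bytes needed for the same text in each Unicode encoding form.
let utf8Bytes  = text.utf8.count                // 8-bit code units  → 10 bytes
let utf16Bytes = text.utf16.count * 2           // 16-bit code units → 16 bytes
let utf32Bytes = text.unicodeScalars.count * 4  // 32-bit code units → 28 bytes

print(utf8Bytes, utf16Bytes, utf32Bytes)        // 10 16 28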
utf8
Uses bytes to represent UTF-8 or ASCII.
Declaration
Swift
case utf8 = 0
utf16
Uses two-byte words to represent most of Unicode.
Declaration
Swift
case utf16
utf32
Uses four-byte words to represent all of Unicode.
Declaration
Swift
case utf32
glyphId
Uses two-byte words to represent glyph indices.
Declaration
Swift
case glyphId
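Taken together, the cases map directly onto the code-unit sizes described above. The following sketch restates the declaration and adds a hypothetical helper (not part of this API) that returns the width of one code unit for each case:

public enum TextEncoding: UInt32 {
    case utf8 = 0    // 8-bit code units, one to four per character
    case utf16       // 16-bit code units, one or two per character
    case utf32       // 32-bit code units, exactly one per character
    case glyphId     // 16-bit glyph indices
}

// Hypothetical helper: size in bytes of a single code unit.
func codeUnitSize(of encoding: TextEncoding) -> Int {
    switch encoding {
    case .utf8:    return 1
    case .utf16:   return 2
    case .utf32:   return 4
    case .glyphId: return 2
    }
}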