◆ JSChar
typedef unsigned short JSChar
A UTF-16 code unit. A single code unit encodes any character in the Basic Multilingual Plane; a surrogate pair of two code units encodes any supplementary character. As with all scalar types, endianness depends on the underlying architecture.
◆ JSStringCreateWithCharacters()
JSStringRef JSStringCreateWithCharacters(const JSChar* chars, size_t numChars)
Creates a JavaScript string from a buffer of Unicode characters. 
 
 
◆ JSStringCreateWithUTF8CString()
JSStringRef JSStringCreateWithUTF8CString(const char* string)
Creates a JavaScript string from a null-terminated UTF8 string. 
 
 
◆ JSStringGetCharactersPtr()
const JSChar* JSStringGetCharactersPtr(JSStringRef string)
Returns a pointer to the Unicode character buffer that serves as the backing store for a JavaScript string. 
 
 
◆ JSStringGetLength()
size_t JSStringGetLength(JSStringRef string)
Returns the number of Unicode characters in a JavaScript string. 
 
 
◆ JSStringGetMaximumUTF8CStringSize()
size_t JSStringGetMaximumUTF8CStringSize(JSStringRef string)
Returns the maximum number of bytes a JavaScript string will take up if converted into a null-terminated UTF8 string. 
 
 
◆ JSStringGetUTF8CString()
size_t JSStringGetUTF8CString(JSStringRef string, char* buffer, size_t bufferSize)
Converts a JavaScript string into a null-terminated UTF8 string, and copies the result into an external byte buffer. 
 
 
◆ JSStringIsEqual()
bool JSStringIsEqual(JSStringRef a, JSStringRef b)
Tests whether two JavaScript strings match. 
 
 
◆ JSStringIsEqualToUTF8CString()
bool JSStringIsEqualToUTF8CString(JSStringRef a, const char* b)
Tests whether a JavaScript string matches a null-terminated UTF8 string. 
 
 
◆ JSStringRelease()
void JSStringRelease(JSStringRef string)
Releases a JavaScript string. 
 
 
◆ JSStringRetain()
JSStringRef JSStringRetain(JSStringRef string)
Retains a JavaScript string. 
 
 
 
 