Among other things, this empowers ll_convert() and ll_convert_to() to accept a
string literal (which might contain non-ASCII characters, e.g. __FILE__).
Without this, even though we have ll_convert_impl specializations accepting
const char*, passing a string literal fails because the compiler can't find a
specialization specifically accepting const char[length].
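A minimal sketch of the idea, assuming the fix decays the deduced source type before dispatching (the use of std::decay_t here is an assumption about the mechanism, not necessarily the actual llcommon code):

    #include <type_traits>

    template <typename TO, typename FROM>
    TO ll_convert_to(const FROM& in)
    {
        // For a string literal the deduced FROM is char[N]; decaying const FROM yields
        // const char*, so the existing ll_convert_impl<TO, const char*> specialization
        // is found instead of a nonexistent one for const char[N].
        return ll_convert_impl<TO, std::decay_t<const FROM>>()(in);
    }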
|
When used as a function argument, as the right-hand side of an assignment or as a `return` expression, `ll_convert()` can infer its target type from context.
When it's important to specify the TOTYPE explicitly, rename the old
`ll_convert()` function template to `ll_convert_to()`. Fix existing usage.
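One way such inference can work is to return a small proxy whose templated conversion operator chooses TOTYPE from the surrounding context; this is only a sketch of the technique (ll_convert_proxy and utf8 are illustrative names), not the viewer's actual implementation:

    template <typename FROM>
    struct ll_convert_proxy
    {
        const FROM& mRef;
        template <typename TO>
        operator TO() const { return ll_convert_to<TO>(mRef); }   // TO supplied by context
    };

    template <typename FROM>
    ll_convert_proxy<FROM> ll_convert(const FROM& in) { return { in }; }

    // usage: the assignment target determines the conversion
    std::wstring wide = ll_convert(utf8);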
|
It's a little distressing how often we have historically coded S32 or U32 to
pass a length or index.
There are more such assumptions in other viewer subdirectories, but this is a
start.
|
Now that we're using ll_convert() for single-argument stringize(arg), make sure it can efficiently handle the simple case of constructing a std::string from a const char pointer -- and, correspondingly, ll_convert<std::wstring>(const wchar_t*).
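A plausible shape for that trivial case (a sketch; the real specialization may differ):

    template <>
    struct ll_convert_impl<std::string, const char*>
    {
        // construct the std::string directly -- no intermediate wide conversion
        std::string operator()(const char* in) const { return in? std::string(in) : std::string(); }
    };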
|
clang allows us to specify, as a default function parameter, an expression
involving a preceding parameter, e.g. (char* ptr, size_t len=strlen(ptr)). The
Microsoft compiler produces errors, requiring more overloads to address that.
Also #undef llstring.h's declaration helper macros at the bottom of the file.
Once we've used them to declare stuff, they need not (should not) be visible
to the consuming source file.
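To make the tradeoff concrete, here is the clang-only form versus the portable pair of overloads (utf8str_to_wstring is used purely as an illustration):

    // accepted by clang, rejected by the Microsoft compiler:
    // LLWString utf8str_to_wstring(const char* utf8str, size_t len = strlen(utf8str));

    // portable equivalent: a second overload supplies the default length
    LLWString utf8str_to_wstring(const char* utf8str, size_t len);
    inline LLWString utf8str_to_wstring(const char* utf8str)
    {
        return utf8str_to_wstring(utf8str, strlen(utf8str));
    }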
|
Use new ll_convert_forms() macro in llstring.h to declare, for each
wide-string conversion function of interest, four overloads. The real one, the
nontrivial one, is (const char*, size_t len), implemented in llstring.cpp. Then
(const string&, size_t len), (const char*) and (const string&) are each
trivially implemented with an inline call to (const char*, size_t len).
Notably, we change all S32 len parameters to size_t. Using S32 is old skool.
Tweak each nontrivial implementation in llstring.cpp to accept (const char*,
size_t len) instead of (const string&) with or without explicit length.
Eliminate from llstring.cpp trivial overloads (deriving length from either a
const char* or from a string), since those are now inline in the header.
Of course three of those overloads will be unified once we enable C++17 and
change each relevant parameter to std::string_view, but we're not yet there.
Meanwhile, this suite of overloads minimizes, to the best of our ability, new
string allocations solely for parameter passing. And use of a macro means we
need only change the macro once we get std::string_view.
We take this step because some use cases require (const char*), some require
(const string&, size_t len), others (const char*, size_t len) ... We were
missing some key overloads, and had to work around their absence by instantiating new
string objects (necessitating both allocation and character copying) just to
pass the desired parameter. Using the macro ensures this consistent set of
overloads for every wide-string conversion function.
Additionally, knowing that the ugly-name overloads exist, ll_convert_forms()
implicitly defines corresponding ll_convert<TARGET>() overloads.
Streamline declarations of utf16str_to_wstring(), wstring_to_utf16str(),
utf8str_to_utf16str(), utf16str_to_utf8str(), utf8str_to_wstring(),
wstring_to_utf8str(), ll_convert_wide_to_wstring() and
ll_convert_wstring_to_wide() using ll_convert_forms().
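For one such function, the four overloads look roughly like this (a sketch of the expansion's shape; the macro's exact arguments are not shown and are assumptions):

    // the nontrivial overload, implemented in llstring.cpp
    llutf16string utf8str_to_utf16str(const char* utf8str, size_t len);
    // the three trivial forms forward to it inline
    inline llutf16string utf8str_to_utf16str(const std::string& utf8str, size_t len)
    { return utf8str_to_utf16str(utf8str.c_str(), len); }
    inline llutf16string utf8str_to_utf16str(const char* utf8str)
    { return utf8str_to_utf16str(utf8str, ll_convert_length(utf8str)); }
    inline llutf16string utf8str_to_utf16str(const std::string& utf8str)
    { return utf8str_to_utf16str(utf8str.c_str(), utf8str.length()); }
    // plus the corresponding ll_convert<llutf16string>() alias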
Use corresponding new ll_convert_cp_forms() macro to declare consistent
overloads for conversion functions accepting an optional unsigned int
code_page parameter. We used to delegate to the .cpp file the implementation
of each overload accepting code_page so llstring.h need not include the
Windows header defining the CP_UTF8 default; this is more simply accomplished
by introducing a small ll_wstring_default_code_page() function to retrieve it
from the .cpp file. That lets us specify the code_page parameter as optional,
using that function as its default value.
Use ll_convert_cp_forms() to streamline declarations of
ll_convert_wide_to_string() and ll_convert_string_to_wide().
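A sketch of that mechanism as described (the exact signatures are assumptions):

    // llstring.h -- no Windows header needed here
    unsigned int ll_wstring_default_code_page();   // defined in llstring.cpp
    std::string ll_convert_wide_to_string(
        const wchar_t* in, size_t len,
        unsigned int code_page = ll_wstring_default_code_page());

    // llstring.cpp -- the only file that needs the Windows header defining CP_UTF8
    unsigned int ll_wstring_default_code_page() { return CP_UTF8; }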
Introduce real implementations of ll_convert_wide_to_wstring() and
ll_convert_wstring_to_wide(). The previous implementations merely copied
individual characters, which is wrong: when we convert UTF16LE to UTF32, we
can and should fold multi-character UTF16LE encodings to the corresponding
single UTF32 character. The real implementations leverage our awareness that
both llutf16string and Windows std::wstring (either variant) use UTF16LE
encoding, so we can reuse the corresponding llutf16string conversions.
Introduce generic ll_convert_length() function, specialized as either
std::strlen() or std::wcslen() depending on parameter type. (Even if
std::wcslen() is derived from classic C, why doesn't the C++ standard library
define a std::strlen(const wchar_t*) overload to call it?)
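Presumably something along these lines (shown as plain overloads; whether the real code uses overloads or explicit specializations is an assumption):

    #include <cstring>   // std::strlen
    #include <cwchar>    // std::wcslen

    inline size_t ll_convert_length(const char* zstr)    { return std::strlen(zstr); }
    inline size_t ll_convert_length(const wchar_t* zstr) { return std::wcslen(zstr); }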
Fix ll_convert_alias()'s ll_convert_impl specialization's operator() to accept
boost::call_traits::param_type, so we can pass (e.g.) const std::wstring& but
also const wchar_t* instead of const wchar_t*&.
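Schematically, the specialization ll_convert_alias() generates now looks something like this (a sketch, with LLWString/const wchar_t* chosen only as an example pairing):

    #include <boost/call_traits.hpp>

    // boost::call_traits passes class types as const references but pointer types by
    // value, so a const wchar_t* argument (or a literal) binds directly instead of
    // requiring a const wchar_t*& lvalue reference
    template <>
    struct ll_convert_impl<LLWString, const wchar_t*>
    {
        LLWString operator()(boost::call_traits<const wchar_t*>::param_type in) const
        {
            return ll_convert_wide_to_wstring(in);
        }
    };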
|
In llpreprocessor.h, handle the case of clang on Windows: #define LL_WCHAR_T_NATIVE
for that compiler as well as for the Microsoft compiler with the /Zc:wchar_t switch.
In stdtypes.h, inject a LLWCHAR_IS_WCHAR_T symbol to allow the preprocessor to
make decisions about when the types are identical.
llstring.h's conversion logic deals with three types of wide strings
(LLWString, std::wstring and utf16string) based on three types of wide char
(llwchar, wchar_t and U16, respectively). Sometimes they're three distinct
types, sometimes wchar_t is identical to llwchar and sometimes wchar_t is
identical to U16. Rationalize the three cases using ll_convert_u16_alias() and
new ll_convert_wstr_alias() macros.
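A sketch of how the std::wstring aliases might collapse when the wide types coincide (the macro is shown variadic because its real parameter list is an assumption):

    #if LLWCHAR_IS_WCHAR_T
        // llwchar == wchar_t: LLWString and std::wstring are the same type, so a separate
        // std::wstring alias would merely duplicate the LLWString one -- expand to nothing
        #define ll_convert_wstr_alias(...)
    #else
        #define ll_convert_wstr_alias(...) ll_convert_alias(__VA_ARGS__)
    #endif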
stringize.h was directly calling wstring_to_utf8str() and utf8str_to_wstring(),
which was producing errors with VS 2019 clang since there isn't actually a
wstring_to_utf8str(std::wstring) overload. Use ll_convert<std::string>()
instead, since that redirects to the relevant ll_convert_wide_to_string()
function. (And now you see why we've been trying to migrate to the uniform
ll_convert<target>() wrapper!) Similarly, call ll_convert<std::wstring>()
instead of a two-step conversion from utf8str_to_wstring(), producing LLWString,
then a character-by-character copy from LLWString to std::wstring. That
isn't even correct: on Windows, we should be encoding from UTF32 to UTF16.
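In other words, the stringize.h call sites now read roughly like this (some_wstring and some_utf8_string are illustrative names):

    std::string  utf8 = ll_convert<std::string>(some_wstring);     // dispatches to ll_convert_wide_to_string() on Windows
    std::wstring wide = ll_convert<std::wstring>(some_utf8_string); // one step, no LLWString or char-by-char copy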
|
https://docs.microsoft.com/en-us/cpp/c-runtime-library/reference/snprintf-snprintf-snprintf-l-snwprintf-snwprintf-l?view=vs-2017
"Beginning with the UCRT in Visual Studio 2015 and Windows 10, snprintf is no
longer identical to _snprintf. The snprintf function behavior is now C99
standard compliant."
In other words, snprintf() in VS 2015 and later now promises to nul-terminate the
buffer even in the overflow case, which is exactly what snprintf_hack::snprintf() was
for.
This removal was motivated by ambiguous-call errors generated by VS 2017 for
library snprintf() vs. snprintf_hack::snprintf().
|
Twemoji as the viewer's fallback for all emoji blocks
|
instead of a variable of type decltype(expression).
Using SHGetKnownFolderPath(FOLDERID_Fonts) in LLFontGL::getFontPathSystem()
requires new Windows #include files.
A variable with a constructor can't be declared within the braces of a switch
statement, even outside any of its case clauses.
|
Move Windows-flavored llstring_getoptenv() to Windows-specific section of
llstring.cpp.
boost::optional type must be stated explicitly to initialize with a value.
On platforms where llwchar is the same as wchar_t, LLWString is the same as
std::wstring, so ll_convert specializations for std::wstring would duplicate
those for LLWString. Defend against that.
The compilers we use don't like 'return condition? { expr } : {}', in which we
hope to construct and return an instance of the declared return type without
having to restate the type. It works to use an explicit 'if' statement.
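For example (a sketch of the pattern; maybe_value is an illustrative name):

    #include <boost/optional.hpp>
    #include <string>

    boost::optional<std::string> maybe_value(bool found, const std::string& value)
    {
        // return found? { value } : {};   // what we'd like to write; rejected
        if (found)
        {
            return boost::optional<std::string>(value);   // optional's type stated explicitly
        }
        return {};
    }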
|
Add ll_convert<TO, FROM> template, used as (e.g.):
ll_convert<std::string>(value_of_some_other_string_type);
There is no general-purpose template implementation -- the template exists solely to
provide generic aliases for a bewildering family of llstring.h string-
conversion functions with highly specific names. There is, however, an
implementation for the degenerate case where FROM and TO are identical.
Add ll_convert<> specialization aliases for most of the string-conversion
functions declared in llstring.h, including the Windows-specific ones
involving llutf16string and std::wstring.
Add a mini-lecture in llstring.h about appropriate use of string types on
Windows.
Add LL_WCHAR_T_NATIVE llpreprocessor.h macro so we can detect whether to
provide separate conversions for llutf16string and std::wstring, or whether
those would collide because the types are identical.
Add inline ll_convert_wide_to_string(const std::wstring&) overloads so the caller
isn't required to call arg.c_str(); this also naturally permits an ll_convert
alias.
Add ll_convert_wide_to_wstring(), ll_convert_wstring_to_wide() as placeholders
for converting between Windows std::wstring and Linden LLWString, with
corresponding ll_convert aliases. We don't yet have library code to perform
such conversions officially; for now, just copy characters.
Add LLStringUtil::getenv(key) and getoptenv(key) functions. The latter returns
boost::optional<string_type> in case the caller needs to detect absence of a
given environment variable rather than simply accepting a default value.
Naturally getenv(), which accepts a default, is implemented using getoptenv().
getoptenv(), in turn, is implemented using an underlying llstring_getoptenv().
On Windows, llstring_getoptenv() returns boost::optional<std::wstring> (based
on GetEnvironmentVariableW()), whereas elsewhere, llstring_getoptenv() returns
boost::optional<std::string> (based on classic Posix getenv()).
The beauty of generic ll_convert is that the portable LLStringUtilBase<T>::
getoptenv() template can call the platform-specific llstring_getoptenv() and
transparently perform whatever conversion is necessary to return the desired
string_type.
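A sketch of that layering (the member signatures and the key parameter type are assumptions about the shape, not the exact declarations):

    template <class T>
    auto LLStringUtilBase<T>::getoptenv(const std::string& key) -> boost::optional<string_type>
    {
        // optional<std::wstring> on Windows, optional<std::string> elsewhere
        auto found = llstring_getoptenv(key);
        if (found)
        {
            // generic ll_convert performs whatever conversion (or non-conversion) is needed
            return boost::optional<string_type>(ll_convert<string_type>(*found));
        }
        return {};
    }

    template <class T>
    typename LLStringUtilBase<T>::string_type
    LLStringUtilBase<T>::getenv(const std::string& key, const string_type& dflt)
    {
        auto found = getoptenv(key);
        return found? *found : dflt;
    }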
Add windows_message<T>(error) template, with an overload that implicitly calls
GetLastError(). We provide a single concrete windows_message<std::wstring>()
implementation because that's what we get from Windows FormatMessageW() --
everything else is a generic conversion to the desired target string type.
This obviates llprocess.cpp's previous WindowsErrorString() implementation --
reimplement using windows_message<std::string>().
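The described structure looks roughly like this (a sketch; the real declarations may differ):

    // single concrete implementation, in llstring.cpp, wrapping FormatMessageW()
    template <typename STRING> STRING windows_message(DWORD error);
    template <> std::wstring windows_message<std::wstring>(DWORD error);

    // everything else is a generic conversion of that wide message
    template <typename STRING>
    STRING windows_message(DWORD error)
    {
        return ll_convert<STRING>(windows_message<std::wstring>(error));
    }

    // overload that implicitly queries GetLastError()
    template <typename STRING>
    STRING windows_message()
    {
        return windows_message<STRING>(GetLastError());
    }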
|
Instead of returning a wchar_t* and requiring the caller to delete it later,
return a std::basic_string<wchar_t> that's self-cleaning. If the caller wants
a wchar_t*, s/he can call c_str() on the returned string.
Default the code_page parameter to CP_UTF8, since we try to be really
consistent about using UTF-8 encoding for all our internal std::strings.
|
especially for animated objects.
|
clipboard
|
shouldn't be removed
|
clang has started to reject our non-const comparison operator methods used
within standard algorithms.
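The change is of this general form (the type is illustrative only):

    struct Item
    {
        int mKey;
        // the library may apply the comparison to const objects, so the
        // operator itself must be const
        bool operator<(const Item& rhs) const { return mKey < rhs.mKey; }
    };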
|
Cleaning up the build.
Moved most includes of windows.h to llwin32headers.h to disable the min/max macros, etc.
Streamlined the Time class and consolidated its functionality in the BlockTimer class.
llfasttimer is no longer included via llstring.h, so it had to be added manually in several places.
|
We didn't have any tokenizer suitable for scanning something like a bash
command line. We do have a couple of hacks, e.g. LLExternalEditor::tokenize() and
LLCommandLineParser::parseCommandLineString(). Both try to work around
boost::tokenizer limitations, but existing boost::tokenizer support just
doesn't address this case. Neither of the above is available as a general
scanner anyway, and parseCommandLineString() fails outright when passed "".
New getTokens() also distinguishes between "drop delimiters" (e.g. space,
return, newline) to be discarded from the token stream, versus "keep
delimiters" (e.g. "+-*/") to be returned as tokens in their own right.
There's an overload that honors escapes and a more efficient one that doesn't;
each has a convenience overload that returns the scanned string vector rather
than requiring a separate declaration.
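For flavor, a call might look something like this (the scope, parameter order and parameter types are assumptions; only the drop/keep distinction is taken from the description above):

    // scan "a+b -c", discarding whitespace but keeping operators as tokens
    std::vector<std::string> tokens =
        LLStringUtil::getTokens("a+b -c",
                                " \r\n",   // drop delimiters
                                "+-*/");   // keep delimiters
    // tokens: "a", "+", "b", "-", "c"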
Tweak and comment older getTokens() implementation.
Add unit tests for both old and new getTokens() implementations.
Break out StringVec and std::ostream << StringVec from
indra/llcommon/tests/listener.h to StringVec.h: that's coming in handy for a
number of different TUT test sources.
|