[tpm] Re. Tainted data
stuart at morungos.com
Wed Apr 11 08:01:22 PDT 2012
I've done localization with Locale::Maketext and its Locale::Maketext::Lexicon component. That takes a fair bit more work, but it uses external (non-Perl) files and allows a bit more control, including parameterized texts. The advantage: you can add additional languages just by adding .po files with translations (i.e., using GNU gettext). The intent was to make it possible for non-programmers to localize systems, e.g. by contracting the translation out. For a few locales it might not be worth it, but for large-scale internationalization it almost certainly is.
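Roughly what that looks like, as a minimal single-file sketch (package names are illustrative; in a real setup each language class lives in its own module, and Locale::Maketext::Lexicon would populate %Lexicon from the .po files instead of the inline hashes shown here):

```perl
use strict;
use warnings;

# Base project class inheriting from the core Locale::Maketext.
package MyApp::L10N;
use parent 'Locale::Maketext';

# English: _AUTO lets unknown keys pass through unchanged.
package MyApp::L10N::en;
use parent -norequire, 'MyApp::L10N';
our %Lexicon = ( '_AUTO' => 1 );

# French: [quant,_1,...] handles the parameterized, pluralized count.
package MyApp::L10N::fr;
use parent -norequire, 'MyApp::L10N';
our %Lexicon = (
    'You have [quant,_1,message,messages].' =>
        'Vous avez [quant,_1,message,messages].',
);

package main;
my $lh = MyApp::L10N->get_handle('fr') or die "no lexicon found";
print $lh->maketext('You have [quant,_1,message,messages].', 3), "\n";
# prints "Vous avez 3 messages."
```

get_handle() picks the best available language class for the requested (or negotiated) locale, so the calling code never branches on language itself.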
All the best
On 2012-04-11, at 9:36 AM, arocker at Vex.Net wrote:
>> Ideally, the website could be multi-lingual by placing the phrases in a
>> database instead of a flat, two language text file.
> Why not load a hash? That would be practical for every known human
> language (c. 6,000,
> http://sciencenetlinks.com/science-news/science-updates/human-language/ ),
> let alone the subset likely to have web access (< 100, at a guess).
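The hash approach arocker suggests is certainly the simplest option; a sketch (phrase keys and the English fallback are illustrative choices, and a real site would load the hash from a file or database rather than hard-coding it):

```perl
use strict;
use warnings;

# Illustrative phrase table: language => { key => translated text }.
my %phrases = (
    en => { greeting => 'Welcome',   farewell => 'Goodbye'   },
    fr => { greeting => 'Bienvenue', farewell => 'Au revoir' },
);

# Look up a phrase, falling back to English for unknown languages.
sub phrase {
    my ($lang, $key) = @_;
    return $phrases{$lang}{$key} // $phrases{en}{$key};
}

print phrase('fr', 'greeting'), "\n";   # prints "Bienvenue"
print phrase('de', 'greeting'), "\n";   # no German entry: prints "Welcome"
```

What the hash cannot do, and what pushes larger projects toward Maketext, is parameterized text: pluralization and word order differ between languages, so "You have 3 messages" cannot be built by concatenating translated fragments around a number.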