We have a growing collection of .NET Web applications that have been localized into a few different languages over the years. We regularly need to localize updates, new applications or new features, and so far this has been done by sending all the new text to translators.

Much of the text we send out for translation is probably already localized somewhere in one of the existing apps. I'm looking for an automated way to build a multilingual glossary (or translation memory) from our existing localized content, presumably by scanning the .resx files. This would allow us to recycle translations automatically without involving translators, and/or to provide the matches to the translators to reduce the translation effort and lower the risk of linguistic inconsistency.
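
To make it concrete, here is roughly what I have in mind, as a rough Python sketch. It assumes the usual `Name.resx` / `Name.fr.resx` naming convention and that the neutral-culture file holds the English source text; the `./src` root is just a placeholder.

```python
# Sketch: harvest a multilingual glossary from existing .resx files.
# Assumptions (not confirmed anywhere): localized files are named
# "Base.<culture>.resx" and sit next to a neutral "Base.resx" that
# contains the English source strings.
import re
from pathlib import Path
from xml.etree import ElementTree
from collections import defaultdict

CULTURE_RE = re.compile(r"^(?P<base>.+?)\.(?P<culture>[a-z]{2}(?:-[A-Za-z]{2})?)\.resx$")

def read_resx(path):
    """Return {resource key: string value} for one .resx file."""
    entries = {}
    for data in ElementTree.parse(path).getroot().findall("data"):
        value = data.find("value")
        if value is not None and value.text:
            entries[data.get("name")] = value.text
    return entries

def build_glossary(root):
    """Map English source text -> {culture: translation} across all apps."""
    glossary = defaultdict(dict)
    for resx in Path(root).rglob("*.resx"):
        match = CULTURE_RE.match(resx.name)
        if not match:
            continue  # neutral-culture file; picked up via its localized siblings
        neutral = resx.with_name(match.group("base") + ".resx")
        if not neutral.exists():
            continue
        source = read_resx(neutral)
        translated = read_resx(resx)
        culture = match.group("culture")
        for key, text in source.items():
            if key in translated:
                glossary[text][culture] = translated[key]
    return glossary

if __name__ == "__main__":
    glossary = build_glossary("./src")  # hypothetical solution root
    for text, translations in list(glossary.items())[:10]:
        print(text, translations)
```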

Does anyone know of a suitable tool to achieve this?

Solution

There are a number of web applications that do exactly that, for example:

The point is that it is normal (and best) practice to keep an up-to-date translation memory with a dedicated piece of software or service, precisely to avoid re-translating the same content over and over, and to keep translations consistent across modules/products and over time.
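
If you want to seed such a tool from your existing .resx content, most translation-memory tools can import TMX, the standard exchange format for translation memories. A minimal export sketch (reusing the glossary shape from the sketch in the question; element and attribute choices below are a plausible TMX 1.4 layout, not taken from any particular tool's documentation) might look like this:

```python
# Sketch: write a glossary of {source text: {culture: translation}}
# out as a TMX 1.4 file so it can be imported into a TM tool.
from xml.etree import ElementTree
from xml.etree.ElementTree import Element, SubElement

def glossary_to_tmx(glossary, source_lang="en", path="memory.tmx"):
    tmx = Element("tmx", version="1.4")
    SubElement(tmx, "header", {
        "srclang": source_lang,
        "datatype": "plaintext",
        "segtype": "sentence",
        "adminlang": "en",
        "creationtool": "resx-export",      # arbitrary label
        "creationtoolversion": "0.1",
        "o-tmf": "resx",
    })
    body = SubElement(tmx, "body")
    for source_text, translations in glossary.items():
        tu = SubElement(body, "tu")         # one translation unit per source string
        src = SubElement(tu, "tuv", {"xml:lang": source_lang})
        SubElement(src, "seg").text = source_text
        for culture, text in translations.items():
            tuv = SubElement(tu, "tuv", {"xml:lang": culture})
            SubElement(tuv, "seg").text = text
    ElementTree.ElementTree(tmx).write(path, encoding="utf-8", xml_declaration=True)
```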

Other tips

There are open-source solutions that provide the required functions. They may work in a simpler way than professional software, but they will still reduce the amount of text that needs to be translated. For example:
