I did a little bit of research and learned that the character in question is the Unicode hyphen-minus, U+002D. Now, if the data looked fine in your input file but was corrupt after importing, the problem may be due to PB not handling the data as Unicode, in which case you can fix the situation using functions in PB.
It could also be that the database interface you are using doesn't support the conversion between ANSI and Unicode (see page 7). I'm not sure if you were using a pipeline object or anything else where database drivers come into play.
Either way, knowing it is a character-encoding issue, fixing this should be pretty simple: just use the EncodingANSI! or EncodingUnicode! enumerated argument on the String and Blob functions prior to importing the text into the DataWindow. If that isn't possible, then you could write a quick routine to read through the file, convert, and save it before importing.
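A quick conversion routine might look something like this (a minimal sketch; the file path and the DataWindow control name dw_1 are placeholders, and it assumes the source file is ANSI-encoded):

```powerscript
// Read the raw bytes of the ANSI source file
integer li_file
blob    lb_raw
string  ls_text

li_file = FileOpen("c:\import\data.txt", StreamMode!, Read!, LockRead!)
IF li_file = -1 THEN RETURN

FileReadEx(li_file, lb_raw)
FileClose(li_file)

// String() with EncodingANSI! interprets the raw ANSI bytes
// and returns them as PB's internal Unicode string type
ls_text = String(lb_raw, EncodingANSI!)

// Import the converted text instead of calling ImportFile() directly
dw_1.ImportString(ls_text)
```

The key point is that you do the decode yourself with String() rather than letting ImportFile() guess at the file's encoding.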
If you don't want to convert before importing, you can do it by looping through the DataWindow/DataStore rows before actually performing the update to the database.
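Here is a sketch of that in-memory approach (the column name "description" and control name dw_1 are hypothetical, and it assumes the mis-decoded text round-trips cleanly through an ANSI blob):

```powerscript
long   ll_row
string ls_value
blob   lb_bytes

FOR ll_row = 1 TO dw_1.RowCount()
    // Pull the mis-decoded value out of the row
    ls_value = dw_1.GetItemString(ll_row, "description")

    // Round-trip through a blob to reinterpret the bytes:
    // Blob() with EncodingANSI! recovers the original byte stream,
    // then String() with EncodingUTF8! decodes it correctly
    lb_bytes = Blob(ls_value, EncodingANSI!)
    ls_value = String(lb_bytes, EncodingUTF8!)

    dw_1.SetItem(ll_row, "description", ls_value)
NEXT

// Then perform the update as usual
dw_1.Update()
```

Which pair of encodings you use in the round trip depends on how the data was mangled in the first place, so test it on a few known-bad rows before running it across the whole result set.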
You can find code examples on my blog for converting between ANSI and Unicode, but basically you just use one of these encoding parameters on the String and Blob functions:
- EncodingANSI!
- EncodingUTF8!
- EncodingUTF16LE! – UTF-16 Little Endian encoding (PowerBuilder 10 default)
- EncodingUTF16BE! – UTF-16 Big Endian encoding
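In its simplest form the conversion is just one call in each direction (sketch; lb_ansi stands in for whatever blob you read from the file):

```powerscript
string ls_unicode
blob   lb_ansi

// Decode a blob of ANSI bytes into a Unicode string
ls_unicode = String(lb_ansi, EncodingANSI!)

// Encode a Unicode string back into an ANSI blob
lb_ansi = Blob(ls_unicode, EncodingANSI!)
```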