(Asp.Net MVC3) How to Define StringLength for ASCII and Unicode?
05-03-2021
Question
I have the following definition in my model:
[Required]
[StringLength(100, MinimumLength = 10)]
[DataType(DataType.Text)]
[Display(Name = "XXX")]
public string XXX { get; set; }
Now I want it to treat ASCII and Unicode input differently: each ASCII char should count as length 1, so the minimum length is 10 and the maximum is 50; each Unicode (non-ASCII) char should count as length 2, so 5 such chars are enough to meet the minimum requirement.
How do I do it?
I guess I might need two approaches: first override the length check on the ASP.NET side, then override the length check in jQuery. Is that right?
Does anyone have a working sample? Thanks.
Solution
To do what you want, you should be able to introduce a custom validation attribute:
using System;
using System.ComponentModel.DataAnnotations;
using System.Linq;

[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field)]
public class FooAttribute : ValidationAttribute
{
    private readonly int minLength, maxLength;

    public FooAttribute(int minLength, int maxLength)
        : this(minLength, maxLength, "Invalid ASCII/Unicode string-length") {}

    public FooAttribute(int minLength, int maxLength, string errorMessage)
        : base(errorMessage)
    {
        this.minLength = minLength;
        this.maxLength = maxLength;
    }

    protected override ValidationResult IsValid(object value, ValidationContext ctx)
    {
        // Leave null handling to [Required]
        if (value == null) return ValidationResult.Success;

        var s = value as string;
        if (s == null) return new ValidationResult("Not a string");

        // Weight each char individually: ASCII (< 128) counts 1, anything else counts 2,
        // matching the question's requirement (rather than doubling the whole length
        // whenever any non-ASCII char is present)
        int effectiveLength = s.Sum(c => c < 128 ? 1 : 2);

        if (effectiveLength < minLength || effectiveLength > maxLength)
            return new ValidationResult(FormatErrorMessage(ctx.DisplayName));

        return ValidationResult.Success;
    }
}
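To show how the attribute behaves, here is a self-contained sketch that repeats the attribute (with the per-char weighting), applies it to a model, and runs it through Validator. The Person model and the Demo class are illustrative assumptions, not part of the original question.

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;

public class FooAttribute : ValidationAttribute
{
    private readonly int minLength, maxLength;

    public FooAttribute(int minLength, int maxLength)
        : base("Invalid ASCII/Unicode string-length")
    {
        this.minLength = minLength;
        this.maxLength = maxLength;
    }

    protected override ValidationResult IsValid(object value, ValidationContext ctx)
    {
        if (value == null) return ValidationResult.Success; // [Required]'s job
        var s = (string)value;
        // ASCII chars count 1, everything else counts 2
        int effectiveLength = s.Sum(c => c < 128 ? 1 : 2);
        return effectiveLength < minLength || effectiveLength > maxLength
            ? new ValidationResult(FormatErrorMessage(ctx.DisplayName))
            : ValidationResult.Success;
    }
}

public class Person
{
    [Required]
    [Foo(10, 50)]
    public string XXX { get; set; }
}

public static class Demo
{
    public static bool Validate(object model) =>
        Validator.TryValidateObject(model, new ValidationContext(model),
            new List<ValidationResult>(), validateAllProperties: true);

    public static void Main()
    {
        Console.WriteLine(Validate(new Person { XXX = "abcdefghij" })); // 10 ASCII chars: passes
        Console.WriteLine(Validate(new Person { XXX = "abcde" }));      // effective length 5: fails
        Console.WriteLine(Validate(new Person { XXX = "日本語日本" }));  // 5 chars * 2 = 10: passes
    }
}
```

For the jQuery side of the question, the attribute would additionally need to implement MVC3's System.Web.Mvc.IClientValidatable and be paired with a matching jquery.validate adapter on the client; that part is not shown here.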
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow