
Re: [idn] URL encoding in html page



> I meant exactly what I said: ACE is a "downgrade" for machine reading.
> Downgrade here has the same meaning as when base64/qp is used to
> "downgrade" mail when 8BITMIME is not supported. It does not mean it is
> better or worse.
>
> UTF-8 is another encoding of Unicode and plainly just UTF-8. I did not
> make any comment to say UTF-8 is better or worse. You read too much into it.

UTF-8 was there in the first place to make the transition from 8-bit systems
to 16/32-bit Unicode smoother, so that existing 8-bit systems would be able
to use it. In the same way, base64 downgrades 8-bit data into ASCII to make
that transition smoother...
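
(A rough illustration of that analogy, assuming Python 3; the string and
variable names are mine, not anything from this thread:)

    # The same "downgrade" idea applied twice: UTF-8 re-expresses Unicode
    # code points as bytes that 8-bit-clean systems can carry, and base64
    # re-expresses arbitrary bytes as plain ASCII that 7-bit mail
    # transports can carry.
    import base64

    label = "例え"                        # a non-ASCII Unicode string

    utf8_bytes = label.encode("utf-8")    # Unicode -> 8-bit byte sequence
    ascii_text = base64.b64encode(utf8_bytes).decode("ascii")  # bytes -> ASCII

    print(utf8_bytes)    # -> b'\xe4\xbe\x8b\xe3\x81\x88'
    print(ascii_text)    # -> 5L6L44GI

Neither step makes the data "better" or "worse"; each just fits it into
what the older layer can carry.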

So should we say that a good design should have a fallback (downgrade) for
the TRANSITIONAL period only? Then a good IDN design should be able to use
ACE as the fallback for old systems, BUT SHOULD also be able to step forward
and use 8/16/32 bits or even more bits when needed.

So we should not say that ACE should be a long-term solution for IDN; it
should ONLY be a TRANSITION solution that allows the LONG-TERM solution of
using UTF-8 or 16/32-bit Unicode to work.
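
(A minimal sketch of that fallback idea, assuming Python 3 and its built-in
"idna" codec, i.e. the Punycode-based ACE that was eventually standardized;
the ACE proposals under discussion here differ in detail, and the function
name and flag below are purely illustrative:)

    def wire_form(label, peer_supports_utf8):
        """Return the label as it would be put on the wire.

        New systems exchange the label directly in UTF-8 (the long-term
        path); for an old ASCII-only peer, downgrade to the ACE form.
        """
        if peer_supports_utf8:
            return label.encode("utf-8")   # long-term: 8-bit clean path
        return label.encode("idna")        # transitional fallback: ACE

    print(wire_form("bücher", True))    # -> b'b\xc3\xbccher'
    print(wire_form("bücher", False))   # -> b'xn--bcher-kva'

The point is that the fallback branch can disappear once old systems are
gone, while the UTF-8 branch stays.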

> I didn't say that "UTF-128" is better than "UTF-8" is better than ACE
> either. UTF-128 is just plainly UTF-128. You jumped too quickly to a
> conclusion.
>
> If we are going to have a flag to change everything to 8-bit clean, why
> stop at 8 bits? 8 bits do not even give you enough room to support
> ISO10646. You need some encoding mechanism to translate ISO10646 to
> 8-bit, aka UTF-8.
>
> So why not make it all 128-bit (or more) clean?
More bits is good, but when we plan for more than we need, it should be
considered a waste of resources. So why do we need 128 bits now? (I don't
think the combined total of characters in all the languages of the world
would require that much, not unless we want to include scripts from other
planets : > ) We do need 8/16/32 bits for Unicode, so why not design a
system that can accept ACE as a fallback and also handle 8/16/32 bits? If
you can justify why designing a system that can handle ASCII as a fallback
and can automatically support 8/16/32-bit Unicode is not a good design,
then I think my thinking is wrong.
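
(A quick back-of-the-envelope check of the bit-width point, again in
Python 3 and purely illustrative; it uses today's code-space limit of
U+10FFFF, which postdates this thread:)

    # The Unicode / ISO 10646 code space tops out at U+10FFFF, which
    # needs only 21 bits, so 32 bits is already generous headroom and
    # 128 bits is far beyond anything a real script will require.
    max_code_point = 0x10FFFF
    print(max_code_point.bit_length())            # -> 21

    # UTF-8 itself never needs more than 4 bytes per character:
    for ch in ("a", "é", "例", "\U0002000B"):     # ASCII, Latin-1, BMP, CJK Ext B
        print(hex(ord(ch)), len(ch.encode("utf-8")), "byte(s)")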