Still on the whole trust metric thing, I wonder if an
interesting application could be network games like QuakeWorld.
For those who haven't been following,
in the ideal case,
network servers would never tell a client anything it can't
be trusted to know. But for efficiency, they'll *actually*
tell you lots of stuff, like where the other players are,
when bullets are coming from behind you and whatever else,
so that if you turn around, there's no lag while the client
fills in the blanks: it already just knows. This means
people with no skills at playing the game can write clever
clients that let them dodge bullets, and see behind them
and generally just plain cheat. Which isn't good.
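To make the tradeoff concrete, here's a rough sketch in
Python (entirely hypothetical, nothing like QuakeWorld's real
protocol; `Player', `snapshot_for' and `cheat_radar' are all
invented for illustration):

    import math
    from dataclasses import dataclass

    @dataclass
    class Player:
        name: str
        pos: tuple      # (x, y) world position
        facing: float   # view direction, in radians

    def snapshot_for(player, all_players):
        # Efficient but naive: send everyone's position to everyone,
        # even players the human couldn't possibly see yet, so there's
        # no lag when they turn around.
        return [(p.name, p.pos) for p in all_players if p is not player]

    def cheat_radar(me, snapshot):
        # A modified client needs no skill: the enemies behind you
        # are right there in the data the server volunteered.
        behind = []
        for name, (x, y) in snapshot:
            angle = math.atan2(y - me.pos[1], x - me.pos[0])
            off_axis = abs((angle - me.facing + math.pi) % (2 * math.pi) - math.pi)
            if off_axis > math.pi / 2:   # more than 90 degrees off the view axis
                behind.append(name)
        return behind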
There are at least two possible solutions. The usual
solution is obfuscation. Release binary-only clients, keep
the servers to yourself, make the on-the-wire protocol
compressed, encrypted and generally strange, and just
hope no one can be bothered working out how to
break the system. The nice theoretical one is as described
above: just treat the client as completely untrustworthy,
and only tell it things the human that's using it is allowed
to know.
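The untrusting version of the same sketch filters on the
server; `can_see' here stands in for whatever visibility test
(line of sight, field of view) the engine can manage:

    def honest_snapshot_for(player, all_players, can_see):
        # The theoretically clean version: the client is untrusted,
        # so it only ever receives state the human is entitled to
        # know right now.
        return [(p.name, p.pos) for p in all_players
                if p is not player and can_see(player, p)]

The honest server pays for its paranoia in lag every time
visibility changes, which is exactly the efficiency tradeoff
above.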
I wonder, though, if the trust metric could be useful
here: instead of certifying free-software gurus, you
certify something more akin to `honesty'.
Once you've got a bunch of people certified,
rather than trying to certify binaries, you can
establish trust pretty easily. After all, no person
has to give their secret identification stuff to anyone else
(thanks to the wonders of digital signatures and public key
cryptography), unlike binaries whose "secret"
identification stuff has to be available to everyone who has
a copy of that binary.
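A minimal sketch, assuming Ed25519 keys via the Python
`cryptography' package (any signature scheme would do; the
point is just that the private key never has to leave the
player's machine):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )
    from cryptography.exceptions import InvalidSignature

    # The player generates a key pair once; only the public half is
    # ever published (and certified via the trust metric).
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # The client signs what it sends; the secret stays at home.
    message = b"move: x=3 y=7"
    signature = private_key.sign(message)

    # The server checks against the certified public key. A binary's
    # embedded "secret" can be dug out by anyone with a copy of it;
    # a private key held only by the human can't.
    try:
        public_key.verify(signature, message)
        print("really from the certified player")
    except InvalidSignature:
        print("forged or tampered with")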
Possibly linear growth (the size of an attack scaling with
the number of clueless certifiers) is still too troublesome, though. Some
method of negative feedback may be necessary here, which in
turn would probably require more granularity than just
`Honest' and `Not honest'. Still, it could be a novel
solution to a fairly tricky issue as far as free games go.
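As for what more granularity plus negative feedback might
look like, maybe something like this (a purely hypothetical
update rule, not the real trust metric; the scores, weights
and scale are all invented):

    def update_trust(current, reports, weight=0.1):
        # Each report is a score in [-1.0, 1.0] (caught cheating is
        # -1.0, a clean game +1.0), weighted by the reporter's own
        # trust, nudging the target up or down rather than flipping
        # a binary honest/not-honest bit.
        for reporter_trust, score in reports:
            current += weight * reporter_trust * score
        return max(-1.0, min(1.0, current))

    # A few clean games from well-trusted reporters raise trust slowly...
    t = update_trust(0.2, [(0.9, 1.0), (0.8, 1.0)])
    # ...but one cheating report from a trusted peer drags it back down.
    t = update_trust(t, [(0.9, -1.0)])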