It's possible to make digital signatures using a secure hash as the only primitive. This is very reassuring, since the sheer number of moving parts in modular-group-based signatures is huge.
The basic trick is to publish n pairs of hashes, where n is the number of bits in the secure hash. To sign something, you publish the pre-hash of one member of each pair, picking which one based on the corresponding bit of the secure hash to be signed.
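A toy sketch of this one-time scheme, assuming SHA-256 as the secure hash (my own illustrative code, not a vetted implementation):

```python
import hashlib
import os

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

N = 256  # bits in the secure hash

def keygen():
    # private key: n pairs of random pre-hashes
    priv = [(os.urandom(32), os.urandom(32)) for _ in range(N)]
    # public key: the hash of each pre-hash
    pub = [(H(a), H(b)) for a, b in priv]
    return priv, pub

def hash_bits(message: bytes):
    digest = H(message)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(N)]

def sign(priv, message: bytes):
    # reveal one pre-hash per pair, chosen by each bit of the digest
    return [priv[i][b] for i, b in enumerate(hash_bits(message))]

def verify(pub, message: bytes, sig) -> bool:
    return all(H(sig[i]) == pub[i][b]
               for i, b in enumerate(hash_bits(message)))
```

Revealing pre-hashes for two different messages would leak both members of some pairs, which is exactly why this key can only be used once.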
This technique can only sign one value, but can be improved considerably.
A straightforward improvement is to publish n + log(n) pre-hashes, and sign using a subset of exactly half of them, of which there are about 2^n. This doesn't change the number of things which can be signed any, but does make the public keys and signatures smaller.
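The step that needs care here is mapping an n-bit hash to one of the roughly 2^n half-size subsets. One standard way to do that is the combinatorial number system; a sketch for n = 256, so m = 264 published pre-hashes (names are mine):

```python
from math import comb

def index_to_subset(x: int, m: int, k: int):
    """Map integer x in [0, comb(m, k)) to the x-th k-element
    subset of range(m), in lexicographic order."""
    assert 0 <= x < comb(m, k)
    subset = []
    for pos in range(m):
        if k == 0:
            break
        # count the remaining subsets that include position pos
        c = comb(m - pos - 1, k - 1)
        if x < c:
            subset.append(pos)
            k -= 1
        else:
            x -= c
    return subset
```

Since comb(264, 132) is a bit over 2^259, every 256-bit hash value gets its own subset of 132 pre-hashes to reveal.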
More things can be signed by making the public key be the root of a hash tree whose leaves are one-time public keys. This is fairly computationally intensive on the part of the signer - the entire tree has to be generated up front - but it does allow multiple things to be signed, and doesn't increase the size of the signature much. A new restriction is that each signature carries a number, whose value isn't significant other than that if the same number is ever used twice the system becomes insecure.
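A sketch of the hash tree part, assuming SHA-256 and a power-of-two number of leaves (the leaf index is the number that must never repeat; function names are illustrative):

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [H(leaf) for leaf in leaves]
    while len(level) > 1:
        level = [H(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    """Sibling hashes from leaf `index` up to the root; this is
    what gets attached to a signature."""
    level = [H(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])  # sibling at this level
        level = [H(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify_path(leaf, index, path, root) -> bool:
    node = H(leaf)
    for sibling in path:
        if index % 2 == 0:
            node = H(node + sibling)
        else:
            node = H(sibling + node)
        index //= 2
    return node == root
```

The verifier only ever needs the root and a log-length path, which is why the signature doesn't grow much.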
Even more things can be signed by making the first value signed be another public key. This can be repeated for any number of iterations, and the total number of signatures which can be made equals the number which can be signed at each iteration raised to the power of the number of iterations. The algorithm for computing the later public keys must be deterministic based on the private key, to keep different public keys from getting signed by a single intermediate key.
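The deterministic derivation can be as simple as keying a PRF with the master private key and the iteration number, then stretching that seed into the pre-hashes of the next key pair. A sketch assuming HMAC-SHA256 as the PRF (names are mine):

```python
import hashlib
import hmac

def derive_seed(master: bytes, iteration: int) -> bytes:
    """Deterministically derive the seed for the iteration-th key pair,
    so re-deriving always certifies the same next public key."""
    return hmac.new(master, iteration.to_bytes(8, "big"),
                    hashlib.sha256).digest()

def expand(seed: bytes, count: int):
    # stretch one seed into `count` pseudorandom 32-byte pre-hashes
    return [hashlib.sha256(seed + i.to_bytes(4, "big")).digest()
            for i in range(count)]
```

Because the derivation is a pure function of the master key, an intermediate one-time key can never end up certifying two different successors.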
Put together, those tricks result in performance which is quite practical for many applications. A completely excessive safety margin results in 20-kilobyte signatures. I have a functioning implementation, which unfortunately has not, to my knowledge, been vetted by anyone, but that should be straightforward to do.
Unfortunately the best known hash-based public key encryption algorithms are still along the lines of Merkle puzzles, and require bandwidth proportional to the square root of the computational ability they're trying to defend against. Bloom filters can reduce that by a constant factor of about three, but it's still very far from practical. It would be nice to have a complete crypto toolset which didn't use any of that modular exponentiation nobody really trusts.
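To make the bandwidth problem concrete, here is a toy version of Merkle puzzles with a deliberately tiny keyspace (parameters and names are illustrative only, nowhere near secure): the sender ships n puzzles, the receiver brute-forces one, and an eavesdropper has to brute-force about n/2 of them.

```python
import hashlib
import os
import secrets

def stream(key: bytes, length: int) -> bytes:
    # hash-based keystream for the toy "encryption" inside each puzzle
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

PUZZLE_KEYSPACE = 2 ** 12  # toy difficulty
MAGIC = b"PUZL"            # marker so a solver knows it succeeded

def make_puzzles(n):
    """Sender: n puzzles, each hiding (id, session key) under a weak key."""
    table = {}
    puzzles = []
    for pid in range(n):
        session_key = os.urandom(16)
        table[pid] = session_key
        weak = secrets.randbelow(PUZZLE_KEYSPACE).to_bytes(2, "big")
        plaintext = MAGIC + pid.to_bytes(4, "big") + session_key
        puzzles.append(xor(plaintext, stream(weak, len(plaintext))))
    return puzzles, table

def solve_one(puzzles):
    """Receiver: brute-force one random puzzle over the weak keyspace,
    then announce the recovered id in the clear."""
    puzzle = secrets.choice(puzzles)
    for k in range(PUZZLE_KEYSPACE):
        plain = xor(puzzle, stream(k.to_bytes(2, "big"), len(puzzle)))
        if plain.startswith(MAGIC):
            pid = int.from_bytes(plain[4:8], "big")
            return pid, plain[8:]
    raise RuntimeError("no well-formed puzzle found")
```

The quadratic gap is visible in the parameters: with n puzzles each costing about n work to crack, the legitimate parties do about n work while the attacker does about n^2, so defending against work W costs bandwidth around sqrt(W).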