Older blog entries for tgw (starting at number 4)

Two days ago I attended an "Informal Roundtable Discussion" with the title "The Internet in Power - Networked Governance or Virtual Disconnect?". It was facilitated by Steven Clift, the moderator of the 1400-subscriber Democracies Online Newswire. We used facilities provided by the Center for Democracy and Technology. There were 23 people in the room and 2 more who teleconferenced in.

It was a good event - a new group of people that I hadn't been around before. Several people there had previously worked on Capitol Hill (for the US Congress), one gentleman was from the White House, and I sat next to Owen Ambur from the still-forming XML.gov. There were also people from various other groups, universities, and consulting companies.

The gentleman from the White House (I didn't catch his name) spoke about how no government agency is responsible for creating e-government solutions. A lot of agencies have partial responsibility, but there is no person or organization in the US government to act as a "hub" for the whole government's e-government initiatives. Another gentleman suggested that this would be the job of a government-wide CIO.

A few minutes later I was able to speak up and tell them that they were describing TDP. Among other things, TDP is intended to be exactly what the man from the White House described - a "hub" to facilitate the creation of e-democracy and e-government Open/Free software. I said that it doesn't make sense for 50 state governments, plus however many provincial, national, continental, and local governments, to all build basically the same pieces of software from scratch. Everyone is - very inefficiently - re-inventing the wheel. It makes more sense to create one application with 95% of the functionality needed by everyone, and then have each government add its own 5% of customized functionality. This makes much more sense economically, and in other ways too.

I stayed after and was able to have good discussions with Steve Clift and Owen Ambur. I had wanted to speak with some of the others, but they got out the door before I was able to.

I just finished writing an article on VoteAuction.com & The Whack-A-Mole Defense. I also put together some quotes and links explaining the Whack-A-Mole technique. Been up all night working on this. It's 8:00am, time to get some sleep.

Last week I attended an e-Voting Workshop here in Washington DC. It was sponsored by the Internet Policy Institute and featured a panel with quite a lot of big guns (big credentials) on it. Big credentials don't impress me much; competence and contribution do. This panel, however, produced some excellent dialog on the topics of e-voting and Internet voting.

Last week I was able to attend a meeting on Capitol Hill hosted by the Congressional Internet Caucus Advisory Committee. It was about Internet Voting. The panel consisted of Gary McIntosh of the National Association of State Election Directors, Jim Adler of VoteHere.net, Marc Strama of Election.com, Tony Wilhelm of the Benton Foundation, and Deborah Phillips of Voting Integrity Project. Dr. Lorrie Cranor of AT&T Labs was the moderator.

Just like most of the panel discussions that I've been to over the past nine months, this one was an introductory-level look at Internet Voting. Information-wise, they tend to be pretty worthless for people like me who work with this stuff every day. However, they're a great place to network and meet new people who work in the same problem space.

A good thing about last week's meeting, in particular, is that it raised the level of visibility and understanding of Internet Voting among the 120 or so Congressional staffers who were in attendance. The ironic thing about it is that elections in the United States are controlled at the state and local level. So, there's a very limited amount that Congress can do when it comes to Internet Voting.

The main thing Congress can do is create permanent, ongoing funding for the FEC to keep pace with the rapidly changing nature of current technology and update its Voting Systems Standards (VSS) on a yearly basis. Currently, the FEC has to seek out special funding to update the VSS - so it happens very infrequently. Too infrequently. The original FEC VSS was completed in 1990. Only now, 10 years later, is it being updated. The 1990 standards are pretty useless when you try to apply them to modern client/server and Internet-based voting systems. Congress definitely needs to create ongoing funding for this.

9 Sep 2000 (updated 18 Sep 2000 at 17:20 UTC) »

I just completed an article entitled Introduction to Open Source and Free Software. It is the most comprehensive introduction to Open Source, Free Software, and the differences between them that I am aware of.

The article explains the key terms, definitions, people, organizations, and acronyms of Open Source and Free Software in a way which both non-technical and technical people can understand. Plus it offers an explanation of "What is Open Source and Free Software?", "Why does this matter?", and "Does this approach really work?". There are also plenty of links out to other sites for readers to dig deeper on the topics of interest to them.

The article is scheduled to be published in the September 2000 issue of The Bell newsletter. It is also available online at www.technodemocracy.org/people/tgw/docs/ossfs.html.
