Archive for April, 2010

[berkman] [2b2k] Beth Noveck on White House open government initiatives

Beth Noveck is deputy chief technology officer for open government and leads President Obama’s Open Government Initiative. She is giving a talk at Harvard. She begins by pointing to the citizenry’s lack of faith in government. Without participation, citizens become increasingly alienated, she says. For example: the rise of Tea Parties. A new study says that a civic spirit reduces crime. Another article, in Social Science and Medicine, correlates civic structures and health. She wants to create more opportunities for citizens to engage and for government to engage in civic structures — a “DoSomething.gov,” as she lightly calls it. [NOTE: Liveblogging. Getting things wrong. Missing things. Substituting inelegant partial phrases for Beth's well-formed complete sentences. This is not a reliable report.]

Beth points to the peer to patent project she initiated before she joined the government. It enlists volunteer scientists and engineers to research patent applications, to help a system that is seriously backlogged, and that uses examiners who are not necessarily expert in the areas they’re examining. This crowd-sources patent applications. The Patent Office is studying how to adopt peer to patent. Beth wants to see more of this, to connect scientists and others to the people who make policy decisions. How do we adapt peer to patent more broadly, she asks. How do we do this in a culture that prizes consistency of procedures?

This is not about increasing direct democracy or deliberative democracy, she says. The admin hasn’t used more polls, etc., because the admin is trying to focus on action, not talk. The aim is to figure out ways to increase collaborative work. Next week there’s a White House conference on gov’t innovation, focusing on open grant-making and prize-based innovation.

The President’s first executive action was to issue a memorandum on transparency and open gov’t. This was very important, Beth says, because it let the open gov folks in the administration say, “The President says…” President Obama is very committed to this agenda, she says; after all, he is a community organizer in his roots. Simple things like setting up a blog with comments were big steps. It’s about changing the culture. Now, there’s a culture of “leaning forward,” i.e., making commitments to being innovative about how they work. In Dec., every agency was told to come up with its own open govt plan. A directive set a road map: How and when will you inventory all the data in your agency and put it online in raw, machine-readable form? How will you engage people in meaningful policy work? How will you engage in collaboration within govt and with citizens? On Tuesday, the White House collected self-evaluations, which are then evaluated by Beth’s office and by citizen groups.

How to get there. First, through people. Every agency has someone responsible for open govt. The DoT has 200+ on their open govt committee. Second, through platforms (which, as she says, is Tim O’Reilly’s mantra). E.g., data.gov is a platform.

Transparency is going well, she thinks: White House visitor logs, streaming the health care summit, publishing White House employee salaries. More important is data.gov. 64M hits in under a year. Pew says 40% of respondents have been there. 89M hits on the IT dashboard that puts a user-friendlier interface to govt spending. Agencies are required to put up “high value” data that helps them achieve their core mission. E.g., Dept. of Labor has released 15 yrs of data about workplace exposure to toxic chemicals, advancing its goal of saving workers’ lives. Medicare data helps us understand health care. USDA nutrition data + a campaign to create video games to change the eating habits of the young. Agencies are supposed to ask the public which data they want to see first, in part as a way of spurring participation.

To spur participation, the GSA has been procuring govt-friendly terms of service for social media platforms; they’re available at apps.gov. It’s now trying to acquire innovation prize platforms, etc.

Participation and collaboration are different things, she says. Participation is a known term that has to do with citizens talking with govt. But the exciting new frontier, she says, is about putting problems out to the public for collaborative solving. E.g., Veterans Benefits Admin asked its 19,000 employees how to shorten wait times; within the first week of a brainstorming competition, 7,000 employees signed up and generated 3,000 ideas, the top ten of which are being implemented. E.g., the Army wikified the Army operations manual.

It’s also about connecting the public and private. E.g., the National Archives is making the Federal Register available for free (instead of for $17K/yr), and the Princeton Internet center has made an annotatable version. Carl Malamud has worked on this as well. The private sector has announced National Lab Day, to get scientists out into the schools. Two million people signed up.

She says they know they have a lot to do. E.g., agencies are sitting on exabytes of info, some of which is on paper. Expert networking: We have got to learn how to improve upon the model of federal advisory commissions, which draw on the same group of 20 people. It’s not as effective as a peer to patent model, which pools volunteers from millions of people. And we don’t have much experience using collaboration tools in govt. There is a recognition spreading throughout the govt that we are not the only experts, that there are networks of experts across the country and outside of govt. But ultimately, she says, this is about restoring trust in govt.

Q: Any strategies for developing tools for collaborative development of policy?
A: Brainstorming techniques have been taken up quickly. Thirty agencies are involved in thinking about this. It’s not about the tools, but thinking about the practices. On the other hand, we used this tool with the public to develop open govt plans, but it wasn’t promoted enough; it’s not the tools but the processes. Beth’s office acts as an internal consultancy, but people are learning from one another. This started with the President making a statement, modeling it in the White House, making the tools available…It’s a process of creating a culture and then the vehicles for sharing.

Q: Who winnowed the Veterans agency’s 3,000 suggestions?
A: The VA ideas were generated in local offices and got passed up. In more open processes, they require registration. They’ve used public thumbs up and down, with a flag for “off topic” that would shrink the posting just to one link; the White House lawyers decided that that was acceptable so long as the public was doing the rating. So the UFO and “birther” comments got rated down. They used a wiki tool (MixedInk) so the public could write policy drafts; that wiki let users vote on changes. When there are projects with millions of responses, it will be very hard; it makes more sense to proliferate opportunities for smaller levels of participation.

A: We’re crowd-sourcing expertise. In peer to patent, we’re not asking people if they like the patent or think it should be patented; we’re asking if they have info that is relevant. We are looking for factual info, recognizing that even that info is value-laden. We’re not asking about what people feel, at least initially. It’s not about fostering contentious debate, but about informed conversation.

Q: What do you learn from countries that are ahead of the curve on e-democ, e.g., Estonia? Estonia learned 8 yrs ago that you have to ask people to register in online conversations…
A: Great point. We’re now getting up from our desks for the first time. We’re meeting with the Dutch, Norway, Estonia, etc. And a lot of what we do is based on Al Gore’s reinventing govt work. There’s a movement spreading particularly on transparency and data.gov.

Q: Is transparency always a good approach? Are there fields where you want to keep the public out so you can talk without being criticized?
A: Yes. We have to be careful of personal privacy and national security. Data sets are reviewed for both before they go up on data.gov. I’d rather err on the side of transparency and openness to get us over the hump of sharing what they should be sharing. There’s value in closed-door brainstorming so you can float dumb ideas. We’re trying to foster a culture of experimentation and fearlessness.

[I think it's incredible that we have people like Beth in the White House working on open government. Amazing.]


[2b2k] Facts and networked facts

Harry Lewis [blog], one of my faves and someone who does not put up with any of my guff, had me in as a guest lecturer in one of his courses today. We talked about knowledge on the Net, and, in particular, whether the Net is leading us to flock with others who are like us, thus making us stupider and more extreme, rather than smarter and more open. It’s hard to know what the data actually are about this; Harry, who worries that the Net is just enabling us to confirm our ignorances, nevertheless pointed us to the David Brooks column that references some more optimistic studies. But, as I think Harry agrees, this is an area where the meaning of such studies is up for grabs — ironically, if we cite the studies that confirm our beliefs (which, btw, is the opposite of what Harry was doing), and ironically with a double salchow in light of what I’m about to say about facts.

This discussion was quite useful for me. I’m writing the last section of the chapter on facts. The echo chamber argument (i.e., we flock with similar birds and chirp our way into stupidity) often expresses a nostalgia for the Enlightenment, which includes, in the modern era, a belief that knowledge rests on a bedrock of facts. Facts are bedrock because they cannot be disputed. Facts, after all, straddle the line between the world and our knowledge of the world: They are what are knowable about the world. They are what makes a true statement true. They are not dependent on our knowledge (they are true whether or not we know them), but they enable our knowledge. Because facts are facts regardless of whether any one of us recognizes them, they are true for everyone. Thus: Bedrock because they are independent of us, and bedrock because they are nonetheless knowable.

So, this makes a big stinking problem for the book, for a few reasons.

First, I don’t want to be dealing with this question. It’s too hard. This was supposed to be a relatively easy book about expertise and knowledge, and now I’m smack up against big questions that are way way past my pay grade.

Second, I think the metaphysics in which the “facts are bedrock” argument is embedded is a misguided metaphysics. I fully believe that facts do not depend on us, and that facts are just one (particularly useful) “mode of discourse” — one way the world shows itself to us if we ask about it in a particular way. The Enlightenment set-up of the problem doesn’t let us have our fact-based cake and eat it too, which is what’s required. But I don’t want to deal with metaphysics (see point #1 immediately above). So, I’m thinking about talking in the book about “networked facts” that include their links and context, for facts are always (?) taken up in context, and once taken up by us, they no longer serve as a self-sufficient bedrock, because you take them up one way and I take them up another. Facts in a networked world always (= almost always, often, can) point back into the source from which they emerge and ahead into the social stew that makes sense (= tries to make sense, pretends to make sense, makes no sense) of them. (I do want to make sure that the reader doesn’t feel let off the hook when it comes to facts; facts matter.)

Third, I realized after the class that I’m right back in the topic of my doctoral dissertation of 30+ years ago, which was about Heidegger’s ontology of things (= material objects, roughly). My question then was how do we make sense of phenomena that show themselves in our experience as being beyond our experience. Apparently, I still don’t know.


[2b2k] What I would have said at Nature

I had just finished a draft of the informal talk I was scheduled to give at Nature in London when I heard that our flight had been canceled. I’m very disappointed because getting to talk with Nature folks about what the Web is doing to knowledge is a pretty great opportunity for me to learn from very thoughtful people on the front line. Also, I was looking forward to seeing my friend Timo Hannay there. Not to mention some unNatural folks we were looking forward to having a meal with. Anyway, this is what I planned on saying in my brief conversation-opener.

I was going to begin with laying out the issue that Too Big to Know seems to be addressing these days: Now that — thanks to the Web — we can’t know anything without simultaneously knowing that there is waaaaay too much for us to ever know, knowledge and knowing are changing. The old strategy of reducing knowledge to what our medium (paper) can handle is being replaced by new strategies appropriate for the inclusive nature of knowledge in a medium built out of links. (Links are about including more and more; books are about excluding everything except what really really counts.)

After a lot of failed outlines for the talk, I had decided on narrowing my focus (oh, the irony of having to narrow one’s focus in a talk about the extravagance of knowledge!) to two changes to the nature of expertise and knowledge, both based on the assumption/presumption that expertise is becoming networked and is thus taking on properties of networks.

First, transparency (although that probably isn’t the best word to sum up this point). I wanted to say something broad and vague about a change from thinking of knowledge as a reflection of the world, known to be true because the method that derived it is repeatable. (This applies to scientific knowledge, but I was going to be talking to Nature after all.) Of course, few experiments are actually repeated; if they all were, we’d cut the pace of science in half. But, designing an experiment as if it were to be repeated creates a useful methodology for scientific work. We live in a post-Bacon world, however. After Watson and Crick, after Kuhn, after philosophers such as Latour, we no longer think of science as merely a neutral mirror, invisible in itself. Now the Web is changing the topology of science. Science will still use repeatable methodologies, but authority is increasingly coming from the social world in which the work is embedded. Indeed, we can now see how the work is appropriated by others, which used to be pretty much a black box. We can thus see the value of the work, whereas before much of that value was hidden. We can also see distressingly how works are misappropriated and rejected by their culture. This is a type of transparency to and fro: From the scientific work to the world, and forward from the work into the culture. This is a 180 degree turn from the old regime that viewed authority as a stopping point for inquiry; links are continuations, not stopping points.

Second, I was going to point to networked knowledge taking on the Net’s embrace of differences: nothing goes uncontroverted on the Net. On the Net, every statement has an equal and opposite reaction. Something like that. There are some good consequences of this. Lots more views get aired. We are filling out the ecology of knowledge, from well-vetted bastions such as Nature to unvetted venues like arXiv.org. But we also don’t really know what to do with the fact of difference. Some groups rule out of discourse those who disagree too fundamentally. In fact, we all do, and I can see the sense in that; do we have to include Creationists in every evolutionary science conference? But denying the legitimacy of difference also has a cost. We don’t have the metaphysics and possibly the genetic neural set-up to deal with fundamental differences. So, I don’t know what comes of this.

But now I don’t have to because Odin blew up a mountain in Iceland and my trip to London has been scrubbed.

(I’m blithe about the volcano because I basically have no discretionary Internet access, so I’m just assuming there weren’t any deaths or major destruction caused by the eruption. I do realize that some things are more important than my travel plans.)


[2b2k] Mr. Denham’s defense of child chimney sweeps

From the summary of the remarks in 1819 of a Mr. Denham during the British House of Commons debate of a bill that would have limited the use of young boys as chimney sweeps — as young as four years old, stuck into chimneys 7″ square for up to six hours at a time [source]. How many modern arguments can you spot?

If chimneys could be swept by machinery cheaper and better than by boys, he could not conceive that the people of this country were so attached to cruel treatment merely because it was cruel, as to continue to sweep with children, when it would be better to sweep with machinery. If, as had been stated by an hon. gentleman, on the authority of the fire-offices, that machinery was safer and better, he should think it was quite enough to state this to the public in order to induce them to adopt machinery. When he found in this bill a series of clauses, empowering a single justice to convict on the evidence of a single witness, and the functions of a jury superseded, he could not help viewing it as extremely objectionable. He must see a strong case of necessity made out before he could vote for such a measure. … [H]e thought the good sense of the public was sufficient to correct the evil without loading the statute book with another penal law, every penal law being in his opinion a great evil. … It might be proper that children of tender age, either with their parents or as parish paupers, ought not to be bound out to this employment; but he thought that parents might in general be trusted with the guardianship of their own children; and he submitted, whether it would not be better that they should be employed in sweeping chimneys than in idleness, in the workhouse, or in the fraud and pilfering which was now so common among boys of tender age. With respect to the convictions for breach of covenant before a magistrate, he could not see why this, like any other covenant, should not go before a jury. He did not wish to give such enormous powers to magistrates…


[2b2k] The book of the future has arrived!

Yikes. All this talk about the future of books and the future of ebooks. Will it be like the Kindle? Will it be like the iPad?

The book of the future is already here. It’s been here for about 15 years. It’s called The Web.

That’s taking books as the medium by which we develop, preserve, and communicate ideas and knowledge. The Web is already that book — distributed, linked, messy, unstable, self-contradictory, bottom up and top down, never done, unsettled and unsettling, by us and of us. The book of the future has a trillion pages and trillions of links, and is only getting started.

If, however, you mean by “book” a bounded stretch of an authorial monologue, we have plenty of those on the Web, and some have great value. But they now get their value by being linked into the roiling universe of their peers.

The book of the future isn’t on the Web; it is the Web.
