Saturday, July 11, 2015

Create an integrated Open Gov/Open Data application for California and beyond

Having tried since 1994 to bring government up to the highest levels of computerized operation, my latest efforts revolve around legislation now pending in the California Legislature to partially rationalize and integrate the data collection, processing, storage, and release operations of the state and local governments.

            Now pending in Sacramento are bills that would require local governments to make their collected data available to users in formats that can be easily understood (SB 169); require local government entities to prepare an inventory of their “enterprise” data and the IT systems that support them (SB 272); and mandate a transition from neighborhood voting stations to an all-mail ballot model complemented by drop-off boxes and vote centers connected to the VoteCal computerized statewide registration database (SB 450).
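            To make the SB 272 idea concrete, here is a rough sketch of what one machine-readable entry in an enterprise-system inventory might look like. The field names, agency, and vendor are my own placeholders, not anything the bill actually prescribes:

```python
import json

# Hypothetical catalog entry for one local-government IT system.
# All names and values below are illustrative assumptions.
entry = {
    "agency": "City of Example",
    "system_name": "Permit Tracking System",
    "vendor": "ExampleSoft",  # hypothetical vendor name
    "purpose": "Tracks building permit applications",
    "data_categories": ["permits", "inspections"],
    "update_frequency": "daily",
}

# Serializing to JSON makes the inventory easy for both people
# and programs to read -- the "easily understood" formats SB 169 asks for.
print(json.dumps(entry, indent=2))
```

            Publishing such entries in a common, documented format is what would let a statewide catalog aggregate them automatically.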

            Within these bills is the germ of an idea:  that local and state governments and bureaucracies can integrate their data systems into a fully-functioning, cloud-based, 21st century network that will allow easy use by government officials and ordinary users.

            The movement to gain easy public access to government records goes under the name “Open Gov.”  California could take the national lead on this issue by passing these bills and then moving rapidly to design and implement a comprehensive data management system: one that would connect all government computers in the state in a single interoperable network, structured to give public users easy access to the data it hosts.

            The long time it has taken to get the VoteCal system up and running testifies to the difficulties in bringing government up to contemporary data processing standards.  But the time and money that would need to be invested in building an integrated state data system would be amply repaid by subsequent efficiencies, cost savings, and improved performance based on the ability to access and analyze government data in real time.

Parsing the Law and Making it Intelligible

            Judicata is a legal analytics start-up whose efforts, according to its CEO, “are around parsing the core information that is in the case law and then building tools around that.”

            Adding the data analytics capabilities of Judicata and similar companies to a corpus of data accessible via the Web from all California state and local governments, as well as from legislative monitoring sites like LegiScan, would allow for the creation of a system that could read, parse, and semantically understand the law, and then answer questions about existing law, pending bills, and government policies and regulations in plain language.
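            As a rough illustration of the simplest building block of such a system, the short sketch below matches a plain-language question to the most relevant bill by counting shared words. The bill summaries are my own paraphrases, not official digests, and a real system would of course use far more sophisticated semantic analysis:

```python
import re

# Hypothetical mini-corpus: my own one-line paraphrases of the bills.
CORPUS = {
    "SB 169": "Requires local governments to release collected data "
              "in formats that can be easily understood.",
    "SB 272": "Requires local agencies to inventory their enterprise "
              "data and IT systems.",
    "SB 450": "Transitions neighborhood voting stations to an all-mail "
              "ballot model with drop-off boxes and vote centers.",
}

def tokenize(text):
    """Lowercase a string and split it into alphabetic words."""
    return re.findall(r"[a-z]+", text.lower())

def best_match(question):
    """Return the bill whose summary shares the most words with the question."""
    q = set(tokenize(question))
    scores = {bill: len(q & set(tokenize(summary)))
              for bill, summary in CORPUS.items()}
    return max(scores, key=scores.get)

print(best_match("Which bill covers mail ballots and vote centers?"))  # SB 450
```

            Even this crude word-overlap approach routes questions to the right bill; the hard part, which firms like Judicata work on, is genuinely understanding the legal text behind the match.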

            For more about the advantages of having computers parse and explain law in plain language, see “Is it good enough for the law to be written for lawyers?”

            Parts of the federal government, like the National Security Agency, know a lot about hacking computer systems.  Others, like the Office of Personnel Management, apparently not so much.  Why not assign experts from this government spy agency, whose stock-in-trade is violating personal privacy in the search for potentially useful data, to instruct and advise those in the government whose job it is, inter alia, to protect private and personal information from hostile attacks?  After all, it takes a thief…

            Meanwhile, creating an integrated network of government data centers, whose data could be queried by ordinary users and mined by powerful machine learning systems to surface actionable insights, would be a worthwhile use of government funds and technical resources, yielding greater governmental efficiency, better outcomes for everyone, and new vistas for Open Government and Open Data.

            Passing SB 169, SB 272, and SB 450 could be the first steps on a swift path to this future.
