
Event Recap: How Big Data is Driving the Next Wave of E-Government Innovation

by Sue He

Last week the Center for Data Innovation held a panel discussion on “How Big Data is Driving the Next Wave of E-Government Innovation” to discuss the use of data in government to improve services, operations, and responsiveness. The panel featured Ariel Gold, program manager in the global public sector at Amazon; Taha Kass-Hout, chief health informatics officer at the U.S. Food and Drug Administration (FDA); Eric Knutson, senior product marketing manager at Sitecore; Peter Schmidt, senior director at Oracle; and Dean Silverman, senior advisor to the commissioner of the office of compliance analytics at the Internal Revenue Service (IRS).

To open the conversation, the panelists discussed early wins and current directions for big data in government. Gold pointed to recovery.gov, which tracks the activities of the Recovery Accountability and Transparency Board and Hurricane Sandy funds, as a success for open data and big data analytics. Kass-Hout described the role of big data at the FDA, where counterfeit ingredients in drugs are identified, in part, by monitoring online sources such as news and social media platforms in multiple languages, as well as the prices of ingredients and their alternatives. Around 80 percent of the ingredients in drugs made in the United States come from outside the country, making it crucial to monitor and identify points in the drug supply chain where counterfeit ingredients may enter.

Emphasizing the utility of social media in big data, Schmidt discussed how social media, health, and weather data can help schools decide if and when to close. For example, in the face of a flu outbreak, a school could close proactively to prevent a larger number of absences later. Looking outside the United States, Knutson cited the United Arab Emirates’ e-government services, which successfully established a portal where agencies can create new online services for the public. Next year the country hopes to establish a system that gives citizens a single login for all government applications.

Silverman called tax administration the “epitome” of big data, with over 140 million returns filed by individuals and 40 million by businesses. Refund fraud is a major challenge for the IRS, as even small errors can be very costly, but the agency has had notable successes: over the last three years, it has saved $2 billion by reducing improper payments. He stressed that big data is crucial for anomaly detection because it reduces the burden on taxpayers, allows decisions to be made quickly, and provides greater certainty.

The panel also discussed the challenges that agencies need to address to take advantage of big data. Schmidt argued that although some patients are reluctant to share personal health information, new capabilities to de-identify data mitigate most of their concerns, and a significant amount of good could come from better health data sharing. Gold said that it is crucial to home in on a specific problem in order to bring together the necessary tools and communities to tackle the project, akin to the collaborative 1,000 Genomes Project. Kass-Hout agreed, adding that “crowdsourcing” (working with the public, researchers, innovators, and academia) is necessary. Silverman touched on budget issues, noting that the introduction of new technologies is constrained by a flat budget; he cited fraud detection in particular, an area where sophisticated tools require investment. He also mentioned the need for government to take a “test and learn” approach: creating new data through A/B testing, a commonplace practice in the private sector in which new datasets are generated by testing an option against an alternative for further analysis.
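The “test and learn” idea Silverman describes can be sketched as a simple A/B experiment. The sketch below is purely illustrative; the scenario (two notice wordings), group sizes, response rates, and function names are assumptions for demonstration, not anything presented at the panel:

```python
import random

# Illustrative "test and learn" sketch: compare a current notice wording
# (variant A) against a new one (variant B) by sending each to a random
# group and measuring response rates. All numbers here are hypothetical.
random.seed(42)  # fixed seed so the simulation is repeatable

def simulate_response(rate: float) -> bool:
    """Simulate whether a single recipient responds to a notice."""
    return random.random() < rate

def run_ab_test(n_per_group: int, rate_a: float, rate_b: float):
    """Run both variants on equal-sized groups and return observed rates."""
    responses_a = sum(simulate_response(rate_a) for _ in range(n_per_group))
    responses_b = sum(simulate_response(rate_b) for _ in range(n_per_group))
    return responses_a / n_per_group, responses_b / n_per_group

observed_a, observed_b = run_ab_test(10_000, rate_a=0.10, rate_b=0.12)
print(f"Variant A response rate: {observed_a:.3f}")
print(f"Variant B response rate: {observed_b:.3f}")
```

The resulting dataset of observed response rates is exactly the kind of “new data” the approach generates: rather than guessing which option works better, the agency measures it directly and analyzes the difference before rolling anything out broadly.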

Finally, the conversation ended by looking forward to opportunities for big data in the public sector. Gold began by saying that, given the pace of innovation, we cannot necessarily predict the best uses of big data; instead, it comes down to the mission and vision of government agencies to bring together the ecosystem of data and users on a platform that enables innovation within a community. Kass-Hout stressed the advent of personalized medicine that takes into account environmental, genetic, and lifestyle factors. Schmidt also focused on healthcare from a cost standpoint, discussing the possibility of bending the cost curve with better data on providers and the effectiveness of treatments. Silverman explained that the IRS’s goal is to give taxpayers a better experience by creating an interactive online resource where users can raise and resolve issues in a secure and efficient manner.

One overarching message from all of the panelists was the importance of an iterative, collaborative, and agile approach to big data in government. Large-scale IT projects are dynamic challenges with many moving parts, so it is necessary to shorten the time frame to a first working implementation and to interface with users, analysts, and technologists to create and update requirements. Implementing new tools and strategies to solve problems is key, and ultimately this means embracing a culture of innovation and responsiveness. Continuing to empower the workforce, encouraging an open community of engaged users to pool resources, and coordinating dialogue between public and private organizations will be crucial.
