
5 Q’s for Greg Povolny, Mindshare Technology CEO

by Travis Korte
Photo: New York street children in 1890.

The Center for Data Innovation spoke with Greg Povolny, CEO of Mindshare Technology, a Florida-based data analytics company that works with state and local child welfare authorities to predict which cases are most likely to have negative outcomes. Povolny discussed some of the things Mindshare can predict and highlighted the question of whether authorities who have access to information about the riskiest cases have an obligation to act on it.

Travis Korte: Can you introduce Mindshare, what you make, and who uses it?

Greg Povolny: We’ve been in business since 2002, primarily focused on data analytics and data interoperability. We apply that technology to health and human services almost exclusively. Since about 2004 we’ve been focused on child welfare—in particular, giving child welfare data system administrators the ability to use data analytics for high-risk situations and giving states and agencies the ability to do quality assurance to measurably impact outcomes using technology.

Right now we’re deployed across about 70 percent of the state of Florida, tracking about 30,000 kids daily. We’re also working with some other states on implementations right now: Nebraska, Pennsylvania, Oklahoma, and a couple of others we’re just getting started with. States are contacting us based on our success in Florida, and we’re walking them through how they could better use technology. We also partner with community-based care organizations to offer case practice services in addition to our technology.

TK: What exactly do you track?

GP: We’ve been lucky enough to get daily access to Florida’s statewide automated child welfare information system, the Florida Safe Family Network, and we get access to every bit of case information from there. There are some things we look at to determine a troubled case. First of all, there’s compliance. There are several federal statutes a state has to abide by, such as making face-to-face visits with children under care, conducting supervisor reviews, contacting the biological parents, entering case notes, et cetera. We scan a lot of that detail and have dashboards and alerts that authorities can use to escalate certain issues with case management. On top of that we do a lot of predictive analytics to look at patterns of children in the system of care, past and present, and look at things such as permanency, maltreatment, risk of re-abuse, risk for re-entry into care, things of that nature.
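To make the compliance-scanning idea concrete, here is a minimal sketch of how automated checks against visit and review requirements could raise alerts on a case record. The field names, the 30-day visit threshold, and the example data are illustrative assumptions, not Mindshare's actual rules or schema.

```python
from datetime import date, timedelta

# Hypothetical threshold for illustration; the real requirement may differ.
VISIT_INTERVAL_DAYS = 30

def compliance_alerts(case):
    """Return a list of alert strings for a single (hypothetical) case record."""
    alerts = []
    last_visit = case.get("last_face_to_face_visit")
    if last_visit is None or (date.today() - last_visit).days > VISIT_INTERVAL_DAYS:
        alerts.append("Face-to-face visit overdue")
    if not case.get("supervisor_review_complete"):
        alerts.append("Supervisor review missing")
    if not case.get("case_notes_current"):
        alerts.append("Case notes not up to date")
    return alerts

# Example: a case whose last visit was 45 days ago and whose notes are stale
# triggers two alerts that a dashboard could then escalate.
example_case = {
    "last_face_to_face_visit": date.today() - timedelta(days=45),
    "supervisor_review_complete": True,
    "case_notes_current": False,
}
print(compliance_alerts(example_case))
```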

And we produce what we call hotlists that agencies can use to perform quality assurance. We can say, “In probability order, these are the cases most likely to encounter certain issues, such as violence, re-abuse, et cetera.” They can then target those cases. In Florida they call that approach rapid safety feedback: they look at all the case details every quarter to score the case and then we can re-run the analytics against our original baseline and see if the risk is going down or going up. We also align case activities to changes in risk to see what’s working and what’s not working.
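A hotlist of this kind could be assembled roughly as follows: sort cases by a model's predicted probability of a negative outcome and show the change from a stored baseline so reviewers can see whether risk is rising or falling. The sketch below assumes made-up case IDs and scores; it is not Mindshare's implementation.

```python
# Current model scores and the scores recorded at the last quarterly review
# (all values fabricated for illustration).
current_scores = {"case-101": 0.82, "case-102": 0.35, "case-103": 0.67}
baseline_scores = {"case-101": 0.74, "case-102": 0.41, "case-103": 0.69}

# Rank cases in probability order, highest risk first.
hotlist = sorted(current_scores.items(), key=lambda kv: kv[1], reverse=True)
for case_id, score in hotlist:
    delta = score - baseline_scores[case_id]
    trend = "up" if delta > 0 else "down"
    print(f"{case_id}: risk {score:.2f} ({trend} {abs(delta):.2f} vs. baseline)")
```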

TK: Can you tell us a little more about some of the predictive modeling you do?

GP: Well, we can consume the data and, using machine learning and very complex algorithms, determine the patterns in the data to predict future events. So, for example, we can look at all the reunifications [in which children leave foster care and return to their parents] that have occurred in the last three years. We’ll look at the patterns in the ones that were successful and the ones that failed. We can use that model to look at the current cases that are nearing reunification, or perhaps just reunified, and predict the probability of the ones that are likely to fail and have the children re-enter the system of care. Some of the predictive models that we’ve got are the probability of a failed reunification, the probability that a child will be re-abused, the likelihood of a child exiting care unsuccessfully (aging out, for example), and the likelihood a child will leave care without a diploma. And now we’re also getting into models for potential human trafficking.
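The reunification example follows a standard supervised-learning pattern: train a classifier on closed cases with known outcomes, then score open cases by predicted probability of failure. The sketch below uses scikit-learn logistic regression with fabricated features and labels purely to illustrate the workflow; it makes no claim about the features or algorithms Mindshare actually uses.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy historical data: each row is a closed reunification case; the two columns
# might encode, say, number of prior removals and months in care. Both the
# features and the outcome labels are fabricated for illustration.
X_history = np.array([
    [0, 6], [1, 14], [3, 30], [0, 4], [2, 22], [4, 36], [1, 10], [3, 28],
])
y_failed = np.array([0, 0, 1, 0, 1, 1, 0, 1])  # 1 = child re-entered care

model = LogisticRegression()
model.fit(X_history, y_failed)

# Score current cases nearing (or just past) reunification and rank them by
# predicted probability of failure, which is what a hotlist would surface.
X_current = np.array([[2, 18], [0, 5], [3, 32]])
probs = model.predict_proba(X_current)[:, 1]
for case_idx in np.argsort(probs)[::-1]:
    print(f"Case {case_idx}: {probs[case_idx]:.0%} predicted risk of failed reunification")
```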

TK: Are you working directly with government agencies here, or is it mainly contractors using your services?

GP: Some states are privatized, so when I say agency that could be a contracted care management organization or a state agency. But nevertheless the technology is being used by whoever manages the cases on a daily basis. Data folks, case management supervisors, senior stakeholders within the system of care are using this to look at their current case load.

TK: What are some of the insights you’ve gleaned from working with this data?

GP: It’s more of a policy question, but we’ve been thinking about what happens when we produce a prediction and now somebody has in their hands the riskiest cases in the system of care. Are they then obligated to take action? If we’re working with agencies that have this capability and the system tells them these children are likely to encounter violence, what is their responsibility? What are they required to do? Those are some of the very interesting questions we’re having right now.
