The San Francisco school district is trying to make its way through the data maze.
For years, much of the school system’s data on attendance, academic performance, and student behavior has been stuck in silos, where few teachers and administrators could access it, much less make practical use of it.
And when data is available in the district, too often it’s out of date, and administrators have questioned its validity and accuracy.
Another complication: When the district has purchased new tech products in the past, it’s had to figure out if they mesh with existing data systems—and if they don’t, there’s a scramble to create a workaround.
Now, San Francisco district officials have set out to end those disconnects through the pursuit of “interoperability,” a strategy for making data more useful and accessible—and a driver of classroom improvements, rather than an impediment.
Melissa Dodd, the San Francisco district’s chief technology officer, is running point on those efforts. Now in her third year in the 57,000-student district, Dodd is working with other administrators to build a system that churns out actionable data to help the district reach its goals. Those goals include creating personalized academic pathways for students, finding new ways to motivate them, integrating technology throughout the school day, and re-imagining how space can be used in school buildings.
Education Week Associate Editor Sean Cavanagh recently interviewed Dodd about her district’s pursuit of interoperability and what other districts can learn from it.
This interview has been edited for brevity and clarity.
Where is the San Francisco district in its efforts to move toward interoperability?
We’re beyond exploring, and we’re at a stage of implementation. The journey started before I arrived, in terms of identifying the pain points that teachers, principals, and our senior leadership team were experiencing around data and information. Interoperability gives us a framework and a terminology to describe what we’re all attempting to do—which at the end of the day is to make all of our data and information meaningful to our communities.
What’s the core problem you are trying to solve?
When I started in San Francisco, one of the first things I heard was that we had a lot of data, but it was in a lot of different places and systems, and that made it not user-friendly. A new system would come online, and we’d say, OK, let’s figure out how to make our student information system talk with that system. Or how to get another system to talk with this other system. We didn’t have a comprehensive, coherent, systematic approach for how we were going to integrate data and make it easily accessible to the end-users.
So where did you begin, in terms of trying to fix the situation?
We’re approaching it from a cultural and organizational transformation, as well as from a technical transformation. I don’t want to say the technology side is easy, because none of it is really easy. But the harder part is building a common vision, understanding, and approach for what we want our end goal to be and how we organize ourselves as a district to enable that. It was really focused on the cultural aspect, the change management that’s needed for interoperability to take hold.
We started by looking at building coherence across ourselves as an organization. We formed a data-governance committee and implemented stronger technology governance, so that when we were interested in bringing a data system online, we asked questions first. One question: Do we have a system that can already serve this purpose and need? If we’re bringing in another system, how are we integrating data and making it interoperable with our existing systems? These are questions we didn’t necessarily ask beforehand.
And what stage of the journey are you at now?
Now we’re at the second part—the technical transformation. We’re really building in the technical infrastructure to support this work moving forward. Part of it is we’ve been consolidating our systems down to an essential few and then implementing data standards, and a technology infrastructure that will help make our data interoperable across all of our systems.
In pursuing interoperability, are you following one standard?
We’re implementing the Ed-Fi standard. We’re a part of the Ed-Fi Alliance. We’re in year one of a three-year partnership with the Michael and Susan Dell Foundation to help us implement our Ed-Fi standard and to start up an operational data store, as well as a data vault in the cloud. We’re starting with our core student data and building out from there, then bringing in our education human-resources data, then ultimately bringing in our operational and financial data.
How long will it be before you get where you want to go?
This year, we’re focused on setting up the infrastructure, the environment. That’s about a four-month process. But in terms of bringing all of our data in, we’ve mapped that out over a three-year period. We’re also looking at how we serve up and make that data accessible, and at the practices and culture through which we use it.
What’s an example of how data in your district ends up isolated and of little use, and how interoperability might help?
Here’s a great example: Two of our schools recently went through school board-approved name changes. We need to have a common and consistent way of applying the names of those schools into our core data systems. Right now, we’d have to do that manually, because a data field in one system doesn’t speak to a data field in another.
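Dodd’s example can be sketched in code. This is an illustrative sketch only, with hypothetical system names and IDs, not the district’s actual software: the point is that with one canonical record, a board-approved name change is applied once and propagated, instead of being re-keyed by hand in each system.

```python
# Hypothetical sketch: a canonical school registry propagated to
# downstream systems, instead of manual edits in each one.

CANONICAL_SCHOOLS = {
    "SCH-001": {"name": "Old Name Elementary"},
}

# Hypothetical downstream systems, each keeping its own copy of the name.
sis = {"SCH-001": {"school_name": "Old Name Elementary"}}
behavior_system = {"SCH-001": {"site": "Old Name Elementary"}}

def rename_school(school_id: str, new_name: str) -> None:
    """Update the canonical record once, then push to every consumer."""
    CANONICAL_SCHOOLS[school_id]["name"] = new_name
    sis[school_id]["school_name"] = new_name
    behavior_system[school_id]["site"] = new_name

# One board-approved change, applied everywhere at once.
rename_school("SCH-001", "New Name Elementary")
```

Without the shared registry, each rename is a separate manual edit, and any system that gets missed silently drifts out of sync.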
How would interoperability improve your use of attendance data?
Our attendance data starts with a teacher or school clerk entering data into our system. And there can be multiple pieces of attendance data assigned to that student on any given day. Our students typically start the day in our system as “present” and our teachers have to take a step to change that to “absent” or “tardy.” And then at the secondary level, we’re taking attendance for each period.
The data resides within our student information system, and it then gets sent out to various other systems we have. For instance, we have another system that makes attendance calls at a certain time of day, which notifies the parent whether their child is in school that day. It goes to our behavioral system so interventions can take place for that student. And then various reports are run at a school level, at a student level, and at an aggregate level—such as showing us the percentage of all students who are chronically absent over a period of time. All of those things require that data from one system gets to another.

Currently, to get data from our SIS to our behavior system, there’s a 24-hour lag time. And right now, if two people are running an attendance report and doing a report from the start of school to a specific date, then their report is going to be different, because the data is not real-time. Data is more accurate in one system than in another. So having data interoperable means that our codes, our logic, our definitions are consistent across systems, so that it’s apples to apples.
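The “apples to apples” point can be made concrete with a small sketch. The data and function names here are hypothetical, not the district’s: when every report reads the same store and applies the same shared definition of “absent,” two people running the same date range get the same answer.

```python
# Hypothetical sketch: one shared attendance store and one shared
# definition, so every report computes the same result.

ATTENDANCE = [  # one record per student per day
    {"student": "S1", "date": "2019-09-03", "code": "absent"},
    {"student": "S1", "date": "2019-09-04", "code": "present"},
    {"student": "S2", "date": "2019-09-03", "code": "tardy"},
    {"student": "S2", "date": "2019-09-04", "code": "absent"},
]

ABSENT_CODES = {"absent"}  # one shared definition, not one per system

def absence_rate(student: str, start: str, end: str) -> float:
    """Fraction of recorded days the student was absent in [start, end]."""
    days = [r for r in ATTENDANCE
            if r["student"] == student and start <= r["date"] <= end]
    absent = [r for r in days if r["code"] in ABSENT_CODES]
    return len(absent) / len(days) if days else 0.0
```

When instead each system keeps its own copy of the data and its own absence codes, the same question can return different numbers depending on which report you run.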
Are there multiple ed-tech vendors that operate those data systems, and what will you need them to change?
Yes, we’re working with our student information system [that is produced by a vendor]. They’ll be adopting and implementing Ed-Fi standards, just as we are as a district. Because of how the information flows out of our SIS and into other systems, we can write the interfaces, but we need the core SIS in place and to have an API [Application Programming Interface] in place to pull the data out in a consistent way and into our other systems.
Is that approach similar to how you will try to make interoperability happen through other data systems?
The approach we look for in our data systems is that they either have open systems or APIs in place that can allow districts to extract the data and interact with other systems. We’re extracting information from our core SIS. We’re then bringing it into an operational data store. And then we’re serving it up into the other systems we use. So we get away from having to create separate interfaces between systems. We want one interface where all the data comes in and we can then push it out.
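The pattern Dodd describes—one inbound interface into an operational data store, which then serves every consumer—can be sketched as a simple hub-and-spoke. All names here are hypothetical; this is an illustration of the architecture, not the district’s implementation.

```python
# Hypothetical hub-and-spoke sketch: records flow from the SIS into one
# operational data store, which pushes them to every downstream system,
# instead of each pair of systems needing its own interface.

class OperationalDataStore:
    def __init__(self) -> None:
        self.records = []
        self.subscribers = []  # downstream systems register here

    def subscribe(self, push) -> None:
        """Register a downstream system's callback."""
        self.subscribers.append(push)

    def ingest(self, record: dict) -> None:
        """One inbound interface; every consumer gets the same data."""
        self.records.append(record)
        for push in self.subscribers:
            push(record)

ods = OperationalDataStore()
call_system, behavior_system = [], []
ods.subscribe(call_system.append)      # attendance-call system
ods.subscribe(behavior_system.append)  # behavior/intervention system

# One extract from the SIS reaches every subscriber at once.
ods.ingest({"student": "S1", "status": "absent"})
```

With N systems, point-to-point integration needs on the order of N² interfaces; the hub needs one inbound interface plus one subscription per consumer.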
How do you believe your interoperability efforts will help teachers and students in the classroom?
That’s ultimately who we’re serving. We want to make it easier and less burdensome at the school site and in the classroom for our principals and teachers. We want to leverage interoperability so that we’re not sending them through so many systems to get information on their students. They should be able to access information through dashboards and make meaning out of it. We want the data they have to be more accessible and accurate, and more real-time. They shouldn’t have to wait for us to generate some big report for them, so that it’s a month or two later and it’s hard for them to make meaning from it.
Where is the San Francisco district when it comes to “single sign-on”?
We do have single sign-on. All users—students and adults—have an SFUSD ID and password and our core systems all have single sign-on.
For our teachers, we leverage the same ID and password for the most part. We do that for security purposes. We don’t want too much to be accessible if a teacher steps away from their laptop [in terms of what someone else would be able to access].
What concerns have you heard from education companies about your interoperability efforts?
Honestly, the vendors we’ve been working with have been stepping up to the plate, because we’re clear on what our expectations are and what we need to have in place to do business with them. There has been more of a shift with vendors in that it requires them to rethink their technology and how they play with other vendors. I think groups like the Ed-Fi Alliance and IMS Global are pushing the vendors in a way that makes it clear this is the new normal for districts.
You mentioned your district is using Clever. What is your relationship with that company?
We use Clever for instant log-in, and rostering of digital learning applications and tools for students. We pass data from our SIS and class rosters so that if a teacher is using a particular math digital learning tool, they don’t have to set up their classes as well. It comes from their master schedule from our SIS. There’s interoperability around student information. It’s really to relieve the burden and extra steps on teachers and for students, and they’re going to one place—we call it our digital backpack—where they can access our learning tools that they use in our classrooms.
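The rostering flow Dodd describes—classes derived automatically from the SIS master schedule rather than set up by hand in each learning tool—can be sketched as a simple grouping step. The schedule data and function below are hypothetical illustrations, not Clever’s actual API.

```python
# Hypothetical sketch: rosters for a digital learning tool are derived
# from SIS master-schedule rows, so teachers never re-create their
# classes by hand in each application.

MASTER_SCHEDULE = [  # rows as they might come from the SIS
    {"teacher": "T1", "course": "Math 6", "student": "S1"},
    {"teacher": "T1", "course": "Math 6", "student": "S2"},
    {"teacher": "T2", "course": "Science 7", "student": "S3"},
]

def build_rosters(schedule: list) -> dict:
    """Group schedule rows into per-class rosters for a vendor tool."""
    rosters = {}
    for row in schedule:
        key = (row["teacher"], row["course"])
        rosters.setdefault(key, []).append(row["student"])
    return rosters

rosters = build_rosters(MASTER_SCHEDULE)
```

Because the rosters are regenerated from the master schedule, a schedule change in the SIS flows through to every connected tool without any teacher action.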
One concern we’ve heard from districts about interoperability is a fear that if they go with one data standard, such as Ed-Fi or IMS Global, they might be prevented from using an innovative ed-tech product down the road. Is that a worry for you?
It’s definitely a factor, but I’d also say we have a pulse and a check on the likelihood of that in the timeframe we’re focused on. We didn’t feel that it was reason enough to not go with the Ed-Fi standard. Our goal is to be as flexible and adaptable as possible, because technology changes and evolves. But the reality for us is that to bring on some new platform or new tool, with our governance process, our intake process, and our decision-making, we take a 12- to 18-month view. We’ll work with vendors to integrate them into our environment.
Also, districts have been talking about interoperability for many years now. What I see now, with the districts that are part of the Ed-Fi alliance—similar to IMS Global—are the economies of scale. More of us are harnessing our buying power and the influence we have to push the technology sector to make the shift with us.