Is convenience a good reason to remove patient data controls? What's the alternative?
A line in The Guardian's reporting on Palantir's access to NHS England's Federated Data Platform stopped us short this morning.
"With hundreds of different datasets in the FDP system, it was becoming time-consuming for contractors, including Palantir engineers, to apply for individual permissions."
Two things can be true at once. The friction is real — anyone who has worked inside a public-sector data programme has felt it, and the teams delivering FDP are working under serious pressure to make a complex system useful at speed. And: the answer to slow approvals might not be to remove them altogether. It could be to make them work better.
The people on the ground know this better than anyone. The question is whether they are being given the tools to do it. Could there be an alternative to the 'blanket access' approach reportedly adopted on the FDP project?
What good looks like
Granular, per-dataset access controls generally exist for a reason. They are how a data controller proves, after the fact, that the right person saw the right rows for the right purpose. They are the difference between "we trust our contractors" and "we can show, in an audit, exactly what our contractors saw."
For external consultants and engineers — including those employed by the platform vendor — the bar should be higher:
- Access requested at the dataset level, with a stated purpose.
- Decisioned by the data controller, not the platform team.
- Logged. At minimum, who looked at the data, when, and why.
- Time-bound, with automatic expiry.
None of this is controversial in public-sector data governance. It is the standard the Caldicott Principles, the UK GDPR, and the NHS's own information governance framework already point towards.
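A minimal sketch, in Python, of what those four controls can look like together. Every name here (AccessGrant, check_access, the field names) is hypothetical, invented for illustration, and not drawn from the FDP or any NHS system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class AccessGrant:
    requester: str          # who asked (e.g. a vendor engineer)
    dataset: str            # one dataset, not "everything"
    purpose: str            # stated purpose, recorded verbatim
    approved_by: str        # the data controller, not the platform team
    granted_at: datetime
    duration: timedelta     # time-bound: expiry is automatic, not optional

    def is_active(self, now: datetime) -> bool:
        return self.granted_at <= now < self.granted_at + self.duration

audit_log: list[dict] = []

def check_access(grant: AccessGrant, now: datetime) -> bool:
    """Every check is logged: who looked, at what, when, why, and the outcome."""
    allowed = grant.is_active(now)
    audit_log.append({
        "who": grant.requester,
        "dataset": grant.dataset,
        "why": grant.purpose,
        "when": now.isoformat(),
        "allowed": allowed,
    })
    return allowed
```

The point of the sketch is that expiry and logging are properties of the grant itself, so a lapsed permission fails closed and the audit trail writes itself.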
Now, of course, we know the teams involved will have scrutinised the 'unlimited access' decision extensively, and they may well have come to an informed conclusion that, for whatever reason, Palantir can be trusted with access to these datasets in their entirety, or that without that access the system (which, debates aside, was ultimately born out of a desire to improve patient care) is untenable. The argument could be that the engineers need access to all of this data to build the system, and that any approval requests would ultimately be accepted anyway; requiring them would simply slow the project down.
Let's consider for a moment, though, that there may be an alternative solution which would maintain trust and keep the project on track...
The friction is a solvable problem
We know this because we have solved it. Our work with the Office for National Statistics on the Project Accreditation System for the Secure Research Service (PASS) shows that approval times can be cut by more than 50% — without weakening a single control. The route is well-trodden: a clear catalogue of what exists; machine-readable metadata describing sensitivity and lawful basis; a workflow that routes each request to the right approver in a few steps, not ten.
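As an illustration of the 'machine-readable metadata' step, here is a hedged sketch assuming a toy catalogue schema. The field names, dataset names, and approver roles are invented for the example; they are not the ONS, PASS, or FDP schema:

```python
# Toy catalogue: each dataset carries the metadata an approval workflow
# needs in order to route a request. Field names are illustrative only.
CATALOGUE = {
    "ae_attendances": {"sensitivity": "high", "lawful_basis": "direct_care"},
    "bed_occupancy": {"sensitivity": "low", "lawful_basis": "service_planning"},
}

# Sensitivity decides who must approve. The routing is data-driven, so
# adding a dataset to the catalogue never means redesigning the workflow.
APPROVERS = {
    "high": "caldicott_guardian",
    "medium": "information_governance_team",
    "low": "data_owner",
}

def route_request(dataset: str) -> str:
    """Send a request straight to the right approver in one step, not ten."""
    return APPROVERS[CATALOGUE[dataset]["sensitivity"]]
```

With metadata like this in place, the slow part of an approval is the human decision, which is the part that should stay.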
The contractor still asks. The controller still decides. The log is still written. Nobody waits months for a permission that should have taken an afternoon.
That is the trade-off worth making. Faster decisions, same controls, full audit trail.
Trust takes a long time to build, and moments to lose
Public trust in data sharing is not in surplus. I'm sure we can all remember recent high-profile cases in which citizens lost confidence that their data was being handled responsibly.
Blanket access to hundreds of datasets, even for engineers building the platform, is not how that confidence is rebuilt. It is how it is spent. The colleagues doing the delivery work know this. They will be the ones answering the questions in select committees and patient forums if the next headline lands badly.
The Federated Data Platform is one of the most consequential pieces of public-sector data infrastructure the NHS has commissioned this decade. Without getting into a debate about who is building it or the 'openness' of the technology stack, the case for it rests on a promise: that joining data up will save lives, and that the safeguards will keep up. Those safeguards are paramount. Anything that looks like loosening them — even for good operational reasons — needs to come with a clear story about what replaced them.
Where to from here for the team running the FDP work?
Less a list of demands than a short checklist a programme team could publish itself to take the heat off:
- Which datasets are in scope for each project team, and why access has been granted.
- What auditing is in place — at dataset, not system, level.
- How long the change is in force, and what would trigger a return to per-dataset approval.
- Who, outside the platform team, is responsible for each decision.
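If it helps to make this concrete, one way to publish such a checklist is as a machine-readable register entry that a script refuses to emit unless every disclosure is present. The schema below is an invented sketch, not anything the FDP actually publishes:

```python
import json

# The four checklist items above, as required fields. Names are illustrative.
REQUIRED_FIELDS = {"dataset", "reason_granted", "audit_level",
                   "in_force_until", "reverts_to", "decision_owner"}

def publishable(entry: dict) -> str:
    """Refuse to publish an entry that omits any of the four disclosures."""
    missing = REQUIRED_FIELDS - entry.keys()
    if missing:
        raise ValueError(f"register entry missing: {sorted(missing)}")
    return json.dumps(entry, indent=2)

# Hypothetical register entry: values invented for the example.
example = {
    "dataset": "ae_attendances",
    "reason_granted": "performance tuning of the attendance pipeline",
    "audit_level": "dataset",                      # per-dataset, not system-wide
    "in_force_until": "2026-03-31",
    "reverts_to": "per-dataset approval",
    "decision_owner": "trust_caldicott_guardian",  # outside the platform team
}
```

The design choice worth noting: making the disclosures mandatory in the publishing step means a gap in the register is a loud failure, not a quiet omission.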
This is not unachievable. It is what data controllers in regulated sectors already publish. And it is the kind of detail that turns a difficult story into a defensible one — protecting the people doing the work as much as the patients whose data is in scope.
We have done this work before, on PASS with ONS, and we would be glad to help anyone inside the FDP team thinking through the same problem. Faster decisions, same controls, full audit trail.
Move fast and keep your safeguards — patient data is one of the few places it really does matter.
MetadataWorks builds the cataloguing infrastructure used by NHS England, ONS, ADR UK and Genomics England to manage access to sensitive data at scale. If you're working on a similar problem, get in touch.