In terms of formats, the data is provided as NDJSON (also known as JSONL), which handles huge files efficiently. XML, despite its bad reputation, supports stream processing by design and is not a bad choice for vendors already familiar with it (in its open data, the Ministry of Justice publishes XML datasets several GB in size). The vendor also has experience publishing separate documentation for each entity, from the open data of the ARES system. Somewhat strangely, the dataset is not provided in any tabular form (e.g. CSV).
That is odd, since the data is currently stored in a relational database, and it is safe to assume that most consumers will load it into a relational database again. For my own needs I prepared a script that can process the data even on machines without tens of GB of RAM; otherwise, processing with standard tools is required. Somewhat paradoxically, the script is still useful (depending on how the data was exported), because the Home Office has decided to block access to the open data from servers outside the EU.
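
Since NDJSON keeps one record per line, it can be converted to the missing CSV in constant memory, one record at a time. Here is a minimal sketch of such a conversion in Python; the file names and field names are hypothetical, not the dataset's actual schema:

    import csv
    import json

    # Hypothetical file and field names -- the real dataset's schema will differ.
    SOURCE = "dataset.ndjson"
    TARGET = "dataset.csv"
    FIELDS = ["ico", "nazev", "datum"]

    with open(SOURCE, encoding="utf-8") as src, \
            open(TARGET, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=FIELDS, extrasaction="ignore")
        writer.writeheader()
        # Only one JSON object is ever held in memory, so the conversion
        # runs in constant memory regardless of the file's size.
        for line in src:
            line = line.strip()
            if not line:
                continue
            writer.writerow(json.loads(line))

Records missing a field are written with that cell left empty, and extra keys are ignored, so the sketch tolerates some variation between records.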

Why does this happen? It is a shame that we find ourselves in this situation after so many years of delays in the rollout of the electronic collections. We have a multi-page technical specification, a project that has been likened to building a Bugatti, and a budget that has reached hundreds of millions of crowns over years of development. The worst part of the whole affair is that almost any expert can spot the error described above in a matter of minutes (hours at most). The question is whether the problem is that testing was kept to a minimum, or whether the vendors are turning a deaf ear to feedback.