Use larger chunks when streaming all datasets to improve encoding efficiency

Adam Reichold requested to merge OC000014987132/metadaten:stream-chunks into main

While the endpoint is certainly more efficient than repeatedly searching with pagination, it is also rather slow when used through our reverse proxy (where throughput is apparently limited to 30 kB/s). Hence this change aims for a middle ground where 100 datasets (the same as the maximum search results page size) are buffered and serialized as a single chunk of the response.
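A minimal sketch of the batching, assuming the datasets arrive as a futures `Stream` and using a hypothetical serde-serializable `Dataset` type; the `futures-util` `chunks` combinator buffers 100 items before each batch is serialized into one chunk of bytes (newline-delimited JSON is an assumption about the wire format):

```rust
use futures_util::{Stream, StreamExt};
use serde::Serialize;

// Hypothetical record type standing in for the real dataset model.
#[derive(Serialize)]
struct Dataset {
    id: String,
}

// Buffer 100 datasets (the maximum search results page size) and
// serialize each batch into a single chunk of bytes.
fn chunked_bytes(
    datasets: impl Stream<Item = Dataset>,
) -> impl Stream<Item = Result<Vec<u8>, serde_json::Error>> {
    datasets.chunks(100).map(|batch| {
        let mut buf = Vec::new();
        for dataset in &batch {
            // One JSON document per line; the real format may differ.
            serde_json::to_writer(&mut buf, dataset)?;
            buf.push(b'\n');
        }
        Ok(buf)
    })
}
```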

It also adds the missing content type header, switches to axum's more specialized StreamBody type, and cleans up our dependencies on the futures-* crates.
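The response side might then look like this sketch, assuming an axum version where `body::StreamBody` is available (0.6.x); the previously missing content type header is attached alongside the streamed body, with the exact media type being an assumption:

```rust
use axum::{body::StreamBody, http::header, response::IntoResponse};
use futures_util::Stream;

// Wrap the chunked byte stream in axum's StreamBody and attach the
// content type header that was previously missing.
fn stream_response(
    chunks: impl Stream<Item = Result<Vec<u8>, serde_json::Error>> + Send + 'static,
) -> impl IntoResponse {
    (
        // "application/x-ndjson" is assumed here; the endpoint may
        // declare a different media type.
        [(header::CONTENT_TYPE, "application/x-ndjson")],
        StreamBody::new(chunks),
    )
}
```

StreamBody accepts any `TryStream` whose success items convert into `Bytes`, so the `Result<Vec<u8>, serde_json::Error>` items from the batching sketch above plug in directly without an extra adapter layer.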
