Speed up Postgres pg_restore

Hi, I have to update my database monthly using a big dump (100 GB).
I don't have control over how it's produced; I only receive a .zip containing a lot of .gz files.
I want to speed up the restore, since I really only need one table.
My disk is an NVMe drive, but it's only being written at around 200 MB/s even though it's capable of much more.
I think the restore is CPU-bound on decompressing the .gz files, and I can't find a way to parallelize the decompression.
The Postgres version is 10.20.
Also, how can I check where a table is stored, i.e. in which file on disk?
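
A zip full of .gz files often means the dump is in pg_dump's directory format (`-Fd`): a `toc.dat` plus one compressed file per table. If so, pg_restore can restore just the one table and skip decompressing everything else (a single gzip stream itself can't really be decompressed in parallel; the format is serial). A sketch under that assumption, with `dump_dir`, `mydb` and `mytable` as placeholder names:

```shell
# Placeholders: dump_dir/, mydb, mytable.
# List the archive contents first to find the table's exact entry name:
pg_restore -l dump_dir | grep -i mytable

# Restore only that table, with 4 parallel jobs. -t skips the other
# entries entirely, which is where most of the time savings come from.
pg_restore -d mydb -j 4 -t mytable dump_dir

# For the "which file on disk" question, inside psql:
#   SELECT pg_relation_filepath('mytable');  -- path relative to the data dir
#   SHOW data_directory;                     -- the data dir itself
```

`pg_relation_filepath()` and pg_restore's `-j`/`-t` options all exist in Postgres 10; whether `-j` helps here depends on the archive actually being directory or custom format, which `pg_restore -l` will tell you.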

How to test rendering speed in Angular

We’re building an Angular app, and we’re trying to figure out how to benchmark how long it takes to render various pages. I’ve read about performance.timing here, but that seems to be useful only for non-single-page applications: when I navigate to new views in our app, the timing numbers do not change.

ng-include, ng-template or directive: which one is better for performance

I would like to know the best way to design an Angular app for performance when building an HTML template with reusable widgets like a header, sidebar, footer, etc. Basically, the main content is the central DIV, whose content varies between routes; the header and footer are almost always the same, while the sidebar can vary on certain pages.
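
For context, the layout described above is usually expressed as a static shell (header/footer in index.html, outside the router outlet) plus a per-route template for only the central div, so the fixed widgets are never re-rendered on navigation. A plain-object sketch of such a route table, with hypothetical template paths and controller names:

```javascript
// Sketch: per-route templates for the central div only; header/footer
// live in index.html outside ng-view and are rendered once.
// Paths and controller names below are placeholders.
const routes = {
  '/home':    { templateUrl: 'views/home.html',    controller: 'HomeCtrl' },
  '/reports': { templateUrl: 'views/reports.html', controller: 'ReportsCtrl' },
};

// With ngRoute this table would be registered roughly as:
// app.config(function ($routeProvider) {
//   Object.keys(routes).forEach(function (path) {
//     $routeProvider.when(path, routes[path]);
//   });
// });
```

The design point is that only the route-dependent region goes through the router; a sidebar that varies per page can be toggled from route data rather than re-included everywhere.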

How to get around Angular's performance issue with large content

I’m familiar with Angular’s digest cycle and how it affects performance with long lists and large model values. I’m just curious whether you have any workaround for the problem, specifically for my case.
I’m building an app that may require the user to enter large text, such as an error log, into a textarea, but due to Angular’s inherent issue with two-way data binding on large models, it’s causing my app to hang.
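
One common mitigation is to stop the model (and thus the digest) from updating on every keystroke by debouncing the binding; AngularJS 1.3+ supports this directly via `ng-model-options="{ debounce: 500 }"` on the textarea. The underlying technique is a plain debounce, sketched standalone here:

```javascript
// Minimal debounce: collapse a burst of calls into one call that fires
// `wait` ms after the last one. Applied to a model update, this means
// the digest runs once when typing pauses instead of on every keystroke.
function debounce(fn, wait) {
  let timer = null;
  return function (...args) {
    clearTimeout(timer);
    timer = setTimeout(() => fn.apply(this, args), wait);
  };
}
```

In the template the equivalent would be something like `<textarea ng-model="log" ng-model-options="{ debounce: 500 }"></textarea>`; for very large pasted text you may also want to display it outside the bound model entirely (e.g. a read-only element) so the digest never touches it.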