So, an update to my problem rendering hi-res images on the GPU (or rather a correction):
I just figured out I can render hi-res on the GPU, but only with the denoiser disabled.
The actual error I get (only if the denoiser is enabled) is:
The Denoiser sample histogram buffer is too big for GeForce GTX 1060 6GB Intersect device (i.e. CL_DEVICE_MAX_MEM_ALLOC_SIZE=1610612736): try to reduce related parameters
Using tiled rendering does make rendering 4K+ images possible on GPU+CPU with the denoiser enabled, but I get some nasty artifacts: some tiles are way too blurry, as if colors are leaking in from neighboring tiles, and some tiles stay black unless I use something like 300+ samples. The image looks a lot better without the denoiser.
I tried to tweak the denoiser settings but only made things worse (see the config sketch below for the parameters I mean).
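In case it helps anyone reproduce this, here is roughly the kind of thing I was changing. Assuming the "related parameters" from the error are the BCD denoiser plugin properties in the image pipeline, the relevant part of a render config would look something like this (property names are how I understand the BCD plugin; I'm not certain these are exactly the parameters the error means, so double-check against the LuxCore docs):

    # BCD denoiser as an image pipeline plugin (index 0 is just an example)
    film.imagepipeline.0.type = BCD_DENOISER
    # smaller radii and fewer scales should mean smaller internal buffers
    film.imagepipeline.0.scales = 2
    film.imagepipeline.0.patchradius = 1
    film.imagepipeline.0.searchwindowradius = 6
    film.imagepipeline.0.histdistthresh = 1.0
    film.imagepipeline.0.filterspikes = 1
    # the rest of the pipeline follows as usual
    film.imagepipeline.1.type = TONEMAP_LINEAR
    film.imagepipeline.2.type = GAMMA_CORRECTION

That said, I'm not even sure any of these shrink the sample histogram buffer itself, since that buffer seems to scale with the film resolution rather than with the denoiser settings.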
Rendering the same scene on CPU only works fine, using about 7 GB of RAM during rendering and 11.5 GB during denoising. That seems like pretty heavy RAM usage for just two high-poly objects (1.7M faces, 3.2M tris) and a single texture (just a basic test scene).
Rendering the same scene on CPU only with tiled rendering uses 5 GB of RAM during rendering, but then the denoiser won't run at all.