I was comparing 4K to 1080p TVs in a Best Buy electronics store
Do you know for sure it was a real test? Most of those places don't know the difference between the front end of a monitor and the back.
It's like saying lawyer A is better than lawyer B based on my opinion that lawyer A managed to get a jaywalking charge dismissed when I was in another country, but lawyer B failed to get my D Trump hunting license approved.
It's like trying to compare 24-bit color to 32-bit color.
It's 24/30/36.... 24 = 3 color channels at 8 bits each, 30 = 3 channels at 10 bits each, 36 = 3 channels at 12 bits each. From what I've read, most people can distinguish up to 12 bits per color, and a further roughly 5% of the population need 13 bits. You can tell, you just don't know you can tell..... but we've strayed off topic there a bit.
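To put rough numbers on that, here's a quick back-of-the-envelope sketch (plain Python, nothing specific to any camera or display) of how many shades per channel and total colors each bit depth gives you:

```python
# Rough arithmetic for bit depth per color channel.
# "24/30/36-bit color" = 3 channels (R, G, B) at 8/10/12 bits each.
for bits_per_channel in (8, 10, 12):
    shades = 2 ** bits_per_channel     # distinct levels per channel
    total = shades ** 3                # distinct RGB combinations
    print(f"{bits_per_channel} bits/channel -> "
          f"{shades:,} shades per channel, {total:,} total colors "
          f"({bits_per_channel * 3}-bit color)")
```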
Anyway, the point is the quality of your picture will come from the data within the image, not the resolution. Whether a picture is 1080p or 4K has little to do with image quality.
I am sticking with 1080p hardware. It's more in my budget.
For editing: While you're probably not going to believe me, your hardware has less to do with what you can edit than your workflow does. Early this month I finished working on a film that was shot in 4K H.264, using an i5 so old it was picked up at one of those recycle centers for $10 (some extra RAM to bring it up to 12 GB and an old video card were put in it). With an appropriate workflow, it cut through the project like butter.
With that story in mind, I'm not sure what you mean by sticking with 1080p hardware.
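In case "appropriate workflow" sounds like hand-waving: the usual trick on weak hardware is cutting against low-resolution proxy files and only relinking to the 4K originals at export. Here's a minimal sketch of batch-generating proxies with ffmpeg from Python; the folder names and settings are just placeholders, not what was used on that film, so adjust to taste:

```python
# Minimal proxy-generation sketch (assumes ffmpeg is on your PATH).
# Edits are made against these small files; the NLE relinks to the
# 4K originals for the final render.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("footage_4k")      # hypothetical folder of camera originals
PROXY_DIR = Path("footage_proxy")    # hypothetical output folder
PROXY_DIR.mkdir(exist_ok=True)

for clip in sorted(SOURCE_DIR.glob("*.mp4")):
    proxy = PROXY_DIR / (clip.stem + "_proxy.mp4")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=-2:540",       # downscale to 540p, keep aspect ratio
        "-c:v", "libx264",
        "-preset", "fast",
        "-crf", "23",                # plenty good for cutting decisions
        "-c:a", "copy",              # leave audio untouched
        str(proxy),
    ], check=True)
    print(f"made proxy for {clip.name}")
```

Some editors prefer an intraframe codec for proxies (smoother scrubbing), but even a simple downscale like this takes most of the decode load off an old CPU.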
The more of a clue you get, the less of a fat wallet you need.