Right, it's one anti-aliasing technique, but the way one AA method is applied doesn't mean every other technique does it the same way. If you've read the Nvidia article, you should've gotten the sense that it's quite old and outdated, and that there should be better ways to apply AA today.
Furthermore, that article states that SSAA doesn't increase the game's resolution either, since it downsizes the supersampled image back to its original resolution.
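To make that supersize-then-downsize step concrete, here's a rough NumPy sketch of what ordered-grid SSAA boils down to. The 2x factor and the plain box filter are just illustrative assumptions for the example, not how CoH2 or any particular driver actually does it:

```python
import numpy as np

def box_downsample_2x(supersampled):
    """Average every 2x2 block of a (2H, 2W, 3) image down to (H, W, 3)."""
    h2, w2, c = supersampled.shape
    blocks = supersampled.reshape(h2 // 2, 2, w2 // 2, 2, c)
    return blocks.mean(axis=(1, 3))

# Fake "rendered" frame at twice the target resolution, then filtered back down.
hi_res = np.random.rand(2160, 3840, 3)   # rendered at 3840x2160
lo_res = box_downsample_2x(hi_res)       # displayed at 1920x1080
print(lo_res.shape)                      # (1080, 1920, 3)
```

The GPU still had to shade every one of those extra pixels before the averaging step, which is where the cost comes from.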
There are far better and more efficient ways to apply AA now. They don't mess with supersizing and downsizing resolutions, or at least not as much as SSAA does: http://gaming.stackexchange.com/questions/31801/what-are-the-differences-between-the-different-anti-aliasing-multisampling-set
What CombatMuffin said is partially true. Most AA used in modern games, specifically multisampling and its derivatives, doesn't resort to the brute-force approach that supersampling does, so resolution and AA are somewhat independent of each other.
I know it's only one method of AA, and other methods differ, but FXAA and SSAA are the ones used by Company of Heroes 2, and that's what my post was very clearly about.
I was just responding to the "high-end cards should be able to run with max AA" statement, which doesn't really take into account how much of a performance hit SSAA really is.
If you want to make a case for Relic adding MSAA or any of the other 10+ methods and variants, go ahead; I welcome more options.
I'm interested to know how a display that physically has 1920 x 1080 ≈ 2MP can physically, hell, even virtually, go up to 4.6MP to replicate a 2880x1620 resolution.
I'm saying it gets rendered at 2880x1620 and then downscaled to 1920x1080 with a filter to minimize jagged edges.
Whether the monitor can display it or not, you still take the performance hit of rendering 4.6MP rather than 2MP, which is what my post was meant to say.
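The arithmetic behind those numbers is just pixel counting; how much frame time that actually costs depends on how shader-bound the game is, so treat the ratio as a rough upper bound on the shading work, not a measured benchmark:

```python
# Back-of-the-envelope cost of rendering at 2880x1620 vs native 1920x1080.
native = 1920 * 1080      # 2,073,600 px (~2.1 MP)
supersampled = 2880 * 1620  # 4,665,600 px (~4.7 MP)
print(supersampled / native)  # 2.25 -> 2.25x the pixels to shade every frame
```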
My apologies if my post made it seem like you magically got a higher visible resolution; that obviously isn't the case.