It seems to be an increasingly common belief that, because the PS4 and Xbox One use x86, an architecture familiar to programmers, unlike previous consoles, which used less well-known architectures, there won't be as much for developers to learn or exploit over the course of the generation, and therefore less room for games to get better looking and better performing over time.
It's true that a poorly understood architecture like the PS3's results in poorly performing games at first compared to similarly powered PCs, and it's also true that learning that architecture improves performance over time. But that learning is not the most significant thing developers can do to improve performance.
One of the biggest problems with developing on PCs is that you can't know what CPU and GPU the player will have. This forces you to write very generic code that is compatible with a large variety of CPUs and GPUs. Furthermore, developers can't predict what ratio of CPU power to GPU power a given machine will have, so there's a good chance a PC game runs with the GPU at 100% while the CPU sits at a very low utilization, or the other way around; whichever component maxes out first becomes the bottleneck and caps the frame rate. This severely limits performance on PCs, and it's something that can easily be overcome on consoles, since the hardware is known and the workload can be tuned to keep both sides busy. That is the first major optimization developers can make, and it immediately makes the consoles more efficient with their hardware than comparable PCs.
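To make that contrast concrete, here is a minimal, purely illustrative C++ sketch; it is not taken from any real engine, and the structure names, heuristics, and numbers are assumptions for the sake of the example. The point is simply that a PC build has to guess a CPU/GPU workload split at runtime, while a console build can hardcode numbers that profiling showed keep the one known machine fully busy.

```cpp
#include <thread>
#include <cstdio>

// Hypothetical per-frame workload split between CPU-side and GPU-side work.
struct FrameBudget {
    int cpuWorkerThreads;   // threads spent on simulation, culling, etc.
    int gpuParticleCount;   // extra visual work pushed to the GPU
};

// PC path: probe whatever hardware we happen to be running on and guess.
FrameBudget pickBudgetForUnknownPc() {
    unsigned cores = std::thread::hardware_concurrency(); // could be 2, 4, 16...
    FrameBudget b;
    b.cpuWorkerThreads = cores > 1 ? static_cast<int>(cores) - 1 : 1;
    b.gpuParticleCount = 50000; // conservative, since the GPU is unknown
    return b;
}

// Console path: the hardware never changes, so these are simply the numbers
// profiling showed keep both the CPU and GPU near 100% on that machine.
FrameBudget pickBudgetForFixedConsole() {
    return FrameBudget{ 6, 200000 };
}

int main() {
    FrameBudget pc = pickBudgetForUnknownPc();
    FrameBudget fixed = pickBudgetForFixedConsole();
    std::printf("PC guess:      %d threads, %d particles\n",
                pc.cpuWorkerThreads, pc.gpuParticleCount);
    std::printf("Console tuned: %d threads, %d particles\n",
                fixed.cpuWorkerThreads, fixed.gpuParticleCount);
}
```

The PC version has to be defensive and conservative; the console version can be aggressive because the balance point was measured once on the actual hardware.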
However, it doesn’t stop there. As I said before, developers write code in a very generic form to support all CPUs. While all CPUs in PCs are x86 compatible, they’re also all very different. Intel designs its CPUs differently than AMD does, and each generation of CPUs differs from the last as well. If Sony/MS had used a well-known CPU design such as AMD’s FX series, this would have been less of an issue, but they used a much less familiar low-power AMD core design known as Jaguar. Every one of these CPUs supports the basic x86 instruction set, but each typically has its own extra instruction-set extensions as well. Not only that, but they all p...
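Here is a rough sketch of what that generic-versus-targeted difference looks like in code. It uses GCC/Clang builtins and is purely illustrative; the FIXED_CONSOLE_TARGET build flag is hypothetical and not from any real console SDK. A generic PC build can't assume anything beyond baseline x86-64, so it has to detect an extension like AVX at runtime and branch to the right path, while a build for one known CPU (the consoles' 8-core AMD Jaguar does support AVX) can take the wide path unconditionally.

```cpp
#include <immintrin.h>
#include <cstddef>

// Scalar fallback that every x86-64 CPU can run.
static void addScalar(float* dst, const float* a, const float* b, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i) dst[i] = a[i] + b[i];
}

// AVX path: adds 8 floats per instruction, only safe on CPUs that support it.
__attribute__((target("avx")))
static void addAvx(float* dst, const float* a, const float* b, std::size_t n) {
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8)
        _mm256_storeu_ps(dst + i,
            _mm256_add_ps(_mm256_loadu_ps(a + i), _mm256_loadu_ps(b + i)));
    for (; i < n; ++i) dst[i] = a[i] + b[i]; // leftover elements
}

void addArrays(float* dst, const float* a, const float* b, std::size_t n) {
#if defined(FIXED_CONSOLE_TARGET)        // hypothetical flag: CPU is known
    addAvx(dst, a, b, n);                // no check needed, always AVX
#else
    if (__builtin_cpu_supports("avx"))   // generic PC build: decide at runtime
        addAvx(dst, a, b, n);
    else
        addScalar(dst, a, b, n);
#endif
}
```

Multiply that kind of per-CPU decision across an entire engine and the cost of staying generic adds up, which is exactly the overhead a fixed console target lets developers strip out.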