Each generation the "legacy" decode portion of the CPU gets smaller relative to the size of all the other bits. The GPU bits (hmm... not literal bits... portions) already make the decoder seem like a "meh" part of the design these days. And both power consumption and heat are pretty much directly related to the relative size of the implementation...
No, IA architectures simply evolve; they won't disappear as long as Intel is around, and even then the IPR would certainly be sold on to someone who would start licensing it to vendors (and I'm sure they would literally queue up for it).
That's not to say the current state of "daily use" IA has much to do with what we used for MS-DOS during the late nineties, for example... "evolution" is just a pretty word for a process that really means adding new stuff so cool that it makes the old stuff practically irrelevant for most uses.
But incidentally, this is something that never ceases to amaze me: the backwards compatibility of both Intel CPUs and Windows! The only thing coming even close in this regard is VMS. It really is something built into the DNA of both MS and Intel, and a feedback loop of sorts.
I suspect at some point they will start sawing off the redundant old rotten limbs from their CPU designs, but probably no one will even notice. In fact, I think I read somewhere that such changes have already been made, but they don't really touch the average consumer, who doesn't expect the latest Core i5 to boot DOS 6.22 (given a suitable BIOS to begin with).
If the architecture were truly "stale" it would have died already; that happens to all computing-related things. But we have to remember, when we download a *-x86.tgz package, that it is using a naming convention Intel itself abandoned years ago... so... did x86 actually die already?
Oh, and I don't believe the ages-old RISC/CISC thing is a thing anymore... I was the one venting about that back in the nineties, when we were cluttered with Alphas, MIPSes, SPARCs and all the coolness... Then progress just happened: Intel went RISCish under the hood (decoding x86 instructions into RISC-like micro-ops), the others died away, and that was all, folks :-(. No revolution where none was needed to begin with...
Edit addition:
Oh, should biocomputers ever become relevant, the likely scenario is: Intel buys a relevant startup, integrates the tech into whatever the current iteration of their architecture is, and then the bacteria start talking familiar opcodes. This might not happen if the "base" in biocomputing stops being 2 (which is sort of the case with anything quantum at the moment)... but even then my bets would be on a compatibility layer (which is sort of the target case with anything quantum at the moment ?-).