No.
CPUs don’t get weaker with age. They run or they die. Generally a CPU either works or it doesn’t; partial failures are rare. The chips themselves don’t slow down from honest use.

What happens is the world around them goes to shit. Dust builds up on heatsinks, cheap thermal paste dries out and cracks, heat can’t escape, and the CPU runs hot and throttles itself to keep from burning.
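If you want a rough read on whether heat is the culprit, one sanity check is to compare the CPU’s current clock against its rated maximum while the machine is busy. Here’s a minimal sketch using Python’s psutil package (assuming it’s installed; the temperature call is only exposed on some platforms, mostly Linux, so treat this as an illustration rather than a proper diagnostic):

```python
import psutil

# Compare the current clock against the rated maximum. A big, sustained
# gap while the machine is under load is a hint the chip is throttling.
freq = psutil.cpu_freq()
if freq and freq.max:
    pct = 100 * freq.current / freq.max
    print(f"Clock: {freq.current:.0f} MHz of {freq.max:.0f} MHz max ({pct:.0f}%)")

# Temperature sensors are only available on some platforms (mostly Linux).
temps = getattr(psutil, "sensors_temperatures", lambda: {})()
for chip, readings in temps.items():
    for reading in readings:
        print(f"{reading.label or chip}: {reading.current:.0f} °C")
```

Run it while something heavy is going (a game, a video encode) and watch whether the clock sags as the temperature climbs.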
There are enemies inside too. Windows machines get slower from registry bloat, disk fragmentation, and the extra load of background processes. Your operating system is the problem, not the chip.
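And if you suspect the process-load half of that, the same psutil package can show which background processes are actually eating your CPU and memory. Another rough sketch (the one-second sample window and the top-ten cutoff are just arbitrary choices):

```python
import time
import psutil

# First pass primes each process's CPU counter (the first reading is always 0.0).
for p in psutil.process_iter():
    try:
        p.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(1)  # let the counters accumulate for a second

# Second pass: collect CPU and memory use per process.
procs = []
for p in psutil.process_iter():
    try:
        procs.append((p.cpu_percent(None), p.memory_percent(), p.name()))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Print the ten heaviest processes by CPU use since the first pass.
for cpu, mem, name in sorted(procs, reverse=True)[:10]:
    print(f"{cpu:5.1f}% CPU  {mem:5.1f}% RAM  {name}")
```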
Your computer is like that old car that needs a tune-up: clean out the dust, replace the thermal paste, and clear the junk off your hard drive. Do those things and your machine will run like new.
Only overclockers need to worry about real damage. Push a chip past its limits with too much voltage and heat, and yes, it will break down over time.
A CPU retains the same ability to process data from the day it was made until the day it ceases to operate or is permanently deactivated.
What happens is that the requirements of software increase over time, as software developers take advantage of advances in CPU, memory, and graphics technology. An older CPU will therefore have to do more work to run later generations of software, and will eventually get to the point where the demands of software are too formidable for it to handle.
A good example of this is an old Compaq Presario laptop I own that has an AMD K6-III+ 400 MHz CPU installed. When this CPU was introduced, it was a fairly high-end mobile CPU, capable of handling any software then current. Today, if I start that laptop, it boots up and still runs Windows 98 and its contemporary applications exactly the way it did 16 years ago.
However, while this laptop was an outstanding Web browsing machine up until about 10 years ago, it can barely function online now. Modern JavaScript applications bring the K6-III to its knees. It’s not because the CPU is any less capable than it was in 2001, but because the demands of the modern Web have simply left it behind.