AMD finally hits back / the future of silicon
For a few years now, Intel has appeared to have the upper hand over AMD when it comes to CPUs. Recently AMD showed its latest hand: the Ryzen series of CPUs. AMD positions its new flagship, the Ryzen 7 1800X, at a price of $500, opposite the Intel i7-6900K, a CPU costing $1050. Of course, the hardware junkies jumped right in and one test after another appeared in the media, some more positive than others. The most recent 'disappointment' is that the performance of AMD's showpiece falters at odd moments. Suspicion is directed at the way the CPU is recognized by Windows 10. The Ryzen architecture, as it happens, shares a common L3 cache among each group of four cores. When the scheduler (the routine in Windows that determines which task goes to which core) does not take that into account (the 1800X has 8 cores), a core may need data that sits in the other group's L3 cache, and this can only be fetched over 'slow' connections. But it appears that a (NUMA-aware) Windows patch can eliminate this problem.
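To make the cache-topology issue concrete, below is a minimal sketch in plain Win32 C of how an application could pin a thread to the cores of a single four-core group, so that its working set stays within one L3 cache instead of crossing the slower link between groups. The core numbering (cores 0-3 forming the first group) is purely an assumption for illustration; the real core-to-cache mapping would have to be queried at run time, for example with GetLogicalProcessorInformationEx().

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Assumption for illustration: logical cores 0-3 belong to the first
     * four-core group that shares an L3 cache. The real core-to-cache
     * mapping must be queried, e.g. via GetLogicalProcessorInformationEx(). */
    DWORD_PTR firstGroupMask = 0x0F;

    /* Restrict the current thread to those four cores so the scheduler
     * cannot migrate it to a core behind the other L3 cache. */
    DWORD_PTR previousMask = SetThreadAffinityMask(GetCurrentThread(), firstGroupMask);
    if (previousMask == 0) {
        fprintf(stderr, "SetThreadAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }

    printf("Thread pinned to cores 0-3 (previous mask: 0x%llx)\n",
           (unsigned long long)previousMask);

    /* ... run the cache-sensitive workload here ... */
    return 0;
}

Pinning like this is of course only an application-level workaround; the point of the Windows patch mentioned above is that the scheduler itself takes the cache topology into account, so that ordinary programs do not have to.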
When we turn to the things that are of more interest to the electronics engineer, we see that similar multi-core developments are taking place there as well. 4-bit microcontrollers are still numerous and certainly have their place, but their bigger brothers (or is that sisters?) are becoming increasingly complex: 32-bit processors with 8 cores have been on the market for more than 10 years. Consider the Propeller made by Parallax (2006), which already has a successor in the Propeller 2, with no fewer than 16 cores. Or the ARM Cortex processors that look more like fully-fledged CPUs than microcontrollers. These silicon structures are found in many smartphones today, devices that are many times more powerful than a complete desktop PC from, say, 10 years ago.

If someone had told me in 2011, when HD TV was not yet the standard, that within 5 years you could get a smartphone with a 4K screen and a processor comparable to an Intel i3, I probably would have had difficulty believing them. But that is the speed of development. If you look at the development of the IoT now and consider that today's relatively 'dumb' sensors will probably experience the same kind of growth and will be built from exponentially more complex circuitry, then the scenarios of many a Hollywood disaster spectacle are not entirely unthinkable. High time to add some ethics to the electronics. Oh wait, we are already working on that...