How is PAM4 Transforming Optical Networking?
Now it is time to investigate the source of the problem.
Since the introduction of PAM4 modulation, and especially the concept of 100G PAM4, the optical networking landscape has been unsettled. Nearly everyone in the industry, from chip vendors to module makers to equipment manufacturers, is involved in the 100G PAM4 business, and every year someone proclaims significant progress. Progress is progress, but the reality remains harsh. Let us start from the beginning.
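To ground the discussion, here is a minimal sketch of the basic PAM4 trade-off: it carries two bits per symbol on four amplitude levels, doubling the bit rate of NRZ at the same symbol rate, but each of its three eyes is only one third the height of the NRZ eye, which costs roughly 9.5 dB of SNR before any coding gain. The Gray-coded level mapping below is an illustrative convention, not a statement about any particular vendor's DSP.

```python
import math

# PAM4 carries 2 bits per symbol on 4 amplitude levels. Gray coding is
# assumed here so adjacent levels differ by a single bit (illustrative).
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_modulate(bits):
    """Map an even-length bit stream to PAM4 symbol levels."""
    pairs = zip(bits[0::2], bits[1::2])
    return [GRAY_MAP[p] for p in pairs]

symbols = pam4_modulate([0, 0, 0, 1, 1, 1, 1, 0])
print(symbols)  # [-3, -1, 1, 3]

# With the same peak-to-peak swing, each PAM4 eye is 1/3 the NRZ eye
# height, i.e. a ~9.5 dB SNR penalty before coding gain:
penalty_db = 20 * math.log10(3)
print(round(penalty_db, 2))  # 9.54
```

That ~9.5 dB penalty is the root of why PAM4 links lean so heavily on DSP equalization and FEC, a theme that runs through the rest of this article.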
The technological evolution path from 100G to 200G was clear. 200G is unique in that it sits on the threshold between NRZ modulation and higher-order PAM4 modulation, so it is important for researchers, architects, and engineers to understand the technology behind it. But shrewd companies quickly concluded that 400G would offer greater cost advantages and application value than 200G, that 4×100G PAM4 was an easy threshold for the ecosystem to cross, and the industry set off. The trouble is that no one had first threaded the needle through the layers of technological reality.
Analyzing today's situation, one has to sigh that the past feels like only yesterday. The confusion in the industry in recent years has not only numbed my nerves but nearly dulled my sharpness of thought.
4×100G PAM4+EML Technology
The representative work is 400G FR4. Its problems have not been effectively solved to this day. On the surface this product relies on superb signal-integrity design, but that is not necessarily the essence. We now believe that low-cost PAM4 DSPs lack any compensation for the nonlinear budget of the optical path. Yes, some companies consider their short-reach products good enough today, but I have deep doubts about how long that "good enough" can last. Perhaps the product's long-term fit simply depends on good customers, one favorable application scenario, and good luck. Of course, we could also apply algorithmic correction and link compensation to the 4×100G PAM4+EML transmission scheme, but that does not come cheap. Calculating the economics of this technology purely per bit is bound to be unrealistic and detached from practice. Even short-distance transmission must guard against bandwidth-related multipath fading and other SNR impairments. Previous industry claims about 100G PAM4 single-wavelength transmission were, more or less, based on idealized calculations or visions.
4×100G PAM4+Silicon Technology
The representative work is 400G DR4. The problem is batch consistency and yield in silicon photonics; in other words, this is a question of manufacturing craft rather than technical principle. At root, silicon photonics provides a relatively clean signal light source, and as modulation depth increases, confidence in this technology grows. It will be very promising for short-reach interconnects and the future co-packaged era. The main perceived disadvantages are that it cannot do wavelength-division multiplexing above 400G, that it wastes too many optical fibers, and that it presents real cabling difficulties.
8×50G PAM4 Technology
The representative work is 400G LR8. The problem is the manufacturability of the 8-lane optical engine. To avoid this manufacturability problem, researchers turned their attention to the 4×100G PAM4 platform, and it turns out that no technology is easy. 8×50G PAM4 is on the whole a very moderate technology that can be mass-produced, with cost and power consumption at reasonable levels. People believed that 4×100G PAM4 could achieve better power consumption than 8×50G PAM4. Obviously this has not happened, but it has delayed the bright future of 8×50G PAM4. We can also briefly analyze 200G:
4×50G PAM4 Technology
The representative work is 200G FR4. So far, I still dare not conclude that this kind of module is risk-free. 50G PAM4 is supported by mathematical theory, but its signal-to-noise ratio is still too low. By industry consensus, this product is very easy to develop and produce, yet mass deployment of such modules in a compact space demands a high level of performance because mutual interference is strong.
8×25G NRZ Technology
Representative works are 200G SR8/PSM8/LR8. Because NRZ-based products are designed around effectively zero bit errors, the equipment does not need to perform FEC processing, and the overall cost performance is still the highest. Gigalight was the first to advocate deploying this kind of 8-lane NRZ data center, which can achieve better cost performance than a 100G data center. At present it is mainly deployed in special scenarios with high performance requirements, and even these seem to be fading into the past.
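The cost-performance point about skipping FEC can be made concrete. PAM4-based Ethernet lanes rely on the KP4 Reed-Solomon code, RS(544, 514), specified in IEEE 802.3, which adds rate overhead and codec latency that an effectively error-free NRZ link avoids. A minimal sketch of the arithmetic:

```python
# KP4 Reed-Solomon FEC, RS(544, 514) over 10-bit symbols, is the code
# IEEE 802.3 specifies for PAM4-based Ethernet lanes. NRZ links running
# at effectively zero BER can skip this stage entirely.
N_CODEWORD = 544   # total symbols per RS(544, 514) codeword
K_PAYLOAD = 514    # payload symbols per codeword

# Rate overhead the FEC adds on the line (~5.84%):
overhead = N_CODEWORD / K_PAYLOAD - 1
print(f"KP4 FEC rate overhead: {overhead:.2%}")

# Correctable symbol errors per codeword: t = (n - k) // 2
t = (N_CODEWORD - K_PAYLOAD) // 2
print(f"Correctable symbols per codeword: {t}")  # 15
```

Beyond the roughly 5.8% of line rate, the RS encode/decode stages also add power and store-and-forward latency in the host, which is exactly the cost an 8×25G NRZ deployment sidesteps.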
Another interesting topic is the much-discussed 800G. Co-packaged optics can readily achieve 3.2T or 6.4T of transmission on the board, which will inevitably shake the ambitions of 800G pluggable optical transceivers. The 800G pluggable transceiver is intended to adopt either 8×100G PAM4 or 4×200G PAM4 modulation.
200G PAM4 is feasible from the DSP perspective, but only for short on-board links. Therefore the next step of the 800G pluggable ecosystem can only be based on 8×100G PAM4 modulation, where silicon photonics will show certain advantages. Overall, 800G will face the same dilemmas as 400G, but silicon photonics will accelerate its development in the 800G era. Another noteworthy question is whether the pluggable optical transceiver will die at 800G, and to what extent; the answer hinges mainly on whether the high-speed signal can withstand the losses that pluggability introduces.
Since optical networks are defined by equipment manufacturers or end users, transceiver suppliers like Gigalight can only offer friendly predictions and technical analysis, so that together we can avoid some quagmires. Optical communication technology has come this far, and there is no absolute right or wrong, but there are limiting factors that stand in the way of building a new world of optical networks entirely on PAM4 DSP. Since we are already cruising at the boundary of electronic bottlenecks, we need more knowledge and vision, and the judgment to weigh what is today's technology against what is tomorrow's. Big data has already gone through a long period of trial and error; under the current technical blueprint, an upward leap is not strictly necessary. Perhaps people really do need to wait quietly for the true arrival of the co-packaged era.