EE World spoke with Sarah Boen, Director of Technology and ASIC Strategy at Tektronix, who participated in a panel session on how AI and silicon photonics are changing networks and on the challenges facing test engineers and reliability engineers.
In OFC 2024: It’s AI or die, EE World reported on how the data generated by AI will drive the need to move data. That’s scaring the engineers developing optical and electrical links. Alongside OFC came a short conference, Photonics Solutions: Bridging technical and cost-efficiency challenges, hosted by IMEC. The conference included a panel session, Bridging the gap between technology and manufacturing: challenges and opportunities, in which Boen took part.
Panel sessions can be rather fluid. You don’t always know what will come up. That was the case here, where the panelists first discussed AI’s impact on photonics and optical networks before steering towards manufacturing and reliability issues arising from the need to integrate photonics into network equipment rather than using pluggable optical modules.
EE World: Sarah, thank you for speaking with us. Please provide an overview of the panel session, which took place alongside OFC, March 2024 in San Diego.
Boen: The panel Bridging the gap between technology and manufacturing: challenges and opportunities focused on how to scale optical technology. When you think about manufacturing, you think about the number of I/O ports for electrical interfaces; that’s something industry knows how to mass produce. We’re just not there with optical interfaces. That was the panel’s initial intent until conversations around AI arose. Each of the panelists talked about, from their perspective, how AI is impacting their technology portfolio and their business. Maybe half of the panel focused on AI because it’s so relevant to optical I/O. That’s driving the need to shift to optical interfaces in the compute space, whereas optical has already been used in data centers for a long time.
EE World: Let me first address AI. That was something I heard a lot at OFC and wrote about. People at OFC were somewhat fidgety about what they see as, call it a data tsunami, coming in terms of “We’re going to have all this data that needs to move, and we’re the people who will have to move it.”
Optical and electrical data links have been progressing to the next data rate, roughly doubling every few years. People were looking at this and saying, “that’s not going to keep up.” Even though people are starting to deploy 800G, they’re talking about 1.6T. A few people have mentioned 3.2T. One speaker even mentioned a timeline for 6.4T. Whatever it is, everybody was on edge that it’s not going to come fast enough. People kept asking how we are going to keep up. There’s a huge opportunity, but can we get there, especially where you need consensus? You need a standard. These devices must interoperate. Everybody was sort of wondering, “How are we going to do that, and how is my company going to get there first?”
Boen: Interoperability and standards also came up during the panel. We’re not there yet. For those of us in test and measurement, that’s what we do. I think there’s this big emphasis on the datacomms piece, but there’s also high-performance compute. We’re now seeing computing companies such as Nvidia, AMD, and Intel at OFC. While Intel has a transceiver business, it’s interesting to see how the technology and the ecosystem are evolving.
EE World: Did the panel discuss silicon photonics?
Boen: Yes. Specifically, we talked about silicon photonics and applications outside of compute and data communications such as medical devices and quantum computing. IMEC, which sponsored the discussion, has a photonics initiative. They’re starting to look at other potential markets for the technology. At Tektronix, we’re looking at photonics and especially integrated optics.
EE World: What does that mean for tests?
Boen: The electrical interfaces are still there; they’re just integrated on the package. We’re working on how to test integrated photonics because there are no access points. That’s another big challenge: you don’t want to package all this technology, test it at the end, and have something go wrong. That would take too long and cost too much.
EE World: The other thing I kept hearing about at OFC was quantum computing.
Boen: Was it on computing, comms, sensing, or a combination?
EE World: Mostly about computing. We’re going to have much faster computers and a lot more data to move. It came back to how will we deal with moving all the data?
Boen: If I look at our test and measurement architecture, we also have that same challenge dealing with increasing bit rates. How do we process all that data? How do we acquire it? How do we sample it? We’re looking at an opportunity to use AI to reduce the amount of data that needs to travel through the entire system. I think we might see more of that. With 6G, one of the key areas is intelligent sensing. You can imagine, if you have an intelligent sensing network, how decisions can be made more locally, without having to send data back to the data center. I think there will be some architectural changes just to accommodate the sheer volume of data that’s coming.
EE World: How might that architecture change?
Boen: I think we’ll see more convergence between sensing and compute. Instead of having these devices that just collect and transmit data, they’ll have more intelligence. We’re looking at how to add compute capability through a GPU or FPGA and tie that capability closer to the front-end technology within our products. I think that is something we’re going to see more of.
In the high-performance compute space, there’s a lot of momentum toward memory-driven architectures so that the memory is not sitting idle and is more fully utilized. I think there’ll be changes across the board because today’s data communications interfaces are meant simply to send data from point A to point B. I think that will become more intelligent through the convergence with computing. I also think that in the telecommunications network, that will come with the deployment of 6G, but that’s 2030.
EE World: I’ve attended conferences that discuss 6G, and nobody knows what that’s going to look like. There’s a timeline for a 3GPP standard, somewhere around 2029, but nobody really knows what that’s going to be. I have heard a lot about sensing; we’ve even published an article about that. The last I heard was that the standards were going to be more about applications than about technology. Those discussions have changed over the last two or three years.
Boen: We’re seeing different players in the ecosystem, such as Nvidia, Ericsson, and Qualcomm, each with their annual events. One of the key pain points with 5G, of course, is monetization. I think that’s why you’re seeing that shift toward asking: what’s the end application? How are the operators, especially, going to monetize their investments? Last year, the whole theme was around virtual reality; that was sponsored by Ericsson, and that was the application. This year, everything was about AI. Things are so fluid and moving so fast, but there definitely is pressure on that industry to make sure the applications can be monetized. I think that’s part of what’s different from when 5G came along.
EE World: What came up regarding manufacturing?
Boen: The manufacturing piece was around building this new technology — optics, silicon photonics — and how we are going to scale its manufacturability. I spoke from a test-and-measurement perspective about shifting to testing at the die level. We’re working on known-good-die testing, which is what I alluded to before, regarding integrating known good die on a package, for example, versus waiting until the end of the chain. I focused on how we’re going to do that. Is it going to be at-speed testing or DC testing? That’s what we’re focused on from the T&M perspective. The other piece that came up, which you already mentioned, was standardization: ensuring that parts from different suppliers can be integrated, manufactured, and made to work in the real world.
EE World: When you first started talking about this earlier in the conversation, I was trying to figure out where you were going. We already know how to manufacture optical modules and pluggables; we’ve been doing that for years. But now we’re looking at the die level, the chip level. We haven’t really done much of that. There will be a learning curve that must come with it.
Boen: This is a different part of the ecosystem. Traditional transceivers target datacom, but when you start thinking about the high-performance computing (HPC) ecosystem, that’s a different ballgame. Even the fabrication of the technology is different. Intel is looking at integrating the lasers on silicon. How are they going to do that? What’s the reliability? With photonics, if things are more integrated, how do you replace something that goes bad? That’s another reliability challenge, because with a pluggable transceiver you just swap it out, but that’s not going to be the case here. The HPC world is moving the optical communication interface closer to the switch. Broadcom has its co-packaged optics — I think it’s the Tomahawk chip — where the optical modules are co-packaged directly with the SerDes.
That opens up a whole new source of potential problems around reliability and the real-world operating environment. The optical communications industry needs to solve for that.
EE World: You’ve given me an idea, because I get emails from my local chapter of the IEEE Reliability Society. How are reliability engineers going to deal with integrated lasers as opposed to pluggable optics?
Boen: That would be a relevant topic because it’s one of the big sticking points in the deployment of the technology.
EE World: Sarah Boen, thank you for speaking with EE World.