Micron Technology - Q2 2026 Post Call

March 18, 2026

Transcript

Operator (participant)

Ladies and gentlemen, thank you for joining us, and welcome to Micron's post-earnings analyst call. After the speaker presentation, we will host a question and answer session. I will now hand the conference over to Satya Kumar, Investor Relations. Satya, please go ahead.

Satya Kumar (Corporate VP of Investor Relations and Treasury)

Yeah, thank you, and welcome to Micron Technology's fiscal second quarter 2026 post earnings analyst call. On the call with me today are Sumit Sadana, Micron's Chief Business Officer, Manish Bhatia, EVP of Global Operations, and Mark Murphy, our CFO. As a reminder, the matters we're discussing today include forward-looking statements regarding market demand and supply, market trends and drivers, and our expected results and guidance and other matters. These forward-looking statements are subject to risks and uncertainties that may cause actual results to differ materially from statements made today. We refer to documents we have filed with the SEC, including our most recent Form 10-K and upcoming 10-Q for a discussion of risks that may affect our results. Although we believe that the expectations reflected in the forward-looking statements are reasonable, we cannot guarantee future results, levels of activity, performance, and achievements.

We are under no duty to update any of the forward-looking statements to conform these statements to actual results. Operator, we can now open the call up for Q&A.

Operator (participant)

We will now begin the question and answer session. Please limit yourself to one question and one follow-up. If you would like to ask a question, please press star one to raise your hand and star six to unmute. Your first question comes from the line of Melissa Weathers from Deutsche Bank. Your line is open. Please go ahead.

Melissa Weathers (Director of Equity Research)

Hi there. Thanks for letting me ask a question. I wanted to touch on the NAND side of things. You guys are one of the only players to openly talk about greenfield capacity adds so far. There are some industry participants saying that the growth in NAND bits going forward can be served by node upgrades alone without the need for capacity adds. Can you talk about the decision to add greenfield capacity there and what kinda trends you're seeing that gives you confidence to add that capacity?

Manish Bhatia (EVP of Global Operations)

Sure. Hi, Melissa. It's Manish. Let me talk a little about the decision, and then maybe Sumit can talk about the NAND demand trends. You know, we have said in the past that while DRAM required greenfield wafer capacity, meaning new wafer start capacity growth, to meet the long-term demand trends we saw, NAND was able to meet demand with technology transitions. We still see that NAND technology transitions are able to provide strong bit growth going forward.

While our decision here does reflect confidence in the market demand outlook, which Sumit will talk about, it was also driven by our continued clean room space consumption for those technology transitions, and for the next technology transitions we'll be doing in the future, as well as our decision to locate more of our NAND R&D in Singapore, closer to our manufacturing. Those two drivers of additional clean room space, combined with our outlook on the market and, frankly, our confidence in our product portfolio, which I'm sure Sumit will talk about, were behind our decision to break ground for our new NAND fab.

Sumit Sadana (Chief Business Officer)

Just on that, it's not a greenfield site, right? You mentioned greenfield in your question, and this is the existing site where we are adding the additional clean room space that Manish described. In terms of the demand side of the picture, we see very robust demand for NAND, driven by growth in the data center. AI servers are using huge amounts of SSDs, both high-capacity SSDs and high-performance SSDs. This is working really well in our favor because our portfolio is doing exceptionally well. We are the first company in the world to have a PCIe Gen6 SSD in the market, and it works well with NVIDIA systems.

We have seen tremendous demand for it that we are not able to come close to meeting. Even in high-capacity SSDs, we have seen tremendous growth and tremendous demand, and we have come out with a slew of new products. We are really excited to continue to grow our share in the data center SSD space from the record levels it reached at the end of calendar 2025. We have grown to record share in the data center every year for the last four years. We are very excited about that, and our supply is nowhere close to being able to meet the demand we see for the foreseeable future.

This expansion that Manish mentioned is something we will put to good use to continue to grow our business, obviously with a focus on disciplined investing when it comes to our CapEx going forward.

Manish Bhatia (EVP of Global Operations)

I should just add that, even though we're breaking ground now, that clean room isn't expected to provide a new boost to our capacity until the second half of 2028. And not just for us, but across the industry in NAND, we think clean room space over the medium term is still going to be a challenge, particularly as many in the industry have redirected some NAND clean room space towards DRAM.

Melissa Weathers (Director of Equity Research)

Perfect. Thank you. Then back to the DRAM side, I wanted to ask pretty candidly about the pricing that we're all seeing. Next year we have line of sight to a couple of DRAM projects coming online, including you guys bringing on a couple of new fabs. How are you internally modeling the impact of that increase in supply going forward? Are you thinking pricing could come down next year, flatten out, or decelerate? Just any color on how you are internally modeling the impact of all those new fabs coming online next year and the year after.

Manish Bhatia (EVP of Global Operations)

Maybe.

Sumit Sadana (Chief Business Officer)

Yeah.

Manish Bhatia (EVP of Global Operations)

Maybe I'll answer on the supply side, for us and the industry. For the projects that we've talked about, which are the new ID1 shell on the DRAM side, as well as our ramp in the Powerchip Tongluo facility that we've now acquired, we've said that the supply impact will come towards our fiscal year 2028 in terms of revenue shipments. I think that's largely the case for the industry in terms of new large clean room projects coming online: they will really be impacting later in 2027 and into 2028 before you see meaningful supply, which is one of the reasons we have reiterated that we see the tight supply conditions continuing beyond 2026.

Sumit Sadana (Chief Business Officer)

Yeah. We obviously don't provide any kind of forward-looking price modeling. We have mentioned that the demand forecasts we get from our customers for 2026 and 2027 continue to escalate. Despite the efforts we are making to increase the supply we can bring online, to a modest extent in 2026 and somewhat more in 2027, we are not really making a meaningful dent in the gap. There is a lot of demand growth driven by AI, and also across different segments of the market, not just in the data center. We expect tight conditions for the foreseeable future, certainly beyond 2026, and we are not giving any further guidance beyond that.

Mark Murphy (CFO)

The last thing I would add, Melissa, is that, as Sumit and Manish mentioned, demand far exceeds supply, and these fabs we've talked about come on meaningfully in our fiscal year 2028. We always have the ability to modulate tool installs, so that would be an option as well.

Melissa Weathers (Director of Equity Research)

Perfect. Thank you, all three.

Operator (participant)

Your next question comes from the line of Aaron Rakers from Wells Fargo. Aaron, your line is open. Please go ahead.

Aaron Rakers (Managing Director and Technology Analyst)

Yeah. Can you hear me?

Mark Murphy (CFO)

Yes.

Aaron Rakers (Managing Director and Technology Analyst)

Hello? Yeah, sorry, guys.

Mark Murphy (CFO)

Yeah, we can hear you, Aaron.

Aaron Rakers (Managing Director and Technology Analyst)

Perfect. Thanks, Mark. Thanks for doing this call. Two questions, one and a quick follow-up. Mark, maybe just to help model-wise, I'm curious if you could frame the current quarter guidance between DRAM and NAND. What assumptions are you making in this tight environment around your ability to ship bits? Will that grow sequentially, and is there any kind of separation between the two buckets, DRAM and NAND?

Mark Murphy (CFO)

We don't break it out, but I can provide a little bit of color. You heard the Q2 report: both DRAM and NAND pricing were up strongly, NAND even more than DRAM. Both also grew volume sequentially, NAND less than DRAM. In Q3, we would expect price to again be the largest factor. We've given you our view on calendar 2026 industry bit shipments and that those shipments are constrained by supply, and we expect our supply to grow in line with the industry. If you put all that together, you can assume some modest volume growth in the third quarter for both DRAM and NAND.

Aaron Rakers (Managing Director and Technology Analyst)

Okay. That's perfect. Thanks, Mark. Then as a follow-up, maybe more architecturally, in this environment, I'm curious, you know, there's a tremendous amount of demand that you guys are seeing in servers. I think you've guided low teens unit growth this year. But one of the architecture things that I'm starting to hear a little bit more about is the idea of CXL and memory pooling and whether or not some of these hyperscalers could, you know, drive better efficiency of their memory architecture. I'm just curious at a high level what your guys' thoughts are on that. Obviously, there's a lot of other factors, margins and, you know, just content growth in general, but any thoughts on CXL and having an effect on DRAM?

Sumit Sadana (Chief Business Officer)

Yeah, Aaron, CXL is likely to be experimented with at some of our customers. There are different companies that are in different phases of either assessing it or figuring out what they want to do with it, if anything. We are certainly going to have our memory be available in those CXL configurations. Now, my feeling is that the demand is in such a robust place in terms of gap between supply and demand being so significant that any and every available opportunity that can be used and deployed at scale is going to be something that customers will likely try if it is architecturally feasible and can be made to work with their software and their systems.

There is a lot of work to be done to bring these solutions to a large deployment scale, and it's not easy, and it's not something that can be pervasively used. There are lots of technical limitations as well. I'm sure those experimentations will continue and some limited deployments will be done in order to test it out and see how it performs.

Aaron Rakers (Managing Director and Technology Analyst)

Yep. Thank you.

Sumit Sadana (Chief Business Officer)

Sure.

Operator (participant)

Your next question comes from the line of Jim Schneider from Goldman Sachs. Your line is open. Please go ahead.

Sumit Sadana (Chief Business Officer)

Jim, can you hear us? Do you have a question? Operator, want to go to the next?

Operator (participant)

It seems like we may be having some technical difficulties. Just a reminder to unmute by pressing star six. In the meantime, we will move on to Atif Malik from Citi. Your line is open. Please go ahead.

Atif Malik (Managing Director and Senior Analyst)

Thank you guys for hosting the call. On the call, you mentioned that the non-HBM margins are higher than HBM. My question is really on allocation. How are you guys thinking about allocation? The reason I'm asking that question is because one of your Korean peers has been reported to be raising pricing by 100%, and I know you had pricing mid-60s for DRAM, and I understand there's always delta across different memory makers. Are you leaving money on the table by not pivoting more towards non-HBM? Just your understanding on HBM versus non-HBM allocation.

Sumit Sadana (Chief Business Officer)

Yeah, I mean, we definitely go through different quarters, and these relationships between different products can change over time. There is no doubt that HBM pricing for a large portion of calendar 2026 shipments was set late in calendar 2025, as is generally the case with HBM, where pricing is determined some time before the start of the calendar year. That kind of model is a good model to have. It provides good stability and visibility into the business. The pricing that we negotiate has very robust ROI and profitability, and we feel good about that business.

Of course, the upside volumes that our operations and supply chain teams have been able to create on HBM, we have been able to sell at even more robust levels of pricing as those volumes have materialized. With that said, we don't view these HBM versus non-HBM allocations that tactically. These are strategic decisions about providing our customers with matched sets of products so they can build AI systems. You cannot ship DDR5 into an AI system that doesn't have enough HBM in it, and vice versa.

There is a natural level of balance needed for the market to have matched sets of products in order to ship these systems at scale. With that in mind, certainly HBM margins are good, and the non-HBM margins have of course become even higher. And this is not just a data center issue: the non-HBM margins outside the data center, meaning DRAM margins outside the data center, are also exceptionally robust. We don't tend to jerk the allocations around between customers and segments just based on where the pricing is. We have a goal of working with our customers to meet their business needs, and we do that with the intent of helping them meet their business goals as well.

Atif Malik (Managing Director and Senior Analyst)

Thanks, Sumit. As my follow-up: NVIDIA talked about how Groq chips could be something like 25% of the ultra-fast inferencing market. My understanding is that currently these LPU chips have embedded SRAM at one of the Korean makers. In the future, if these chips move to a Taiwanese foundry, will the SRAM be embedded on the chip, or can the SRAM be standalone and somehow bonded to the processor?

Sumit Sadana (Chief Business Officer)

You know, there are definitely on-chip SRAM approaches that are currently in use, and I have seen some talk about bonded SRAM, et cetera. I would not want to comment on what directions our customers might go in terms of how they work with SRAM over time. Our main focus here is that we look at all of these systems as evolving in a very well-balanced way. These SRAM-based systems complement, in smaller use cases, the larger systems that are being deployed at scale, like Vera Rubin, for example, and other similar systems based on ASIC accelerators.

We see these as continuing to move the ball forward in making these systems more balanced and more efficient. DRAM usage in these systems continues to grow over time and has gotten to levels for which, of course, we don't have adequate supply. Over time, we continue to focus on the growth of these average capacities, the growth of the high-performance tiers, and even the growth of LP DRAM in these AI systems, all of which are really big positives for us on the DRAM side of the business.

Atif Malik (Managing Director and Senior Analyst)

Thank you.

Operator (participant)

Your next question comes from the line of Vijay Rakesh from Mizuho. Your line is open. Please go ahead.

Vijay Rakesh (Managing Director)

Yeah, hi. Just a quick question on the NAND side. I know you mentioned SSDs and NAND seeing a pretty strong uptick. Just wondering how to size it against the KV cache demand. What is the mix of AI data center NAND if you look out to calendar 2026 versus last year, if there's a way to size that pickup in demand from KV cache? And a follow-up.

Sumit Sadana (Chief Business Officer)

You know, when we made our prior forecast of the extent of growth we are likely to see in the data center space, it did include a view that KV cache would be a meaningful driver. As that has become more in focus in terms of how our customers have been wanting to deploy their SSDs, definitely it has increased our view of the extent of demand coming from KV cache-related applications. That has continued to cause our view of the total market opportunity to continue to grow.

I'll just remind everyone of something we said last time as well: we have also seen a significant uptick in demand for data center SSDs coming from shortages in HDDs, and we expect those shortages to continue for the foreseeable future. That has been another driver. When we put all of these together, the NAND market is significantly undersupplied relative to demand in the data center, and that demand continues to escalate, in part driven by KV cache, but also driven by the insatiable appetite these AI servers have for fast storage as these systems get deployed more and more.

The outlook is really strong and, as we have mentioned earlier, our portfolio is incredibly well-positioned to continue to gain share in that space, including, by the way, for KV cache applications.

Vijay Rakesh (Managing Director)

Right. Thanks. Just to follow up on that, as you look at the CapEx, obviously the last couple of years CapEx in NAND has been lighter. As you look at the mix of CapEx for 2026 and 2027, with the CapEx numbers that went up, how would you look at the mix of DRAM to NAND CapEx, given that DRAM demand is up but you're also seeing a spike in NAND? Thanks.

Mark Murphy (CFO)

Maybe I'll start, Vijay. The CapEx is still going to be dominated by DRAM and HBM additions. This FY 2026, we increased our outlook on CapEx to over $25 billion, up from the $20 billion we discussed on the last earnings call. The updated outlook reflects the investment in the Tongluo fab, which we communicated at a February conference, and increases in U.S. expansion. Again, it's DRAM and HBM driven, including the increase. Today, we also provided more detail on construction, which we've said in the past was becoming a more material part of the build-out, obviously because we need greenfield capacity. On the December call, we said we expected construction to double from FY 2025 into FY 2026; we expect FY 2026 construction to be mid to high single-digit billions, net. And when I talk CapEx, we're talking net CapEx.

Vijay Rakesh (Managing Director)

Right.

Mark Murphy (CFO)

Now, as we look out to 2027, we did say that we project approximately $10 billion of incremental construction cost, and equipment spend to increase as well. In 2027, NAND spend will begin to increase, but it will still be a much smaller portion of the spend compared to DRAM.

Vijay Rakesh (Managing Director)

Got it. Thanks, Mark. Thanks so much.

Operator (participant)

Your next question comes from the line of Karl Ackerman from BNP Paribas. Please go ahead.

Sam Feldman (Equity Research Associate)

Hi, can you hear me?

Mark Murphy (CFO)

Yes.

Sam Feldman (Equity Research Associate)

Hi. Thanks for taking my question. This is Sam Feldman on for Karl Ackerman. We've seen continuous HBM content uplifts with each new generation of GPUs. With the growing trade ratio of HBM keeping the DRAM market tight, and increasing AI memory requirements in the form of server DRAM, LPDDR, and SRAM, do you think HBM will continue to see such large content uplifts with each new generation of processors, or do you expect the content per accelerator to eventually plateau?

Sumit Sadana (Chief Business Officer)

We are not going to make long-term projections of where these average capacities will go, because those are things that our customers will decide as their architectures evolve. What I will say is that if you look at the direction in which AI is trending, and the types of things that are creating value in the AI domain for customers, they go towards more reasoning capability, longer context windows, and the ability to do more with agents and multi-agent orchestration. All of these things require more DRAM capacity and more DRAM bandwidth.

When it comes to delivering that kind of bandwidth, and really optimizing the system for all of the different stages of prefill and decode, and the different aspects of training as well as inference, you come to the conclusion that these accelerators, whether GPUs or ASICs, do require increasing amounts of HBM and increasing amounts of DDR5 or LP5 capacity over time. That's the trend we have seen thus far, and you're also seeing it in the announced architectures that have been publicized. We certainly feel the trend has been clear.

Not only has the trend been clear, but the way our customers, meaning the end customers, derive value out of AI systems and AI applications is very much consistent with the trend of needing more DRAM capacity and bandwidth, which HBM delivers so effectively at very efficient power consumption levels. That's why we have said in the past that memory is becoming a strategic asset in the AI era: one of the important drivers is precisely this trend that you really can't have a high-performing AI architecture or hardware infrastructure without all of the capabilities that DRAM and HBM bring to the table. That's something we feel pretty good about in terms of where the market is headed.

Sam Feldman (Equity Research Associate)

Great. Thank you.

Operator (participant)

Your next question comes from the line of Chris Caso from Wolfe Research. Your line is open. Please go ahead.

Chris Caso (Managing Director and Senior Analyst)

Yes. Thank you. Wondering if you'd care to update your view of long-term bit growth for both DRAM and NAND. I know you mentioned on the call low-20s percent for DRAM and 20% for NAND, but those are obviously supply-constrained numbers. Maybe you could speak to the extent to which you think long-term bit growth is increasing as a result of all the things we've been talking about.

Sumit Sadana (Chief Business Officer)

Yeah, we haven't provided a new long-term bit growth number. In the past we have spoken about mid-to-high-teens type ranges for DRAM, yet last year and this year our forecasts have been more robust than those levels. We continue to feel we are in an extended phase of robust industry demand that, with HBM and its trade ratio being part of these numbers, is stressing the entire industry, and certainly our capability to meet those demand numbers. You're right: these numbers, at least for the foreseeable future, are all supply-limited numbers rather than the true level of demand.

That's sort of the environment we are in. We do expect that next year, again, we will have a fairly robust level of growth in calendar 2027. But we are not providing a long-term number beyond that commentary.

Chris Caso (Managing Director and Senior Analyst)

Got it. Thank you. As a follow-up, obviously there's been a lot of discussion about cleanroom constraints. As you folks have pointed out, you're starting greenfield in Singapore, and it's not available until the end of 2028. I guess you probably can't speak for the industry, but for Micron, at what point do you think you can get caught up on cleanroom capacity, such that you have some headroom for what customers need? And obviously it's going to take time to move the tools in. It goes to the sustainability of what's going on right now and the extent to which the cleanroom constraints drive that sustainability.

Manish Bhatia (EVP of Global Operations)

Well, Chris, the first part of the question is really around what Sumit described in terms of long-term demand, and we're continuing to evaluate all the different demand drivers and signals, including all the announcements from GTC this week. In terms of the availability of cleanroom space for us and the industry, we certainly see this relative constraint for all the major DRAM players persisting through this year and into next year, with meaningful improvement in cleanroom space availability only out in 2028.

Of course, the projects we've announced go beyond that, into 2028, 2029, 2030. As we talk about those projects, the timeline for us to get to the point you mentioned will be a function of demand. We will be nimble in adjusting equipment orders and equipment installations in order to stay aligned with demand as it plays out over that time horizon.

Sumit Sadana (Chief Business Officer)

I'll just add to what Manish said. As we look at the demand side of the picture, we have been engaged with several of our customers on these multi-year, five-year SCA agreements. In that context, of course, we are assessing their longer-term demand, their five-year demand, for example, against our own supply capabilities. Within that timeframe, we are also seeing the emergence and growth of some really exciting new demand vectors, including things like robotics, which we expect to become a very major demand driver.

When we put all of those things into the equation, we don't yet have a high-confidence view as to when supply will be able to catch up with demand, because the escalation of demand from these various vectors is just phenomenal.

Chris Caso (Managing Director and Senior Analyst)

Thank you.

Operator (participant)

Your next question comes from the line of Srini Pajjuri from Royal Bank of Canada. Please go ahead.

Srini Pajjuri (Managing Director and Senior Equity Analyst)

Thank you. Mark, I want to ask the previous question slightly differently, on CapEx. You talked about construction CapEx being high single digits in 2025, growing by $10 billion. That suggests roughly half and half next year between construction and equipment. For the projects you've already announced, do you think 2027 is the peak year for construction CapEx? If so, what is the normalized mix between construction and equipment through the cycle that you think about?

Mark Murphy (CFO)

Yeah, Srini, we're not going to give any more breakdown than we've provided, and it will be lumpy. Just to be clear, the $10 billion was in reference to 2026 to 2027; you may have said that, but I just want to make sure it's clear. We're of course going to modulate spend as we see demand, and to maintain stable bit share. As Sumit mentioned, it's not clear when supply will catch up, so we're investing as we can, including this recent acquisition of the Tongluo fab, to put capacity in. It takes time and a lot of effort, and we can't get meaningful bits until our fiscal year 2028.

Beyond 2027, of course, we're going to be very disciplined, and CapEx could go down after 2027, but we're not making that call at this point. We're just investing through this next several-year period so that we can try to get the supply we need for our customers.

Srini Pajjuri (Managing Director and Senior Equity Analyst)

Okay. Maybe a follow-up. For the next few quarters, there have been questions on gross margins. Obviously, you don't want to comment on pricing, but could you give us an idea how to think about any puts and takes on the cost side, and also any mix dynamics we should be aware of? Also, given the higher CapEx, how should we think about depreciation over the next few quarters, and any startup costs that you anticipate? Thank you.

Mark Murphy (CFO)

Yeah. Good questions. Maybe I'll tackle costs first. We continue to execute really well on cost reductions. Manish can comment some more, but as we've talked about for FY 2026, we have the benefit of the node transitions, 1-gamma in particular on DRAM and G9 on NAND, that are driving our bit growth. We get a lot of efficiency there and cost-downs, with 1-gamma, for example, replacing 1-alpha capacity. That's good, and the spend control has been outstanding. As busy as we are, and as good as the numbers are, the discipline in the business is very good.

That includes managing, by the way, all the geopolitical issues; the team has really been on top of that, and there's no impact to our operations. That's good. To your question on startup costs: there will be some as we bring on ID1, and then some with Tongluo. I made some comments, maybe a year and a half ago, about startup costs, and those comments are still applicable. At the time, I said I think between one point and two points of cost. Now, revenue was much lower at the time.

If we dollarize that, it's probably $100 million to $200 million per quarter starting in the next quarter or so, and then continuing on through 2027. It would come down off of that. Again, at these revenue and margin levels, it's a much smaller impact, 50 basis points or less. What was the third question?

Srini Pajjuri (Managing Director and Senior Equity Analyst)

Depreciation question.

Mark Murphy (CFO)

Yeah, depreciation just depends on when production wafers are out, and on the useful life of those assets when they're put in service. Keep in mind that because this is greenfield, these are very long depreciation lives. That's important to keep in mind.

Manish Bhatia (EVP of Global Operations)

Just a couple of things I'll add; Mark asked me to add some comments on cost, and I think he covered a lot of them. 1-gamma and G9 are both going really well, and we had positive comments on those. The only other one I'll add is HBM. Our HBM3E 12-high has continued to execute well as we've gotten to high volume over the last couple of quarters. We actually see HBM4, even though we're in the early stage of the ramp, having an even faster yield ramp than HBM3E 12-high. We feel very good about our HBM cost structure for both HBM3E and HBM4, providing improvements for us over previous periods.

The other comment I'll just add: I think Mark referred to managing costs relative to geopolitical issues, and I think he meant the Middle East.

Mark Murphy (CFO)

Yeah.

Manish Bhatia (EVP of Global Operations)

There have been some media reports regarding various input disruptions. We don't see any supply risks, and very minimal impact to cost, at this time.

Mark Murphy (CFO)

Yeah, Srini, maybe just one housekeeping item since we're covering those issues now. On OpEx, we indicated that our OpEx would be ticking up due to R&D costs. In the fourth quarter, we would expect OpEx to be closer to $1.6 billion; part of it is the extra week, but part of it is this increase in R&D spend, which we think is completely appropriate to drive the technology, the increased value of memory, Micron's position, and customer demand on specific projects. We would expect that OpEx number in 2027 to be over $1.6 billion, probably around a $1.7 billion run rate, and to stabilize from there.

Srini Pajjuri (Managing Director and Senior Equity Analyst)

Thanks, Mark.

Operator (participant)

This concludes today's call. Thank you for attending. You may now disconnect.