• DT & SKT are training their own telco-specific LLM

  • Expect a broader announcement at MWC next year

  • It is all about keeping costs down

Deutsche Telekom has formed an artificial intelligence (AI) alliance with SKT and is now developing a telecom-specific large language model (LLM) with the South Korean operator in order to keep its costs down, the mobile network operator (MNO) told Silverlinings.

In a pre-dawn call with your bleary-eyed correspondent, Jon Abrahamson, chief product and digital officer at Deutsche Telekom, explained that renting AI models from hyperscalers would become very expensive for DT, which is already planning how it can use AI across many parts of its business. Better to build models that are adapted to the work the operator does and will cost DT less, maybe much less, over the long term, he explained.

“Using our own models... is good because they understand us, they’re more effective, but also from a cost point of view,” Abrahamson said. “Every time you want to have an interaction with the closed-source large language models, the business model is per token, or per word basically. The more you use them, the more costs you have.”
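
To make the per-token point concrete, here is a minimal back-of-the-envelope sketch of the trade-off Abrahamson describes: a rented model's cost scales with usage, while a self-hosted model is closer to a fixed cost. Every figure below (price per thousand tokens, monthly token volumes, flat hosting cost) is a hypothetical placeholder for illustration only, not a number from DT, Anthropic or Meta.

```python
# Hypothetical cost sketch: per-token API pricing vs. a flat self-hosted cost.
# All numbers are illustrative assumptions, not real vendor prices.

def api_cost(tokens_per_month: float, price_per_1k_tokens: float) -> float:
    """Monthly cost of a rented, per-token-priced model."""
    return tokens_per_month / 1_000 * price_per_1k_tokens

def self_hosted_cost(fixed_monthly_infra: float) -> float:
    """Monthly cost of running your own model (simplified to a flat infra fee)."""
    return fixed_monthly_infra

if __name__ == "__main__":
    PRICE_PER_1K = 0.01          # hypothetical $ per 1,000 tokens for a rented model
    INFRA_PER_MONTH = 200_000.0  # hypothetical $ per month for self-hosted serving

    for monthly_tokens in (1e8, 1e9, 1e10, 1e11):  # growing usage across the business
        rented = api_cost(monthly_tokens, PRICE_PER_1K)
        owned = self_hosted_cost(INFRA_PER_MONTH)
        print(f"{monthly_tokens:>10.0e} tokens/month: "
              f"rented ${rented:>13,.0f} vs self-hosted ${owned:>10,.0f}")
```

Under these made-up numbers the rented model is cheaper at low volumes and the self-hosted one wins once usage grows, which is the "own our own destiny" logic Abrahamson lays out below.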

DT, Abrahamson explained, expects to be using AI models massively across the organization in the future. “We want to own our own destiny and not be beholden to a rental model that we’d be taking from hyperscalers,” he said.

The MNO is currently working on AI for customer service and sales tasks, Abrahamson explained, using Anthropic's closed-source Claude 2 model as well as Meta's open-source Llama 2 model for customer-facing work.

“There’s a lot of orchestration and plumbing that needs to be done on the back-end,” too, Abrahamson said.

DT is planning to move as fast as possible to its own model for customer service and personalization tasks. The DT chief product officer said the MNO had already started training its own model. “We’re talking a broader announcement around [Mobile World Congress] in the new year,” he said.

The ultimate intention, Abrahamson said, is to deliver AI that can reduce DT’s operational expenses in a retail context, in customer support, and maybe eventually in networks. “It’s something we’re thinking deeply about at the moment,” he said, without elaborating on any timescale.

“It’s hard to see a world where AI doesn’t change everything,” he said.
