Spotlight on Tech

Connected intelligence: Where infrastructure meets AI

By Vaibhav Dongre, Vice President Marketing, Rakuten Symphony
September 8, 2025 · 3-minute read

As telecom networks evolve to support next-generation services, the convergence of AI and infrastructure – from the network core to the edge – is becoming essential. In this session, leaders from Colt Technology and Rakuten Symphony discussed how AI-driven capabilities embedded across distributed infrastructure are enabling real-time intelligence, automation, and personalized customer experiences.

Speakers:

  • Frank Miller – Chief AI Architect, Colt Technology
  • Rishi Shukla – Vice President Sales, APAC, Rakuten Symphony

Watch the full interview.

The session began by highlighting the shift from centralized to distributed intelligence. AI workloads are increasingly moving closer to where data is generated – driven by latency-sensitive use cases in security, industrial IoT, healthcare, and AR/VR. As inference moves to the edge, telecom operators are in a unique position to support hybrid AI architectures that balance cloud scale with local responsiveness.

This redistribution isn’t just about moving workloads. It’s about rethinking how intelligence, compute, and orchestration are designed – and how telcos, infrastructure providers, and partners collaborate to deliver seamless AI outcomes.

Orchestration, partnerships, and redefining the edge

Both speakers emphasized that scaling AI at the edge cannot be a CSP-only effort. It requires deep integration with hardware providers, cloud players, and software vendors. Sites are increasingly becoming mini data centers – and that brings both cost and complexity. Sharing infrastructure, co-developing orchestration layers, and embracing open ecosystems will be essential to ensure interoperability and avoid overbuilding.

From intent-based networking to AI frameworks capable of interpreting real-time data and automating topologies, the panel pointed to a future where infrastructure becomes smarter, more responsive, and more collaborative by design.

Key takeaways

  • AI workloads follow data gravity. Inference is shifting to the edge, where latency, security, and sovereignty demands are highest.
  • Collaboration is key. Telcos, OEMs, and infrastructure players must align from day one to avoid fragmented deployment.
  • Sites are becoming edge data centers. Unified orchestration is needed to manage distributed AI with efficiency.
  • Smarter architecture beats overbuilding. Efficient model design and thoughtful reference architectures will define long-term success.
  • AI-native means shared responsibility. Network intelligence must be a joint outcome across organizational and technical boundaries.


"3G, 4G operators have a tough challenge when it comes to the spectrum prices... now we are building a new infrastructure altogether, which is more edge-centric."
– Rishi Shukla, Vice President Sales, APAC, Rakuten Symphony

Follow us for more insightful perspectives on the future of cloud-native, AI-native, and programmable telecom networks.

Tags: 5G, Cloud, Cloud-Native, AI, Automation